WO2018150917A1 - Steered object, moving device, imaging device, movement control method, movement assist method, movement control program, and movement assist program - Google Patents


Info

Publication number
WO2018150917A1
WO2018150917A1 (PCT/JP2018/003690)
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
movement
imaging device
unit
subject
Prior art date
Application number
PCT/JP2018/003690
Other languages
French (fr)
Japanese (ja)
Inventor
Satoshi Hara (聡司 原)
Koichi Shintani (浩一 新谷)
Makoto Ozaki (真琴 尾崎)
Original Assignee
Olympus Corporation (オリンパス株式会社)
Priority date
Filing date
Publication date
Application filed by Olympus Corporation (オリンパス株式会社)
Publication of WO2018150917A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00 - Constructional aspects of UAVs
    • B64U20/80 - Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87 - Mounting of imaging devices, e.g. mounting of gimbals
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C - AEROPLANES; HELICOPTERS
    • B64C13/00 - Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C13/02 - Initiating means
    • B64C13/16 - Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C13/18 - Initiating means actuated automatically, e.g. responsive to gust detectors using automatic pilot
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 - Type of UAV
    • B64U10/10 - Rotorcrafts
    • B64U10/13 - Flying platforms
    • B64U10/14 - Flying platforms with four distinct rotor axes, e.g. quadcopters

Definitions

  • the present invention relates to a steered object, a moving device, an imaging device, a movement control method, a movement assist method, a movement control program, and a movement assist program.
  • the unmanned air vehicle measures the distance between the airframe and the object to be imaged while flying on a predetermined flight route, and determines the zoom magnification of the camera from the result to image the object to be imaged.
  • The present invention has been made in view of the above, and an object thereof is to provide a steered object, a moving device, an imaging device, a movement control method, a movement assist method, a movement control program, and a movement assist program that can perform appropriate processing even in a situation where the moving route at the time of shooting is not fixed.
  • A steered body according to the present invention includes an imaging device that captures an image of a subject and generates image data, a moving device that can communicate with the imaging device and can move together with the imaging device while holding it, and a steered body control unit that detects information necessary for determining whether the steered body can move in accordance with a change in the relative relationship between a predetermined imaging target and the imaging device, determines, using the detected information, whether movement according to the change in the relative relationship is possible, and performs control according to the determination result.
  • A moving device according to the present invention can communicate with an imaging device that captures an image of a subject and generates image data, and can move together with the imaging device while holding it. The moving device includes a first control unit that determines whether or not the moving device can move according to a change in the relative relationship between a predetermined imaging target and the imaging device, and performs control according to the determination result.
  • An imaging apparatus according to the present invention is communicable with a mobile device, is held by the mobile device, and captures an image of a subject to generate image data. The imaging apparatus includes a second control unit that detects information necessary for the mobile device to determine whether it can move according to a change in the relative relationship between a predetermined imaging target and the imaging apparatus, and transmits the detected information to the mobile device.
  • A movement control method according to the present invention is performed by a moving apparatus that can communicate with an imaging apparatus that captures an image of a subject and generates image data, and that is movable together with the imaging apparatus. The method includes a determination step of determining whether or not the moving apparatus can move in accordance with a change in the relative relationship between an imaging target specified in advance and the imaging apparatus, and a step of reading the determination result of the determination step from the recording unit and performing control according to it.
  • a movement assistance method is a movement assistance method performed by an imaging apparatus that is communicable with a mobile device and is held by the mobile device and that captures an image of a subject and generates image data.
  • A movement control program according to the present invention is executed by a moving apparatus that is capable of communicating with an imaging apparatus that captures an image of a subject and generates image data.
  • A movement assistance program according to the present invention is executed by an imaging apparatus that is communicable with a mobile device, is held by the mobile device, and captures an image of a subject to generate image data.
  • FIG. 1 is a schematic diagram showing a schematic configuration of an imaging system according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram showing a functional configuration of the imaging system according to Embodiment 1 of the present invention.
  • FIG. 3 is a diagram schematically showing the external configuration of the operating device.
  • FIG. 4 is a diagram schematically showing operation assignment of the operation input unit of the operation device in the normal operation mode.
  • FIG. 5 is a diagram for explaining the normal lock-on mode.
  • FIG. 6 is a diagram for explaining the angle lock-on mode.
  • FIG. 7 is a diagram showing an outline of processing when shifting from the normal operation mode to the lock-on mode.
  • FIG. 8 is a diagram illustrating a display example on the display unit of the controller device when the mode is shifted to the lock-on mode.
  • FIG. 9 is a diagram illustrating a display example on the display unit of the controller device when the composition adjustment mode is set.
  • FIG. 10 is a diagram schematically illustrating an example of operation assignment in the operation input unit when the composition adjustment mode is set.
  • FIG. 11 is a flowchart illustrating an outline of processing performed by the controller device.
  • FIG. 12A is a flowchart (part 1) showing an overview of processing performed by the mobile device according to Embodiment 1 of the present invention.
  • FIG. 12B is a flowchart (part 2) illustrating an overview of the process performed by the mobile device according to Embodiment 1 of the present invention.
  • FIG. 13 is a flowchart showing an outline of subject tracking control processing performed by the mobile device according to Embodiment 1 of the present invention.
  • FIG. 14 is a diagram showing an outline of the movement distance calculation process performed by the movement determination unit of the mobile device according to Embodiment 1 of the present invention.
  • FIG. 15 is a flowchart showing an overview of processing of shooting direction tracking control performed by the mobile device according to Embodiment 1 of the present invention.
  • FIG. 16 is a flowchart showing an overview of zoom control processing performed by the mobile device according to Embodiment 1 of the present invention.
  • FIG. 17 is a diagram illustrating an outline of a moving distance calculation process performed by the movement determination unit of the moving device according to the first embodiment of the present invention during the zoom control process.
  • FIG. 18 is a flowchart showing an overview of slide control processing performed by the mobile device according to Embodiment 1 of the present invention.
  • FIG. 19A is a flowchart (part 1) illustrating an overview of processing performed by the imaging apparatus according to Embodiment 1 of the present invention.
  • FIG. 19B is a flowchart (part 2) illustrating an overview of the process performed by the imaging device according to Embodiment 1 of the present invention.
  • FIG. 20 is a flowchart illustrating an overview of zoom control processing performed by the imaging apparatus according to Embodiment 1 of the present invention.
  • FIG. 21 is a diagram showing a schematic configuration of an imaging system according to Embodiment 2 of the present invention.
  • FIG. 22 is a block diagram showing a functional configuration of the imaging system according to Embodiment 2 of the present invention.
  • FIG. 23 is a flowchart illustrating an outline of a slide control process performed by the mobile device according to the second embodiment of the present invention.
  • FIG. 24 is a diagram schematically illustrating the relationship between the distal end portion of the endoscope and the subject when the size of the subject image changes.
  • FIG. 25 is a diagram schematically showing changes in the subject image in the case shown in FIG. 24.
  • FIG. 26 is a diagram schematically illustrating a situation in which the moving device has advanced from the situation illustrated in FIG. 24 and has approached the subject.
  • FIG. 27 is a diagram schematically showing changes in the subject image in the case shown in FIG. 26.
  • FIG. 28 is a flowchart illustrating an outline of slide control processing performed by the imaging apparatus according to Embodiment 2 of the present invention.
  • An imaging system includes a steered body in which an imaging device is mounted on a moving device.
  • The steered object detects information necessary for determining whether or not it can move in accordance with a change in the relative relationship between an imaging target specified in advance and the imaging apparatus, determines, using the detected information, whether or not movement according to the change in the relative relationship is possible, and performs control according to the determination result.
  • In the following, cases where the mobile device constituting the steered body is an unmanned aerial vehicle (Embodiment 1) and an endoscope (Embodiment 2) are described by way of example. These are merely examples, and the mobile device may also be, for example, a self-running robot.
  • the moving device is an unmanned aerial vehicle.
  • the imaging device is mounted on an unmanned aerial vehicle.
  • The mobile device is wirelessly connected so as to be communicable with the operating device, is driven according to instructions from the operating device, and receives instruction signals for the imaging device and relays them to the imaging device.
  • the steered object has a lock-on function, and tracks the shooting target while sequentially determining whether or not the movement of the shooting target can be tracked.
  • FIG. 1 is a schematic diagram showing a schematic configuration of an imaging system according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram illustrating a functional configuration of the imaging system according to the first embodiment.
  • An imaging system 1 shown in FIGS. 1 and 2 includes a steered body 2 that has an imaging function and can be moved by flying, and an operating device 3 that steers imaging and movement of the steered body 2.
  • the steered body 2 includes a moving device 4 and an imaging device 5 that is detachably attached to the moving device 4.
  • the imaging device 5 may be attached to the moving device 4 via a stabilizer, a vibration correction mechanism (for example, a gimbal), a rig, or the like.
  • The mobile device 4 and the imaging device 5 are connected to each other via a cable 6, such as a USB (Universal Serial Bus) cable, so as to be capable of bidirectional communication.
  • the mobile device 4 and the controller device 3 can perform wireless communication in a predetermined frequency band.
  • In the first embodiment, the mobile device 4 and the imaging device 5 are connected by the cable 6; however, the present invention is not limited to this, and a configuration capable of wireless communication may be adopted.
  • the positional relationship between the moving device 4 and the imaging device 5 may be fixed, or the posture of the imaging device 5 with respect to the moving device 4 may be changed.
  • The steered body 2 includes a steered body control unit 21 that detects information necessary for determining whether or not the steered body 2 can move according to a change in the relative relationship between an imaging target specified in advance and the imaging device 5, determines, using the detected information, whether movement according to the change in the relative relationship is possible, and performs control according to the determination result.
  • the steered body control unit 21 is configured using a CPU (Central Processing Unit) or the like, and controls the steered body 2 according to various instruction signals transmitted from the operation device 3.
  • the steered body control unit 21 includes a first control unit 49 of the moving device 4 and a second control unit 56 of the imaging device 5 which will be described later.
  • The steered body control unit 21 is a circuit unit that performs various types of control by specific sequence control in cooperation with dedicated circuits or programs, and may include an artificial intelligence circuit unit as necessary so that control using the results of deep learning, machine learning, and the like can be performed (the same applies to the third control unit 36 of the controller device 3). The configuration and circuits of this control unit are classified by function into the first control unit 49 and the second control unit 56; some of the individual functions of the first control unit 49 and the second control unit 56 may be included in the other control unit.
  • the moving device 4 determines whether or not the moving device 4 can move according to a change in the relative relationship between the imaging object specified in advance and the imaging device 5, and performs control according to the determination result.
  • The moving device 4 is an unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle), and is configured as a rotary-wing drone having four rotors 4a.
  • the number of rotors 4a is not limited to four and may be other numbers.
  • The moving device 4 is not limited to a rotary-wing drone, and may be constituted by another type of unmanned aircraft, such as a fixed-wing drone.
  • the mobile device 4 may be any device that can be operated wirelessly and can be self-propelled, and may be, for example, a self-propelled robot, a car, a model car, a ship, or the like.
  • The mobile device 4 includes a propulsion unit 41, a power source 42, a spatial information acquisition unit 43, a position/orientation detection unit 44, an altitude/attitude detection unit 45, a first recording unit 46, a first communication unit 47, a second communication unit 48, and a first control unit 49.
  • The propulsion unit 41 causes the mobile device 4 to fly using a plurality of rotors 4a (four in the example of FIG. 1) and a plurality of motors (not shown) that drive the rotors 4a, under the control of the first control unit 49.
  • the power source 42 is configured using a battery and a booster circuit.
  • the power source 42 supplies a predetermined voltage to each part of the moving device 4.
  • the spatial information acquisition unit 43 is configured using a laser radar or the like.
  • The spatial information acquisition unit 43 radially irradiates pulsed laser light from the moving device 4 and measures the return time of the reflected pulses, thereby acquiring spatial information about objects around the moving device 4 (the direction of each irradiation point and the distance to each irradiation point), and outputs the acquired result to the first control unit 49.
  • the spatial information acquisition unit 43 may acquire the spatial information by irradiating ultrasonic waves instead of the laser.
  • the spatial information acquisition unit 43 may further include a non-contact type proximity sensor.
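  • As a concrete illustration of the time-of-flight principle described above, the following sketch converts the measured round-trip time of a reflected pulse into a distance per irradiation point. The function names and the scan-sample structure are illustrative assumptions, not taken from the patent.

```python
# Minimal sketch of pulse time-of-flight ranging; all names are illustrative.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the irradiation point from the round-trip time of the
    reflected pulse (out and back, hence the division by two)."""
    return C * round_trip_seconds / 2.0

def build_spatial_info(scan):
    """Turn (azimuth_deg, elevation_deg, round_trip_s) samples into the
    per-point direction/distance records the spatial information
    acquisition unit 43 is described as reporting."""
    return [
        {"azimuth_deg": az, "elevation_deg": el, "distance_m": tof_distance(t)}
        for az, el, t in scan
    ]

# A pulse returning after about 66.7 ns corresponds to roughly 10 m.
print(build_spatial_info([(0.0, -5.0, 66.7e-9)]))
```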
  • the position / orientation detection unit 44 is configured using a GPS (Global Positioning System) receiver, a magnetic direction sensor, or the like.
  • The position/orientation detection unit 44 detects current position information regarding the current position of the mobile device 4 and direction information regarding the angle (azimuth) formed by the nose direction of the mobile device 4 (the direction in which the nose faces) with respect to a predetermined reference orientation (for example, north), and outputs the detection results to the first control unit 49.
  • the altitude posture detection unit 45 is configured using an atmospheric pressure sensor, a gyro sensor (angular velocity sensor), an inclination sensor (acceleration sensor), and the like.
  • The altitude/attitude detection unit 45 detects altitude information regarding the altitude of the moving device 4, tilt angle information regarding the tilt angle of the moving device 4 (the tilt angle with respect to the reference posture in which the four rotors 4a are positioned in a horizontal plane), and rotation angle information regarding the rotation angle of the moving device 4 (the rotation angle about a vertical line passing through the center position of the four rotors 4a), and outputs the detection results to the first control unit 49.
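  • The altitude reading from an atmospheric pressure sensor is conventionally obtained with the international barometric formula; the patent does not spell out the conversion, so the following is a standard-formula sketch rather than the device's actual computation.

```python
def pressure_to_altitude(pressure_hpa: float, sea_level_hpa: float = 1013.25) -> float:
    """Standard barometric formula: altitude in meters from static pressure,
    assuming the international standard atmosphere."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

# Example: about 898.75 hPa corresponds to roughly 1,000 m above sea level.
print(round(pressure_to_altitude(898.75), 1))
```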
  • the first recording unit 46 is configured using a recording medium such as a flash memory or an SDRAM (Synchronous Dynamic Random Access Memory).
  • the first recording unit 46 records various programs for driving the mobile device 4 and temporarily records information being processed.
  • the first communication unit 47 is configured using a predetermined communication module.
  • the first communication unit 47 performs wireless communication with the operation device 3 that remotely controls the flight operation of the mobile device 4 under the control of the first control unit 49. Further, the first communication unit 47 transmits information input from the imaging device 5 to the operation device 3 under the control of the first control unit 49.
  • the second communication unit 48 is configured using a communication module.
  • the second communication unit 48 communicates with the imaging device 5 via the cable 6 under the control of the first control unit 49.
  • The first control unit 49 determines whether or not the moving device 4 can move in accordance with a change in the relative relationship between the imaging target specified in advance and the imaging device 5, and performs control according to the determination result.
  • the first control unit 49 is configured using a CPU or the like.
  • the first control unit 49 receives an instruction signal from the operation device 3 input via the first communication unit 47 and the imaging device 5 input via the second communication unit 48. In accordance with the data, the operation of the propulsion unit 41 (the flight operation of the moving device 4) and the communication with the imaging device 5 and the operation device 3 are controlled.
  • the first control unit 49 includes a power supply determination unit 491, a movement determination unit 492, an attitude determination unit 493, a propulsion control unit 494, a direction control unit 495, and a first communication control unit 496.
  • The power source determination unit 491 detects the remaining amount of the power source 42 (remaining power) and, based on the detection result, determines the flight time and flight distance available to the mobile device 4.
  • The movement determination unit 492 determines the moving distance of the moving device 4 based on the spatial information acquired by the spatial information acquisition unit 43, the current position information and direction information detected by the position/orientation detection unit 44, the altitude information detected by the altitude/attitude detection unit 45, the flight time information determined by the power source determination unit 491, and the like. Note that the function of the movement determination unit 492 may be included in the second control unit 56 of the imaging device 5.
  • the posture determination unit 493 determines the posture of the imaging device 5 based on the tilt angle information and the rotation angle information detected by the altitude posture detection unit 45.
  • The propulsion control unit 494 propels the mobile device 4 based on the determination results of the power source determination unit 491, the movement determination unit 492, and the posture determination unit 493. Specifically, the propulsion control unit 494 raises, advances, stops, reverses, and lowers the moving device 4 by independently controlling the rotational speeds of the four rotors 4a. For example, the propulsion control unit 494 changes the moving direction of the moving device 4 forward and backward (the nose direction of the moving device 4 being forward) or left and right (left and right as viewed in the nose direction of the moving device 4).
  • The direction control unit 495 rotates the moving device 4 based on the determination results of the power source determination unit 491, the movement determination unit 492, and the posture determination unit 493. Specifically, the direction control unit 495 rotates the moving device 4 by independently controlling the rotational speeds of the four rotors 4a.
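  • Independent control of the four rotor speeds, as described above, is commonly realized with a "mixer" that maps throttle, roll, pitch, and yaw commands to per-rotor outputs. The patent does not specify the mixing law; the X-configuration mixer below is a conventional sketch, not the device's actual control law.

```python
def quad_x_mixer(throttle: float, roll: float, pitch: float, yaw: float):
    """Map normalized commands to four rotor speeds of an X-configuration
    quadcopter. Diagonal rotor pairs spin in opposite directions, so a yaw
    command is produced by a torque imbalance between the pairs."""
    front_right = throttle - roll + pitch + yaw
    rear_left   = throttle + roll - pitch + yaw
    front_left  = throttle + roll + pitch - yaw
    rear_right  = throttle - roll - pitch - yaw
    # Clamp each output to the valid actuator range [0, 1].
    return [max(0.0, min(1.0, m))
            for m in (front_right, rear_left, front_left, rear_right)]

print(quad_x_mixer(0.5, 0.0, 0.0, 0.0))   # pure climb: all rotors equal
print(quad_x_mixer(0.5, 0.0, -0.2, 0.0))  # nose down: rear rotors faster
```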
  • The first communication control unit 496 controls communication between the first communication unit 47 and the second communication unit 48. Specifically, the first communication control unit 496 transmits information input from the imaging device 5 via the second communication unit 48 to the controller device 3 via the first communication unit 47. In addition, the first communication control unit 496 transmits information including operation information input from the operation device 3 via the first communication unit 47 (hereinafter referred to as "moving device information") to the imaging device 5 via the second communication unit 48.
  • The moving device information includes the current position of the moving device 4, control information related to a control signal for controlling the movement of the moving device 4 transmitted from the operating device 3, the moving direction and moving speed of the moving device 4, the altitude of the moving device 4, and the state of the moving device 4 (for example, normal movement, return movement, remaining battery level).
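  • A sketch of how the "moving device information" enumerated above might be bundled for transmission to the imaging device 5; the dataclass and its field names are illustrative assumptions, since the patent names the contents but not a concrete message format.

```python
from dataclasses import dataclass
from enum import Enum

class DeviceState(Enum):
    NORMAL_MOVEMENT = "normal"   # states named in the text
    RETURN_MOVEMENT = "return"

@dataclass
class MovingDeviceInfo:
    """Illustrative bundle of the items listed above."""
    latitude: float       # current position of the moving device 4
    longitude: float
    altitude_m: float     # altitude of the moving device 4
    heading_deg: float    # moving direction
    speed_mps: float      # moving speed
    control_info: bytes   # control signal sent from the operating device 3
    state: DeviceState    # e.g. normal movement or return movement
    battery_pct: float    # remaining battery level
```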
  • The imaging device 5 detects information necessary for the moving device 4 to determine whether or not it can move according to a change in the relative relationship between the imaging target specified in advance and the imaging device 5 itself, and transmits the detected information to the moving device 4.
  • the imaging device 5 is fixed so as not to move with respect to the moving device 4 in a state where the shooting direction is aligned with the nose direction of the moving device 4.
  • the imaging device 5 can perform still image shooting and moving image shooting.
  • the imaging device 5 may be fixed to the moving device 4 via a rotation mechanism that can rotate with respect to the moving device 4.
  • the imaging device 5 may be fixed to the moving device 4 via a stabilizer, a vibration correction mechanism (for example, a gimbal), a rig, or the like.
  • the imaging device 5 includes an imaging unit 51, an elevation angle direction detection unit 52, a voice input unit 53, a third communication unit 54, a second recording unit 55, and a second control unit 56.
  • the imaging unit 51 captures an image of a subject that is an object to be imaged under the control of the second control unit 56 and generates image data.
  • the imaging unit 51 includes an optical system 511 and an imaging element 512.
  • the optical system 511 includes a focus lens, a zoom lens, a shutter, and a diaphragm.
  • the optical system 511 forms a subject image on the light receiving surface of the image sensor 512.
  • the optical system 511 changes each shooting parameter such as zoom magnification, focus position, aperture value, and shutter speed under the control of the second control unit 56.
  • the image sensor 512 receives the subject image formed by the optical system 511 and performs photoelectric conversion to generate image data, and outputs the image data to the second control unit 56.
  • the imaging element 512 is configured using a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor), or the like.
  • The imaging element 512 includes pixels arranged in a two-dimensional matrix that generate image data.
  • The image sensor 512 may further include phase difference pixels for detecting the distance to the subject, or the image-data-generating pixels described above may also serve as phase difference pixels.
  • The elevation angle/azimuth detection unit 52 is configured using a magnetic azimuth sensor, a triaxial acceleration sensor, a gyro sensor, and the like.
  • The elevation angle/azimuth detection unit 52 detects elevation angle information regarding the inclination angle (elevation angle) of the imaging device 5 (the optical axis of the optical system 511) with respect to the horizontal plane, and azimuth information regarding the angle formed by the shooting direction of the imaging device 5 with respect to a predetermined reference orientation (for example, north).
  • the audio input unit 53 is configured using a microphone or the like that collects external sounds and generates audio data.
  • the third communication unit 54 communicates with the mobile device 4 via the cable 6 under the control of the second control unit 56.
  • the second recording unit 55 is configured using a flash memory, an SDRAM, a memory card, and the like.
  • the second recording unit 55 includes an image data recording unit 551 that records image data generated by the imaging unit 51, and a program recording unit 552 that records various programs executed by the imaging device 5.
  • the second control unit 56 detects information necessary for the moving device 4 to determine whether or not it can move in accordance with a change in the relative relationship between the photographing object specified in advance and itself, The detected information is transmitted to the mobile device 4.
  • the second control unit 56 is configured using a CPU or the like.
  • The second control unit 56 controls the imaging of the imaging unit 51 and transmits the image data generated by the imaging unit 51 to the mobile device 4.
  • Instead of transmitting the image data as it is, the second control unit 56 may transmit data thinned out temporally or in pixels (resized image data) or compressed image data. The second control unit 56 may also transmit the image data after performing processing that increases visibility (for example, contrast enhancement processing or exposure value reduction processing), or may transmit the result of analyzing the image (the feature amount of the image extracted by the subject detection unit 562, described later).
  • The second control unit 56 includes an image processing unit 561, a subject detection unit 562, a tracking processing unit 563, a distance calculation unit 564, a trimming unit 565, a second communication control unit 566, and an imaging control unit 567.
  • the image processing unit 561 performs predetermined image processing on the image data generated by the imaging unit 51. Specifically, the image processing unit 561 performs gain-up processing, white balance processing, gradation processing, synchronization processing, and the like on the image data.
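  • Of the corrections listed above, white balance is the simplest to illustrate; the gray-world method below is a generic stand-in, not the algorithm the image processing unit 561 actually uses.

```python
import numpy as np

def gray_world_white_balance(rgb: np.ndarray) -> np.ndarray:
    """Gray-world white balance: scale each channel so the average color
    of the frame becomes neutral gray."""
    means = rgb.reshape(-1, 3).mean(axis=0)          # per-channel means
    gains = means.mean() / np.maximum(means, 1e-6)   # per-channel gains
    return np.clip(rgb * gains, 0.0, 255.0).astype(np.uint8)

frame = np.random.randint(0, 256, (480, 640, 3)).astype(np.float64)
balanced = gray_world_white_balance(frame)
```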
  • the subject detection unit 562 detects a subject included in the image data by extracting the feature amount of the image subjected to the image processing by the image processing unit 561.
  • Examples of the subject include a photographing target designated in advance and an object (obstacle) existing between the imaging device and the photographing target.
  • The subject detection unit 562 detects information on these subjects based on the luminance value of each pixel in the image data, the amount of movement of representative pixels from the previous frame, and the like.
  • the subject detection unit 562 may include an artificial intelligence circuit unit as necessary to perform detection using results such as deep learning using big data and machine learning. Note that detection may be performed with reference to touch designation on the user's screen or voice designation.
  • A subject image to be treated as the specific subject is designated and input. The designation result is a part of the image data; the image data itself may be recorded as the designation result, or the color distribution, outline, size, and the like may be converted into data and recorded as an image feature amount.
  • Alternatively, a system may be constructed in which artificial intelligence infers, through deep learning, machine learning, or the like, the target information that the user wants to shoot. This makes it possible to perform learning using frequently photographed images as teacher images.
  • the tracking processing unit 563 tracks the subject in the image based on information detected by the subject detection unit 562 for a preset subject. At this time, the tracking processing unit 563 tracks the subject by performing pattern matching processing based on information detected by the subject detection unit 562, for example. Similarly to the subject detection unit 562, the tracking processing unit 563 may include an artificial intelligence circuit unit as necessary to perform detection using the results of deep learning using big data, machine learning, or the like.
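  • The pattern matching that the tracking processing unit 563 is described as performing can be sketched with normalized cross-correlation template matching (here via OpenCV); this is a generic stand-in, and the lost threshold is an assumed value.

```python
import cv2
import numpy as np

def track_by_template(frame: np.ndarray, template: np.ndarray):
    """Locate the designated subject patch in the current frame by
    normalized cross-correlation; returns (top-left corner, match score)."""
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc, max_val

# A low score could be treated as the subject being lost (or about to be
# lost), the condition under which lost information is transmitted.
LOST_THRESHOLD = 0.5  # illustrative value, not from the patent
```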
  • The distance calculation unit 564 calculates the distance from the imaging device 5 to the subject based on a contrast change between two pieces of image data generated at different times by the imaging unit 51, or based on phase difference information from the phase difference pixels provided in the imaging element 512.
  • the distance calculation unit 564 may determine the size of the subject and calculate the distance to the subject based on a change in the size of the subject between frames.
  • the distance calculation unit 564 may calculate the distance to the subject using the spatial information acquired by the spatial information acquisition unit 43 of the mobile device 4.
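  • Under a pinhole-camera assumption, the apparent size of a subject is inversely proportional to its distance, so the size-change method mentioned above reduces to a simple ratio; a sketch under that assumption:

```python
def distance_from_size_change(prev_distance_m: float,
                              prev_size_px: float,
                              curr_size_px: float) -> float:
    """Pinhole assumption: apparent size is inversely proportional to
    distance, so d2 = d1 * (s1 / s2)."""
    return prev_distance_m * prev_size_px / curr_size_px

# Subject shrank from 120 px to 100 px: it moved from 5 m to 6 m away.
print(distance_from_size_change(5.0, 120.0, 100.0))
```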
  • The information on the distance to the subject can be used for focusing the optical system 511, and is also important to the user who operates the moving device 4 as distance information between the moving device 4 and the subject, and between the moving device 4 and other objects.
  • For example, the user 100 may operate the operation device 3 to designate stationary control such as hovering, or the first control unit 49 of the moving device 4 may automatically perform the same stationary control.
  • the distance information to the subject can be effectively used when, for example, the subject is photographed in a predetermined order while changing the distance, or when the subject is photographed from various angles with the distance kept constant.
  • the distance information can also be used when the imaging device 5 performs shooting to acquire accurate 3D information.
  • the functions of the subject detection unit 562, the tracking processing unit 563, and the distance calculation unit 564 may be included in the first control unit 49 of the moving device 4.
  • the trimming unit 565 generates trimming image data and thumbnail image data by performing trimming processing on the image data subjected to image processing by the image processing unit 561.
  • the second communication control unit 566 performs communication control of the third communication unit 54. Specifically, the second communication control unit 566 transmits information that can be displayed on the display unit 33 of the controller device 3 to the mobile device 4 via the third communication unit 54.
  • the imaging control unit 567 controls imaging of the imaging unit 51. Specifically, when the imaging control unit 567 receives an instruction signal instructing imaging from the mobile device 4 via the third communication unit 54, the imaging control unit 567 causes the imaging unit 51 to perform imaging according to the instruction signal. In the first embodiment, the imaging device 5 can perform moving image shooting and still image shooting.
  • the controller device 3 includes a fourth communication unit 31, an operation input unit 32, a display unit 33, an audio output unit 34, a third recording unit 35, and a third control unit 36.
  • the fourth communication unit 31 is configured using a communication module.
  • The fourth communication unit 31 transmits instruction signals received by the operation input unit 32 by performing wireless communication with the first communication unit 47 of the mobile device 4 under the control of the third control unit 36, and receives the image data captured by the imaging device 5 and outputs it to the third control unit 36.
  • the operation input unit 32 is configured by using a button, a switch, a cross key, or the like that receives an operation by the user, receives an input of an instruction signal corresponding to the operation by the user, and outputs the instruction signal to the third control unit 36.
  • the operation input unit 32 includes a touch panel that is provided so as to overlap the display unit 33 and receives an input of a signal corresponding to the contact position of an object from the outside.
  • the operation input unit 32 may further include a voice input unit such as a microphone, and may receive an input of an instruction signal corresponding to the user's voice input.
  • the display unit 33 displays various information related to the moving device 4 under the control of the third control unit 36.
  • the display unit 33 is configured using a display panel such as liquid crystal or organic EL (Electro Luminescence).
  • the display unit 33 displays an image corresponding to the image data generated by the imaging device 5 and various types of information regarding the imaging device 5 under the control of the third control unit 36.
  • the audio output unit 34 is configured using a speaker or the like that outputs audio, and outputs sound recorded corresponding to data such as a moving image captured by the imaging device 5.
  • FIG. 3 is a diagram schematically showing the external configuration of the operating device 3.
  • the operation device 3 shown in the figure is provided with a display unit 33 and an operation input unit 32 on the surface.
  • the operation input unit 32 includes a left stick 32a, a right stick 32b, and a touch panel 32c arranged on the left and right.
  • FIG. 4 is a diagram schematically showing operation assignment of the operation input unit 32 of the operation device 3 in the normal operation mode.
  • The forward/backward movement of the left stick 32a corresponds to forward and backward movement, respectively.
  • the left and right movements of the left stick 32a correspond to left and right rotations, respectively.
  • the forward / backward movement of the right stick 32b corresponds to ascending and descending, respectively.
  • the right and left movements of the right stick 32b correspond to the left and right dolly, respectively.
  • a stick operation corresponding to the operation may be performed after the user selects a desired operation on the touch panel 32c.
  • the display unit 33 shows the assignment of the stick operation after the selection input on the touch panel 32c. Further, all operations may be performed on the touch panel 32c.
  • the third recording unit 35 records various programs related to the operation device 3.
  • the third recording unit 35 is configured using a flash memory or an SDRAM.
  • the third control unit 36 is configured using a CPU and controls each unit of the controller device 3.
  • The third control unit 36 includes a mode control unit 361 that controls the operation mode of the steered body 2, a third communication control unit 362 that controls communication between the fourth communication unit 31 and the first communication unit 47 of the mobile device 4, and a first display control unit 363 that controls the display of the display unit 33.
  • The mode control unit 361 can set a normal operation mode and a lock-on mode as the operation mode of the steered body 2.
  • The normal operation mode is a mode in which the steered body 2 operates based on operation signals whose input is received by the controller device 3.
  • the lock on mode is a mode for tracking a subject to be locked on.
  • the tracking process is performed so that the lock-on target subject is at the same position in the screen and the distance to the lock-on target subject remains the same.
  • The normal lock-on mode is a mode in which the subject is kept within the angle of view but the orientation of the subject 200 in the screen does not matter, as shown in FIG. 5.
  • the angle lock-on mode is a mode in which the tracking process is performed so that the orientation of the subject 200 in the screen remains the same as shown in FIG. Therefore, for example, when the subject to be locked on is a person, the face direction of the person may change in the normal lock-on mode, while the face direction of the person does not change in the angle lock-on mode.
  • When shifting from the normal operation mode to the lock-on mode, the user first touches the subject 200 to be locked on on the screen of the display unit 33, as shown in FIG. 7. Thereafter, as shown in FIG. 8, the display unit 33 displays a normal lock-on mode icon 33a, an angle lock-on mode icon 33b, and a return button 33c.
  • The mode control unit 361 of the controller device 3 performs processing according to the touched icon or button. Specifically, the mode control unit 361 sets the lock-on mode corresponding to the touched icon, or resets the normal operation mode when the return button 33c is touched.
  • the composition adjustment mode is a mode in which the operation device 3 accepts input of only an instruction signal for composition adjustment for an image displayed on the display unit 33 and does not accept input of an instruction signal related to movement of the moving device 4.
  • When the composition adjustment mode is set, the user can concentrate on composition adjustment without being distracted by the operation of the mobile device 4.
  • FIG. 9 is a diagram showing a display example on the display unit 33 when the composition adjustment mode is set.
  • the display unit 33 displays a return button 33c, a shooting icon 33d, a zoom icon 33e, an ISO icon 33f, an aperture icon 33g, a shutter speed icon 33h, and an exposure correction icon 33i.
  • When the return button 33c is touched, the display unit 33 returns to the previous display.
  • When the shooting icon 33d is touched, the controller device 3 transmits a signal instructing the imaging device 5 to perform shooting to the moving device 4.
  • When any one of the zoom icon 33e, the ISO icon 33f, the aperture icon 33g, the shutter speed icon 33h, and the exposure correction icon 33i is touched, the controller device 3 transitions to a state in which it can accept a change of the parameter corresponding to the touched icon.
  • FIG. 10 is a diagram schematically illustrating an example of operation assignment in the operation input unit 32 when the composition adjustment mode is set.
  • the forward / backward movement of the left stick 32a corresponds to an angle change by rotation (pitch) with the horizontal direction of the machine body as the rotation center.
  • the left / right movement of the left stick 32a corresponds to an angle change by rotation (roll) with the front-rear direction of the machine body as the center of rotation.
  • the forward direction and the backward direction of the right stick 32b correspond to the up and down of the selected parameter among the zoom icon 33e, ISO icon 33f, aperture icon 33g, shutter speed icon 33h, and exposure correction icon 33i, respectively.
  • the left direction of the right stick 32b corresponds to the left slide, and the right direction corresponds to the right slide.
  • the zoom up / down operation may be received by pinch out / pinch in on the touch panel 32c.
  • the left and right slide operations (dolly) may be received by left and right slides on the touch panel 32c.
  • The operating device 3 does not need to have a configuration with sticks; for example, the operation device 3 may be realized by a mobile terminal such as a smartphone.
  • the input of the operation signal by the user may be realized by voice input, or may be realized by changing the direction of the main body of the operation device 3 or shaking the main body of the operation device 3.
  • When the controller device 3 is powered on (step S101: Yes), the mode control unit 361 sets the operation mode to the normal operation mode (step S102). When it is not powered on (step S101: No), the controller device 3 repeats step S101.
  • the third communication control unit 362 establishes communication with the mobile device 4 (step S103).
  • the first display control unit 363 receives the image data from the mobile device 4 and starts displaying the image corresponding to the received image data on the display unit 33 (step S104).
  • the controller device 3 transmits and receives necessary data in addition to the image data after establishing communication with the mobile device 4.
  • First, the case where the operation input unit 32 does not accept an input of the mode change instruction signal (step S105: No) will be described. In this case, when the operation input unit 32 accepts an input of an operation signal (step S106: Yes), a normal control signal corresponding to the operation signal is transmitted to the mobile device 4 (step S107).
  • The normal control signal here includes a signal instructing the imaging device 5 to capture a still image or a moving image, a signal requesting information on the remaining level of the battery that supplies power to the moving device 4, and a signal steering the flight of the moving device 4.
  • When the operation input unit 32 does not accept an input of an operation signal (step S106: No), the controller device 3 proceeds to step S108.
  • In step S108, when the power of the controller device 3 is cut off (step S108: Yes), the controller device 3 ends the process. On the other hand, when the power is not cut off (step S108: No), the controller device 3 returns to step S105 if the operation mode is the normal operation mode (step S109: Yes). If the operation mode is not the normal operation mode in step S109 (step S109: No), the controller device 3 returns to step S102.
  • Next, the case where the operation input unit 32 accepts an input of the mode change instruction signal (step S105: Yes) will be described.
  • The change of the operation mode is a change to either the normal lock-on mode or the angle lock-on mode; in either mode, it is necessary to designate the subject to be locked on.
  • The designation is realized by a touch panel (a part of the operation input unit 32) provided on the screen of the display unit 33 detecting a touch on the subject to be locked on and receiving an input signal based on the touched position.
  • The mode control unit 361 changes the setting of the operation mode to the normal lock-on mode or the angle lock-on mode, and transmits a mode change signal instructing the change to the normal or angle lock-on mode and the position information of the subject to be locked on to the moving device 4 (step S110).
  • When the operation input unit 32 accepts an input of an instruction signal for changing from one of the normal lock-on mode and the angle lock-on mode to the other (step S111: Yes), the mode control unit 361 sets the operation mode to the other mode, and transmits a mode change signal instructing the change to the normal or angle lock-on mode and the position information of the subject to be locked on to the moving device 4 (step S112).
  • When the operation input unit 32 accepts an input of a signal for ending the normal or angle lock-on mode (step S113: Yes), the controller device 3 proceeds to step S108. If the power is not cut off in step S108 (step S108: No), the operation mode is not the normal operation mode (step S109: No), and the controller device 3 returns to step S102. In step S102, the mode control unit 361 sets the operation mode to the normal operation mode.
  • Even in the lock-on mode, the imaging device 5 may be allowed to capture a still image or a moving image.
  • When the operation input unit 32 does not accept an input of an instruction signal for changing from one of the normal lock-on mode and the angle lock-on mode to the other (step S111: No), the controller device 3 proceeds to step S113.
  • When an input of the signal for ending the normal or angle lock-on mode is not accepted (step S113: No), the controller device 3 proceeds to step S114.
  • When the operation input unit 32 receives an input of an instruction signal for changing to the composition adjustment mode (step S114: Yes), the mode control unit 361 changes the operation mode setting to the composition adjustment mode (step S115).
  • When the operation input unit 32 receives an input of a zoom operation signal (step S116: Yes), the third communication control unit 362 performs zoom control communication with the mobile device 4 (step S117).
  • When the operation input unit 32 does not accept the input of the zoom operation signal (step S116: No), the controller device 3 proceeds to step S118.
  • When the operation input unit 32 receives an input of a slide operation (step S118: Yes), the third communication control unit 362 performs slide control communication with the mobile device 4 (step S119). On the other hand, when the operation input unit 32 does not accept the input of the slide operation (step S118: No), the controller device 3 proceeds to step S120.
  • When the operation input unit 32 receives an input of an angle change operation (step S120: Yes), the third communication control unit 362 performs angle change control communication with the mobile device 4 (step S121). On the other hand, when the operation input unit 32 does not accept the input of the angle change operation (step S120: No), the controller device 3 proceeds to step S122.
  • When the operation input unit 32 receives an input of an exposure change operation (step S122: Yes), the third communication control unit 362 performs exposure change control communication with the mobile device 4 (step S123). On the other hand, when the operation input unit 32 does not accept the input of the exposure change operation (step S122: No), the controller device 3 proceeds to step S124.
  • When the operation input unit 32 receives an input of a focus change operation (step S124: Yes), the third communication control unit 362 performs focus change control communication with the mobile device 4 (step S125). On the other hand, when the operation input unit 32 does not accept the input of the focus change operation (step S124: No), the controller device 3 proceeds to step S126.
  • When the operation input unit 32 receives an input of a shooting operation (step S126: Yes), the third communication control unit 362 performs shooting control communication with the mobile device 4 (step S127). When the operation input unit 32 does not accept the input of the shooting operation (step S126: No), the controller device 3 proceeds to step S128.
  • When the operation input unit 32 receives an input of a signal for ending the composition adjustment mode (step S128: Yes), the controller device 3 returns to step S108. If the power is not cut off in step S108 (step S108: No), the operation mode is not the normal operation mode (step S109: No), and the controller device 3 returns to step S102, where the mode control unit 361 sets the operation mode to the normal operation mode.
  • When the operation input unit 32 does not accept the input of the signal for ending the composition adjustment mode (step S128: No), the controller device 3 returns to step S116.
  • When the operation input unit 32 does not accept the input of the instruction signal for changing to the composition adjustment mode (step S114: No), the controller device 3 returns to step S111.
  • The moving device 4 determines whether or not it can move in accordance with a change in the relative relationship between the imaging target designated in advance and the imaging device 5, and performs control according to the determination result read from the first recording unit 46. Detailed processing of the moving device 4 including such movement control processing will be described with reference to the flowcharts shown in FIGS. 12A and 12B.
  • When the mobile device 4 is powered on (step S201: Yes), the first communication control unit 496 establishes communication with the imaging device 5 and the operation device 3 (step S202). If the mobile device 4 is not powered on (step S201: No), the mobile device 4 repeats step S201.
  • the first communication control unit 496 starts receiving image data from the imaging device 5 and transmitting image data to the operation device 3 (step S203). Thereafter, the moving device 4 receives the image data from the imaging device 5 at a predetermined interval, and transmits the image data to the operation device 3. The mobile device 4 transmits / receives necessary data in addition to the image data after establishing communication with the imaging device 5 and the operation device 3.
  • When the mobile device 4 receives a normal control signal from the controller device 3 (step S204: Yes), the first control unit 49 performs processing according to the normal control signal (step S205).
  • The processing according to the normal control signal here refers to, for example, transmitting a still image shooting start instruction signal or a moving image shooting start or end instruction signal to the imaging device 5, transferring the image data received from the imaging device 5 to the operation device 3, and confirming the remaining battery level of the mobile device 4. Thereafter, the mobile device 4 proceeds to step S206.
  • When the mobile device 4 does not receive the normal control signal (step S204: No), the mobile device 4 proceeds to step S206.
  • When the moving device 4 receives the mode change signal to the normal or angle lock-on mode and the position information of the subject to be locked on (step S206: Yes), the first control unit 49 stores the lock-on reference position in the first recording unit 46 (step S207), and transmits a signal instructing tracking of the lock-on target subject and the position information of the lock-on target subject to the imaging device 5 (step S208).
  • When the moving device 4 does not receive the mode change signal to the normal or angle lock-on mode and the position information of the subject to be locked on (step S206: No), the moving device 4 proceeds to step S224, described later.
  • FIG. 13 is a flowchart showing an outline of subject tracking control processing performed by the moving device 4.
  • The moving device 4 acquires, from the imaging device 5, the distance from the imaging device 5 to the lock-on target subject calculated by the imaging device 5, the amount of deviation of the subject from the reference position, and, when there is an obstacle, the distance to the obstacle (step S301).
  • The "reference position" here is the center of the screen in the initial setting; when a slide operation has been performed, it is the position to which the center has been slid.
  • The "deviation amount" here is a three-dimensional deviation amount. The detection of the distances to the subject and the obstacle by the imaging device 5 and the calculation of the shift amount will be described in detail when the processing of the imaging device 5 is described.
  • the movement determination unit 492 calculates the movement distance of the moving device 4 for maintaining the position and size of the subject based on the acquired deviation amount (step S302).
  • Specifically, the movement determination unit 492 calculates the movement distance ΔL in the direction orthogonal to the screen from the distance L to the subject 200, the size of the subject image before the subject 200 moves, and the size of the subject image after the subject 200 moves.
  • The moving distance ΔL takes a positive value in the direction approaching the subject 200. In other words, when the subject 200 moves and becomes smaller on the screen, the moving distance is positive, and when the subject 200 moves and becomes larger on the screen, the moving distance is negative.
  • The movement distance in a direction parallel to the screen may be calculated in the same way.
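  • Combining the sign convention above with the pinhole relation used for the size-based distance estimate gives one consistent reading of the calculation in step S302: ΔL = L × (1 − B/A), with A the on-screen size before the subject moves and B the size after. This formula is an assumption consistent with the text, not an expression quoted from the patent.

```python
def movement_distance(distance_m: float,
                      size_before_px: float,
                      size_after_px: float) -> float:
    """One consistent reading of step S302: to restore the subject's
    on-screen size, move by dL = L * (1 - B/A). Positive (approach) when
    the subject shrank on screen, negative (back away) when it grew."""
    return distance_m * (1.0 - size_after_px / size_before_px)

print(movement_distance(6.0, 120.0, 100.0))   # +1.0 m: approach
print(movement_distance(5.0, 100.0, 120.0))   # -1.0 m: back away
```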
  • When there is an obstacle on the movement route (step S303: Yes) and the movement distance of the moving device 4 is less than the distance to the obstacle (step S304: Yes), the movement determination unit 492 determines that the movement of the moving device 4 is possible (step S305). The movement determination unit 492 also determines that the movement of the moving device 4 is possible when there is no obstacle on the movement route (step S303: No) (step S305).
  • In this case, the propulsion control unit 494 performs control to follow the movement of the subject according to the movement distance calculated in step S302 so that the imaging device 5 can continue to capture the subject within the angle of view (step S306). Thereafter, the mobile device 4 proceeds to step S308.
  • When the moving distance of the moving device 4 is equal to or greater than the distance to the obstacle (step S304: No), the movement determination unit 492 determines that the moving device 4 cannot move (step S307). Thereafter, the mobile device 4 proceeds to step S308.
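  • Steps S303 to S307 amount to a single comparison between the required movement distance and the distance to the nearest obstacle on the route; a minimal sketch (names illustrative):

```python
from typing import Optional

def movement_allowed(move_distance_m: float,
                     obstacle_distance_m: Optional[float]) -> bool:
    """Movement is allowed when there is no obstacle on the route
    (step S303: No) or when the required movement distance is less than
    the distance to the obstacle (steps S304/S305); otherwise movement
    is judged impossible (step S307)."""
    if obstacle_distance_m is None:
        return True
    return move_distance_m < obstacle_distance_m
```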
  • When the moving device 4 is set to the angle lock-on mode (step S308: Yes), the first control unit 49 performs imaging direction tracking control (step S309).
  • FIG. 15 is a flowchart illustrating an outline of processing of imaging direction tracking control performed by the moving device 4.
• When the moving device 4 acquires, from the imaging device 5, information on a change in the orientation of the lock-on subject (step S401: Yes), the movement determination unit 492 calculates the movement required to follow the change (step S402) and determines whether or not the moving device 4 can move (step S403). The processing of the imaging device 5 at this time will be described later.
• For example, the movement determination unit 492 employs a model that approximates the subject's face as a sphere, and determines whether or not the moving device 4 can rotate to the reference angle while keeping its distance from the rotation center constant, with the center of the sphere as the rotation center.
• When movement is determined to be possible (step S403: Yes), the direction control unit 495 starts changing the imaging direction (step S404). For example, when the model that approximates the face as a sphere is employed, the direction control unit 495 rotates the moving device 4 by the angle calculated in step S402 while keeping the distance from the rotation center constant, with the center of the sphere as the rotation center.
• When the posture of the imaging device 5 with respect to the moving device 4 can be changed, the moving device 4 may move while maintaining its own posture and change the imaging direction by changing the posture of the imaging device 5 with respect to the moving device 4. In this case, the movement of the moving device 4 and the posture change of the imaging device 5 may be performed simultaneously, or the posture of the imaging device 5 may be changed after the moving device 4 has moved.
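• One way to realize the sphere model is sketched below; this is our own two-dimensional simplification (posture control and the vertical axis are ignored), not the embodiment's implementation.

```python
import math

def orbit_position(center_xy, radius, start_angle, delta_angle):
    """New (x, y) position after rotating by delta_angle radians around
    center_xy while keeping the distance `radius` constant, as in
    step S404 when the face is approximated by a sphere (here a circle)."""
    angle = start_angle + delta_angle
    return (center_xy[0] + radius * math.cos(angle),
            center_xy[1] + radius * math.sin(angle))

# Quarter turn around a subject 3 m away from the rotation center.
print(orbit_position((0.0, 0.0), 3.0, 0.0, math.pi / 2))  # approx. (0.0, 3.0)
```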
• When the lost information of the subject is received from the imaging device 5 while the imaging direction of the moving device 4 is being changed (step S405: Yes), the direction control unit 495 performs control to cancel the rotation operation of the moving device 4 (step S406) and transmits the lost information to the controller device 3 (step S407).
  • “when the subject is lost” includes a case where a part of the subject disappears from the image, in other words, “when the subject is likely to be lost”.
• After step S407, the moving device 4 ends the imaging direction tracking control and returns to the main routine.
• When the lost information of the subject is not received from the imaging device 5 while the imaging direction of the moving device 4 is being changed (step S405: No) and the change of the imaging direction is completed (step S408: Yes), the moving device 4 ends the imaging direction tracking control and returns to the main routine.
• When the change of the imaging direction is not completed in step S408 (step S408: No), the moving device 4 returns to step S405.
• When the moving device 4 does not acquire information on a change in the orientation of the lock-on subject from the imaging device 5 (step S401: No), and when the movement determination unit 492 determines in step S403 that movement is not possible (step S403: No), the moving device 4 ends the imaging direction tracking control and returns to the main routine.
• In step S308, when the moving device 4 is not set to the angle lock-on mode (step S308: No), the moving device 4 ends the subject tracking control and returns to the main routine.
• Next, the processing from step S209 onward will be described with reference to FIG. 12B.
• The first control unit 49 determines whether or not a control signal instructing the change to the composition adjustment mode has been received (step S210). First, the case where the composition adjustment control signal is received (step S210: Yes) will be described. In this case, when a zoom operation signal is received (step S211: Yes), the first control unit 49 performs zoom control (step S212).
  • FIG. 16 is a flowchart showing an overview of zoom control processing performed by the moving device 4.
• The power source determination unit 491 performs the subsequent processing after determining that the remaining battery level is equal to or greater than a predetermined value (for example, 50% or more of the full charge value).
• When a movement request is received from the imaging device 5 (step S502: Yes), the movement determination unit 492 determines whether or not movement is possible (step S503). The movement request is sent from the imaging device 5 when the imaging device 5 determines that optical zoom cannot be performed.
• In this determination, the movement determination unit 492 determines whether or not the movement corresponding to the zoom-up or zoom-down indicated by the zoom operation signal received from the controller device 3 is possible. Specifically, as illustrated in FIG. 17, the movement determination unit 492 calculates the movement distance ΔL′ in the direction orthogonal to the screen as ΔL′ = {1 − (c/d)} × L, where L is the distance to the subject, c is the size of the subject before the imaging device 5 moves, and d is the size of the subject after the imaging device 5 moves.
• The determination of whether or not to move based on the calculated movement distance ΔL′ is made in the same way as in steps S303 to S307 described with reference to FIG. 13.
  • the movement distance in the direction parallel to the screen may be calculated in the same manner.
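• The same similar-triangles relation gives the travel that substitutes for a zoom. A minimal sketch (the names are ours; c and d follow the sizes defined above):

```python
def zoom_move_distance(L, size_now, size_requested):
    """Travel along the optical axis that substitutes for a zoom:
    deltaL' = (1 - size_now / size_requested) * L (positive = approach)."""
    return (1.0 - size_now / size_requested) * L

# Zooming up to twice the current subject size at 8 m: advance 4 m.
print(zoom_move_distance(8.0, size_now=1.0, size_requested=2.0))  # 4.0
```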
• When the movement determination unit 492 determines in step S503 that movement is possible (step S504: Yes), the propulsion control unit 494 drives the propulsion unit 41 to control the movement of the moving device 4 (step S505).
• At this time, if there is an obstacle on the route, the propulsion control unit 494 performs control to move the moving device 4 to a predetermined position in front of the obstacle.
• When the lost information of the subject is received from the imaging device 5 while the moving device 4 is moving (step S506: Yes), the propulsion control unit 494 performs control to stop the movement of the moving device 4 (step S507) and transmits the lost information to the controller device 3 (step S508).
  • “when the subject is lost” also includes a case where a part of the subject disappears from the image, that is, “when the subject is likely to be lost”.
• After step S508, the moving device 4 ends the zoom control and returns to the main routine.
• If the lost information of the subject is not received from the imaging device 5 while the moving device 4 is moving (step S506: No) and the movement is completed (step S509: Yes), the first control unit 49 transmits information on the end of the movement to the imaging device 5 (step S510). Thereafter, the moving device 4 ends the zoom control and returns to the main routine. When the movement is not completed in step S509 (step S509: No), the moving device 4 returns to step S506.
• If the movement request is not received from the imaging device 5 in step S502 (step S502: No), and if the determination result in step S504 is that movement is not possible (step S504: No), the moving device 4 returns to the main routine.
• Next, the case where the zoom operation signal is not received in step S211 (step S211: No) will be described. When the first control unit 49 receives a slide operation signal (step S213: Yes), the first control unit 49 performs slide control (step S214).
  • FIG. 18 is a flowchart showing an outline of the slide control process performed by the moving device 4.
• In the slide control, the movement determination unit 492 first determines whether or not the moving device 4 can perform the slide operation (step S601). When the slide operation is possible (step S601: Yes), the propulsion control unit 494 starts the slide operation (step S602).
  • the first control unit 49 also starts transmitting information regarding the slide operation to the imaging device 5.
  • the “information relating to the slide operation” here includes, for example, information relating to the amount and direction of the slide.
• When the moving device 4 receives the lost information of the subject from the imaging device 5 during the slide operation (step S603: Yes), the propulsion control unit 494 performs control to stop the slide operation of the moving device 4 (step S604) and transmits the lost information to the controller device 3 (step S605).
• Here, "when the subject is lost" includes a case where a part of the subject disappears from the image, that is, "a case where the subject is likely to be lost".
• After step S605, the moving device 4 returns to the main routine.
• When the lost information is not received (step S603: No) and the slide operation of the moving device 4 is completed (step S606: Yes), the first control unit 49 stores the reference position changed by the slide operation in the first recording unit 46 (step S607) and transmits the reference position to the imaging device 5 (step S608). Thereafter, the moving device 4 ends the slide control and returns to the main routine.
• When the slide operation is not completed (step S606: No), the moving device 4 returns to step S603.
• When the movement determination unit 492 determines that the slide operation of the moving device 4 is not possible (step S601: No), the first control unit 49 transmits a signal requesting image trimming to the imaging device 5 (step S609). At this time, the first control unit 49 transmits information on the slide amount to the imaging device 5 together with the request signal.
• Thereafter, the moving device 4 receives trimmed image data or error information from the imaging device 5 (step S610).
  • the error information is information sent from the imaging device 5 when the imaging device 5 determines that trimming is impossible.
  • the moving device 4 transmits the trimmed image data or error information received from the imaging device 5 to the controller device 3 (step S611). Thereafter, the moving device 4 ends the slide control and returns to the main routine.
• Next, the case where the first control unit 49 does not receive the slide operation signal in step S213 (step S213: No) will be described. When the first control unit 49 receives an angle change operation signal (step S215: Yes), the direction control unit 495 performs angle change control (step S216).
• In the angle change control, the direction control unit 495 determines whether or not the processing can be executed and, when it can be executed, executes the movement of the moving device 4 and the change of the tilt angle (posture). Instead of executing the movement of the moving device 4 and the angle change simultaneously, the angle of the moving device 4 may be changed after the moving device 4 has moved.
• Next, the case where the first control unit 49 does not receive the angle change operation signal (step S215: No) will be described.
• When the first control unit 49 receives a focus change operation signal (step S217: Yes), the first control unit 49 transmits a control signal instructing the focus change to the imaging device 5 (step S218).
• Next, the case where the first control unit 49 does not receive the focus change operation signal in step S217 (step S217: No) will be described.
• When the first control unit 49 receives an exposure change operation signal (step S219: Yes), the first control unit 49 transmits a control signal instructing the exposure change to the imaging device 5 (step S220).
• Next, the case where the first control unit 49 does not receive an exposure change operation signal (step S219: No) will be described.
• When the first control unit 49 receives a shooting operation signal (step S221: Yes), the first control unit 49 transmits a control signal instructing shooting to the imaging device 5 (step S222).
• When the first control unit 49 does not receive the shooting operation signal (step S221: No), the moving device 4 proceeds to step S223.
• In step S223, when the first control unit 49 receives an end signal of the composition adjustment mode from the controller device 3 (step S223: Yes) and the moving device 4 is then powered off (step S224: Yes), the moving device 4 ends the process. If the moving device 4 is not powered off in step S224 (step S224: No), the moving device 4 returns to step S204.
• When the first control unit 49 does not receive the composition adjustment mode end signal (step S223: No), the moving device 4 returns to step S209.
• Next, the case where the control signal instructing the change to the composition adjustment mode is not received in step S210 (step S210: No) will be described. When the operation mode is already the composition adjustment mode (step S225: Yes), the moving device 4 proceeds to step S211.
• When the operation mode is not the composition adjustment mode in step S225 (step S225: No) and the first control unit 49 receives the lock-on mode end signal from the controller device 3 (step S226: Yes), the moving device 4 transmits a lock-on mode end signal to the imaging device 5 (step S227) and proceeds to step S224. On the other hand, when the first control unit 49 does not receive the lock-on mode end signal from the controller device 3 (step S226: No), the moving device 4 returns to step S209.
• Note that steps S211 to S222, which correspond to the composition adjustment mode processing, can be performed in parallel.
• The imaging device 5 detects information necessary for the moving device 4 to determine whether or not the moving device 4 can move in accordance with a change in the relative relationship between the imaging object specified in advance and the imaging device 5, reads the detected information from the second recording unit 55, and transmits it to the moving device 4, thereby assisting the processing of the moving device 4. Detailed processing of the imaging device 5, including such movement assistance processing, will be described with reference to the flowcharts shown in FIGS. 19A and 19B.
• When the power of the imaging device 5 is turned on (step S701: Yes), the imaging control unit 567 performs control to start imaging (step S702). If the power of the imaging device 5 is not turned on in step S701 (step S701: No), the imaging device 5 repeats step S701.
• After step S702, the second communication control unit 566 establishes communication with the moving device 4 (step S703). Subsequently, the second communication control unit 566 starts transmitting the image data generated by the image processing unit 561 to the moving device 4 (step S704).
• When a normal control signal is received from the moving device 4 (step S705: Yes), the second control unit 56 performs normal control according to the signal (step S706).
  • the normal control here includes still image or moving image shooting, transmission of information related to the remaining battery level of the imaging device 5 to the moving device 4, and the like.
• When the normal control signal is not received from the moving device 4 (step S705: No), the imaging device 5 proceeds to step S707 described later.
• When the signal instructing tracking of the lock-on target subject and the position information of the lock-on target subject are received from the moving device 4 (step S707: Yes), the subject detection unit 562 starts detecting the lock-on target subject (step S708). The subject detection unit 562 detects the subject based on, for example, the color, shape, and pattern of the subject.
• When the signal instructing tracking of the lock-on target and the position information of the lock-on target are not received from the moving device 4 (step S707: No), the imaging device 5 returns to step S705.
• Next, a case will be described in which the subject detection unit 562 starts subject detection in step S708 and then loses the lock-on target subject (step S709: Yes).
• In this case, the second control unit 56 transmits lost information to the moving device 4 (step S710).
  • “when the subject is lost” includes a case where a part of the subject disappears from the image, in other words, “when the subject is likely to be lost”.
• When the lock-on mode end signal is received from the moving device 4 (step S711: Yes), the subject detection unit 562 ends the lock-on target subject detection (step S712).
• When the lock-on mode end signal is not received from the moving device 4 (step S711: No), the imaging device 5 returns to step S709.
• After step S712, when the power of the imaging device 5 is turned off (step S713: Yes), the imaging device 5 ends the process. If the power of the imaging device 5 is not turned off in step S713 (step S713: No), the imaging device 5 returns to step S705.
• Next, the processing from step S714 onward, which is performed when the lock-on target subject has not been lost in step S709 (step S709: No), will be described.
• When the operation mode is the normal lock-on mode (step S714: Yes), the distance calculation unit 564 calculates the distance to the lock-on target subject, and a three-dimensional deviation amount from the reference position is calculated by obtaining the movement in the two-dimensional plane using a motion vector or the like (step S715).
  • the imaging device 5 transmits the calculation result of step S715 to the moving device 4 (step S716).
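• One plausible reading of the calculation in step S715 is sketched below: the in-plane components come from a motion vector converted to metric units at the subject distance, and the depth component from the change in the measured distance. All names and the conversion factor are assumptions of ours.

```python
def deviation_3d(motion_vec_px, metres_per_px, dist_now, dist_ref):
    """Three-dimensional deviation (dx, dy, dz) of the subject from the
    reference position: 2-D image motion scaled to metres, plus the
    change in subject distance along the optical axis."""
    dx = motion_vec_px[0] * metres_per_px
    dy = motion_vec_px[1] * metres_per_px
    dz = dist_now - dist_ref
    return (dx, dy, dz)

# Subject drifted 40 px right and moved 0.3 m closer than the reference.
print(deviation_3d((40, 0), metres_per_px=0.005, dist_now=4.7, dist_ref=5.0))
```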
• When the operation mode is set to the angle lock-on mode (step S714: No), the subject detection unit 562 detects the orientation of the subject (step S717). Thereafter, the imaging device 5 proceeds to step S715. In this case, the imaging device 5 also transmits the orientation of the subject to the moving device 4 in step S716.
• When the subject detection unit 562 detects an obstacle between the imaging device 5 and the subject (step S718: Yes), the distance calculation unit 564 calculates the distance to the obstacle (step S719). Thereafter, the imaging device 5 transmits the distance to the obstacle to the moving device 4 (step S720). After step S720, and when the subject detection unit 562 does not detect an obstacle between the imaging device 5 and the subject in step S718 (step S718: No), the imaging device 5 proceeds to step S721.
  • step S721 when the imaging device 5 receives a signal instructing the change to the composition adjustment mode (step S721: Yes), the imaging device 5 proceeds to step S722. On the other hand, when the imaging device 5 does not receive a signal for instructing change to the composition adjustment mode (step S721: No), the imaging device 5 proceeds to step S711.
• When a zoom control signal is received from the moving device 4 (step S722: Yes), the imaging control unit 567 performs zoom control (step S723). When a zoom control signal is not received from the moving device 4 in step S722 (step S722: No), the imaging device 5 proceeds to step S724 described later.
• In the zoom control, the imaging control unit 567 first refers to the second recording unit 55 and calculates the change in the center of gravity of the steered body 2 based on the zoom instruction amount (step S801).
• When the change in the center of gravity is equal to or less than an allowable value (step S802: Yes), the imaging control unit 567 determines whether or not optical zoom is possible (step S803).
• The determination in step S802 may be performed using an amount other than the change in the center of gravity.
• For example, a threshold may be set for the amount of movement of the mechanism for obtaining the buoyancy of the steered body 2, for a value converted from the output of a gyro sensor or the like used for balancing the steered body 2, or for a value converted from a change in the position of the weight in the optical axis direction, and step S803 may be performed when the value is equal to or less than the threshold. Further, a threshold (for example, 10%) may be set for the ratio of the center-of-gravity movement amount of the steered body 2 in the optical axis direction of the imaging device 5 to the entire length of the steered body 2, and step S803 may be performed when the ratio is equal to or less than this threshold.
• Alternatively, the imaging control unit 567 may determine whether or not there is a change in the center of gravity using a plurality of parameters. If the optical zoom is possible (step S803: Yes), the imaging control unit 567 performs optical zoom control on the optical system 511 (step S804). Whether or not optical zoom is possible is determined based on conditions such as the type and characteristics of the lens and the required image quality. At the time of optical zoom, a signal indicating this may be transmitted to the moving device 4 before the zoom control so that the moving device 4 can perform a stabilizing operation in preparation for unexpected state fluctuations. Such processing of the imaging device 5 may be performed during the processing of step S802 or S803, or may be performed as a separate process.
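• A sketch of the step S802 gate using the 10% center-of-gravity ratio given above as an example; the threshold and the single-parameter form are illustrative only, since the embodiment also allows other quantities and combinations of parameters.

```python
def optical_zoom_gate(cog_shift_m, body_length_m, ratio_threshold=0.10):
    """Step S802 in outline: proceed to the optical-zoom determination
    (step S803) only while the predicted centre-of-gravity shift in the
    optical axis direction stays within ratio_threshold of the overall
    length of the steered body."""
    return abs(cog_shift_m) / body_length_m <= ratio_threshold

print(optical_zoom_gate(0.05, 1.0))  # True: a 5% shift is within the 10% limit
```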
• In addition, a determination based on the influence of vibration may be performed, or such a determination may be combined with the determination described above to make a comprehensive determination. For example, when the wind is strong and it is difficult to approach the object due to the influence of the wind, this may be used together with the determination to perform zooming.
• When the imaging control unit 567 determines that the movement to the requested zoom position has been completed (step S805: Yes), the imaging control unit 567 ends the zoom control.
• When the imaging control unit 567 determines that the movement to the requested zoom position has not been completed (step S805: No), the imaging control unit 567 performs electronic zoom to the requested zoom position (step S806). Thereafter, the imaging control unit 567 ends the zoom control.
• If the optical zoom is not possible in step S803 (step S803: No), the imaging device 5 proceeds to step S806.
• When the change in the center of gravity is determined to be large in step S802 (step S802: No), the imaging control unit 567 compares the amount of motion of the subject between frames with a predetermined threshold (step S807). If the amount of motion of the subject is equal to or greater than the threshold (the motion is large) (step S807: Yes), processing according to the magnitude relationship between the enlargement speed by zooming and the enlargement speed by approaching the subject is performed.
• When the enlargement speed of the subject image by zooming is higher than the enlargement speed by approaching (step S808: Yes) and the imaging device 5 can execute the optical zoom and the electronic zoom (step S809: Yes), the process proceeds to step S804 and optical zoom control is performed.
• The optical zoom or the like is performed in this case, even though the movement of the center of gravity of the steered body 2 has been determined to be large, because the steered body 2 might lose its posture and be unable to move safely if it moved.
• At this time, the imaging device 5 may comprehensively determine whether or not the optical zoom or the like can be performed from the surrounding environment at the time of the determination, the stability of the steered body 2, and the performance information of the moving device 4.
• For example, a determination based on the influence of wind or vibration on the steered body 2 may be performed, or such a determination may be combined with the determination described above to make a comprehensive determination. For example, when the wind is strong and it is difficult to approach the object due to the influence of the wind, this may be used together with the determination to perform zooming. Further, when the influence of the vibration of the steered body 2 is large, it may be more advantageous for blur and framing to approach the object rather than to zoom to the telephoto side, and step S809 may be performed in consideration of this.
• A determination based on the influence of wind and the influence of the vibration of the steered body 2 may also be added to the processing in step S807.
• If the amount of motion of the subject is smaller than the threshold in step S807 (step S807: No), the imaging device 5 assumes that the subject will not escape and proceeds to step S810 described later.
• The amount of motion of the subject may be determined based on a change in the distance to the subject, a change in the size of the subject image, or the like, particularly for motion in the optical axis direction of the photographing lens, which affects focusing. In order to simplify the description and make the features of the invention easier to understand, the outline of the process has been described using several branches.
• When the enlargement speed of the subject image by zooming is higher than the enlargement speed by approaching (step S808: Yes) but the imaging device 5 cannot execute the optical zoom or the electronic zoom (step S809: No), and when the enlargement speed by zooming is not higher than the enlargement speed by approaching (step S808: No), the subject detection unit 562 detects an obstacle in the image (step S810) and transmits the obstacle detection result and a movement request to the moving device 4 (step S811).
• This movement request is a request for movement to a position in front of the obstacle.
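• Putting the branches of steps S807 to S811 described above into one hedged sketch (the boolean inputs stand in for the individual determinations; wind and vibration factors are omitted):

```python
def zoom_or_move(motion_large, zoom_faster_than_approach, zoom_executable):
    """Rough decision tree for steps S807-S811: returns 'optical_zoom'
    (go to step S804) or 'move' (obstacle detection and movement
    request, steps S810-S811)."""
    if not motion_large:                               # step S807: No
        return "move"                                  # subject will not escape
    if zoom_faster_than_approach and zoom_executable:  # steps S808/S809: Yes
        return "optical_zoom"                          # proceed to step S804
    return "move"                                      # step S808: No or S809: No
```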
• In this case as well, a determination based on the influence of the wind or the vibration of the steered body 2 may be performed, or such a determination may be combined with the above-described determination to make a comprehensive determination.
• Although the movement of the subject can be detected from the temporal change of the image, the movement may also be predicted from an image or the like even when the subject is not actually moving, and such a result may be used. The determination can also be made by machine learning of image changes, that is, of what kind of object moves and how it moves afterwards.
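• As a concrete, non-patent illustration of detecting subject movement from the temporal change of the image, dense optical flow can be used; the snippet below relies on OpenCV and is our own example, not the embodiment's implementation.

```python
import cv2
import numpy as np

def mean_motion_px(prev_gray, curr_gray):
    """Mean per-pixel motion magnitude between two 8-bit grayscale
    frames; the result could serve as the 'amount of motion between
    frames' compared with the threshold in step S807."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    return float(np.linalg.norm(flow, axis=2).mean())
```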
• Although the features of the imaging device 5 have been described here, there are also features of the steered body 2 that includes the moving device 4 communicating with the imaging device 5, or the imaging device 5 and the moving device 4.
• For the steered body 2, which includes the moving device 4 that can communicate with the imaging device 5 and can hold the imaging device 5 and move together with the imaging device 5, determining whether or not movement in accordance with a change in the relative relationship between the imaging object designated in advance and the imaging device 5 is possible requires information such as changes in the center of gravity of the imaging device 5, movements such as the agility of the object, predictions of those changes and movements, the performance of the steered body 2, and the environment around the steered body 2.
• The steered body control unit 21 of the steered body 2 detects such information, uses the detected information to determine whether or not movement in accordance with a change in the relative relationship between the object to be imaged and the imaging device 5 is possible, and performs control according to the determination result; this is one of the features of the first embodiment.
• When the subject detection unit 562 loses the subject after the movement request is transmitted (step S812: Yes), the imaging device 5 transmits lost information to the moving device 4 (step S813). Thereafter, the imaging device 5 ends the zoom control and returns to the main routine.
• If the subject detection unit 562 does not lose the subject in step S812 (step S812: No) and the imaging device 5 receives a movement end notification from the moving device 4 (step S814: Yes), the imaging device 5 ends the zoom control and returns to the main routine.
• When the imaging device 5 does not receive the movement end notification from the moving device 4 (step S814: No), the imaging device 5 returns to step S812.
• When slide information is received from the moving device 4 (step S724: Yes), the tracking processing unit 563 performs lost determination (step S725).
• When the subject is not lost (step S725: No) and the reference position after the slide is received (step S726: Yes), the second control unit 56 stores information on the reference position in the second recording unit 55 (step S727).
• When the subject is lost in step S725 (step S725: Yes), the second control unit 56 transmits lost information to the moving device 4 (step S728).
• When an exposure change operation signal is received from the moving device 4 (step S731: Yes), the imaging control unit 567 performs processing to change the exposure (step S732). After step S732, and when no exposure change operation signal is received from the moving device 4 in step S731 (step S731: No), the imaging device 5 proceeds to step S733.
• When a shooting operation signal is received from the moving device 4 (step S733: Yes), the imaging control unit 567 performs shooting according to the shooting operation signal (step S734). After step S734, and when no shooting operation signal is received from the moving device 4 in step S733 (step S733: No), the imaging device 5 proceeds to step S711.
• Next, the case where the slide information is not received from the moving device 4 in step S724 (step S724: No) will be described.
• When a trimming request is received from the moving device 4 (step S735: Yes), the trimming unit 565 determines whether or not trimming is possible (step S736). This determination is made based on the presence or absence of a margin portion that can supplement the image of the portion that has not been displayed in the captured image, which arises when the reference position of the lock-on target subject is slid.
• If trimming is possible (step S736: Yes), the trimming unit 565 performs trimming (step S737) and transmits the trimmed image data to the moving device 4 (step S738).
• If the result of the determination is that trimming cannot be performed (step S736: No), the second control unit 56 transmits error information to the moving device 4 (step S739). After step S738 or S739, the imaging device 5 proceeds to step S729.
• When the trimming request is not received (step S735: No), the imaging device 5 proceeds to step S729.
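• A minimal sketch of the margin check and crop of steps S736 and S737, assuming (our simplification) that the sensor frame is larger than the displayed region and that the slide amount is expressed in pixels:

```python
def trim_after_slide(frame_w, frame_h, view_w, view_h,
                     view_x, view_y, slide_dx, slide_dy):
    """Return the new view origin after the slide, or None when the
    margin around the displayed region cannot cover the slide
    (trimming impossible, step S736: No -> error information)."""
    nx, ny = view_x + slide_dx, view_y + slide_dy
    if 0 <= nx <= frame_w - view_w and 0 <= ny <= frame_h - view_h:
        return nx, ny   # crop is frame[ny:ny + view_h, nx:nx + view_w]
    return None

# 4000x3000 sensor, 1920x1080 view at (1040, 960), slide 500 px right: OK.
print(trim_after_slide(4000, 3000, 1920, 1080, 1040, 960, 500, 0))  # (1540, 960)
```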
• In the first embodiment described above, the imaging device 5 has been described as not changing its angle with respect to the moving device 4; however, the imaging device 5 may be configured so that its angle with respect to the moving device 4 can be changed. In this case, the imaging device 5 may receive an angle change instruction signal from the moving device 4 and perform control to change the angle with respect to the moving device 4.
• According to the first embodiment of the present invention described above, the moving device determines whether or not it can move in accordance with a change in the relative relationship between the imaging object specified in advance and the imaging device and performs control according to the result, while the imaging device detects the information necessary for the moving device to make this determination. Therefore, appropriate processing can be performed even in a situation where the movement path at the time of shooting is not fixed.
• Further, according to the first embodiment, when the composition adjustment mode is set, the user can concentrate on composition adjustment by leaving the operation to the imaging system. Therefore, even in an environment where the subject is tracked, the user can concentrate on shooting without being distracted by the processing for tracking.
• (Embodiment 2) In the second embodiment of the present invention, the steered body is an endoscope.
• The moving device advances and retracts from the distal end of the endoscope and holds the imaging device inside it.
• The moving device is wirelessly connected so as to be communicable with the operation device, is driven in accordance with an instruction from the operation device, and receives an instruction signal for the imaging device and transmits it to the imaging device.
• The steered body has a lock-on function; when the steered body (endoscope) is inserted into the subject and an examination is performed, if the object to be imaged (subject) moves relative to the imaging device, the object to be imaged is tracked while it is sequentially determined whether or not the movement can be tracked. When performing this tracking, the moving device approaches or separates from the subject as necessary so as not to change the size of the object to be imaged.
  • FIG. 21 is a diagram showing a schematic configuration of an imaging system according to Embodiment 2 of the present invention.
  • FIG. 22 is a block diagram illustrating a functional configuration of the imaging system according to the second embodiment.
• An imaging system 1A shown in FIGS. 21 and 22 includes an endoscope 2A that is inserted into the body of a subject to be imaged and images the inside of the subject, an operation device 3A that receives operation instruction signals for the endoscope 2A, a processor 6A that is communicably connected to the endoscope 2A and controls the imaging system 1A in an integrated manner, and a display device 7A that displays images captured by the endoscope 2A.
• The endoscope 2A includes an insertion portion 21A that has an elongated flexible shape and is inserted into the body cavity of the subject, an operation portion 22A that is connected to the proximal end side of the insertion portion 21A and receives operation signals, and a universal cord 23A that extends from the operation portion 22A in a direction different from the direction in which the insertion portion 21A extends and incorporates various cables connected to the processor 6A.
  • the insertion portion 21A includes a distal end portion 24A that houses the moving device 4A and the imaging device 5A, and a long flexible tube portion 25A that is connected to the proximal end side of the distal end portion 24A and has flexibility.
  • the distal end portion 24A is configured by using any one of an electrostatic actuator, a conductive polymer actuator, an ultrasonic motor, and the like, and has a flexible structure capable of bending, crank bending, and the like.
• The moving device 4A and the imaging device 5A have the same configurations as the moving device 4 and the imaging device 5 described in the first embodiment. For this reason, the endoscope 2A has a function as the steered body 2.
• Hereinafter, the components of the moving device 4A and the imaging device 5A that correspond to components of the moving device 4 and the imaging device 5 are denoted by adding "A" to the reference signs of the corresponding components. For example, the reference sign of the propulsion unit of the moving device 4A is "41A".
• The moving device 4A has a cylindrical shape, is attached to the insertion portion 21A so as to be able to advance and retract from the distal end of the insertion portion 21A, and holds the imaging device 5A inside the cylinder.
• The propulsion unit 41A is configured using an actuator that advances and retracts the moving device 4A at the distal end of the insertion portion 21A in response to an operation instruction signal sent from the operation device 3A.
• The processor 6A includes an image processing unit 61A that acquires image data captured by the endoscope 2A and performs image processing, a light source unit 62A that generates illumination light irradiating the subject from the distal end of the insertion portion 21A of the endoscope 2A, and a control unit 63A that controls the entire imaging system 1A including the processor 6A itself.
  • the image processing unit 61A and the control unit 63A are configured using one or more of, for example, a CPU, an FPGA (Field Programmable Gate Array), an ASIC (Application Specific Integrated Circuit), and the like.
• The light source unit 62A outputs illumination light that irradiates the subject (the object to be imaged) from the distal end of the insertion portion 21A of the endoscope 2A.
  • the light source unit 62A is configured using, for example, an LED (Light Emitting Diode), a laser light source, a xenon lamp, a halogen lamp, or the like.
  • the processor 6A controls the entire imaging system 1A.
  • the display device 7A displays an image corresponding to the imaging signal subjected to image processing by the processor 6A.
  • the display device 7A displays various information related to the imaging system 1A.
  • the display device 7A is configured using a display panel such as a liquid crystal or an organic EL.
  • the processor 6A may include the functions of the first control unit 49A of the moving device 4A and the second control unit 56A of the imaging device 5A. Further, the operation unit 22A of the endoscope 2A may include the function of the operation device 3A.
  • the outline of the processing performed by the operation device 3A, the moving device 4A, and the imaging device 5A is substantially the same as in the first embodiment.
  • an outline of a slide control process unique to the second embodiment will be described.
• Steps S901 to S908 sequentially correspond to steps S601 to S608 (see FIG. 18) of the slide control described in the first embodiment.
• The determination in step S901 of whether or not the moving device 4A can slide is made based on, for example, the result of the spatial information acquisition unit 43A detecting the presence or absence of an internal wall of the subject in the slide direction.
• Subsequently, the movement determination unit 492A determines whether or not the size of the subject image has changed before and after the slide (step S909).
• FIG. 24 is a diagram schematically illustrating the relationship between the distal end portion of the endoscope and the subject when the size of the subject image changes. As shown in FIG. 24, when the slide amount is large, the distal end portion is separated from the subject 300 by about Δd as a result of crank bending from the position of the distal end portion 24A before the movement, which is indicated by the broken line. For this reason, as shown in FIG. 25, the subject image 300b after the movement becomes smaller than the subject image 300a before the movement.
• If the size of the subject image has changed as a result of the determination (step S909: Yes), the movement determination unit 492A determines whether or not it is possible to approach or separate from the subject in accordance with the change in the size of the subject image so as to return the subject image to the size before the slide (step S910). This determination is made based on the information acquired by the spatial information acquisition unit 43A. If the movement determination unit 492A determines in step S909 that the size of the subject image has not changed (step S909: No), the moving device 4A returns to the main routine.
• When the movement determination unit 492A determines that the approach to or separation from the subject is possible (step S910: Yes), the propulsion control unit 494A causes the propulsion unit 41A to perform the approach or separation operation (step S911).
• FIG. 26 is a diagram schematically showing a situation in which the moving device 4A has moved forward from the situation shown in FIG. 24 and approached the subject. In this situation, as illustrated in FIG. 27, the imaging device 5A captures a subject image 300c having approximately the same size as the subject image 300a before the movement. After step S911, the moving device 4A returns to the main routine.
• If it is determined in step S910 that the approach to or separation from the subject is impossible (step S910: No), the first control unit 49A transmits a zoom instruction signal to the imaging device 5A (step S912).
• When error information indicating zoom failure is received from the imaging device 5A (step S913: Yes), the first control unit 49A transmits the error information to the operation device 3A (step S914). Thereafter, the moving device 4A ends the slide control process. When the error information is not received from the imaging device 5A in step S913 (step S913: No), the moving device 4A likewise ends the slide control process.
• When the first control unit 49A determines that the slide of the moving device 4A is impossible (step S901: No), the processing of steps S915 to S917 performed subsequently sequentially corresponds to steps S609 to S611 (see FIG. 18) described in the slide control of the moving device 4 of the first embodiment. After step S917, the moving device 4A ends the slide control process.
• The processing in steps S1001 to S1005 sequentially corresponds to the processing in steps S724 to S728 (see FIG. 19B) described in the first embodiment.
• Next, the processing in step S1006 and thereafter, which is performed after step S1004, will be described.
• When a zoom instruction signal is received from the moving device 4A (step S1006: Yes), the imaging control unit 567A determines whether or not zooming to the requested zoom position is possible by optical zoom (step S1007).
• If zooming by optical zoom is possible (step S1007: Yes), the imaging control unit 567A performs optical zoom control on the optical system 511A (step S1008). Thereafter, the imaging device 5A ends the zoom control process.
• When the zoom instruction signal is not received from the moving device 4A (step S1006: No), the imaging device 5A ends the zoom control process.
• If zooming to the requested zoom position by optical zoom is impossible (step S1007: No), the imaging control unit 567A determines whether or not zooming to the requested zoom position is possible by electronic zoom (step S1009). When zooming to the requested zoom position by electronic zoom is possible (step S1009: Yes), the imaging control unit 567A performs electronic zoom to the requested zoom position (step S1010). Thereafter, the imaging device 5A ends the zoom control process.
• When zooming to the requested zoom position by electronic zoom is impossible (step S1009: No), the imaging device 5A transmits error information to the moving device 4A (step S1011). Thereafter, the imaging device 5A ends the slide control.
• When the slide information is not received from the moving device 4A (step S1001: No), the processing in steps S1012 to S1015 sequentially corresponds to the processing in steps S735 to S738 described in the first embodiment. After step S1015, the imaging device 5A ends the slide control. If trimming cannot be performed in step S1013 (step S1013: No), the imaging device 5A proceeds to step S1011 and transmits error information to the moving device 4A.
• According to the second embodiment of the present invention described above, the angle of view is automatically adjusted on the apparatus side when the user merely performs the slide operation, so the user can observe the subject without a sense of incongruity.
• Although an unmanned aerial vehicle and an endoscope have been exemplified as the moving device in the embodiments, the present invention can also be applied to a self-propelled robot, an industrial endoscope, a capsule endoscope, and the like.
• Further, the lens barrel portion that holds the imaging device of a microscope may be used as the moving device.
• The processing algorithms described using the flowcharts in this specification can be written as programs.
• Such a program may be recorded in a recording unit inside a computer or recorded on a computer-readable recording medium. The recording of the program in the recording unit or on the recording medium may be performed when the computer or the recording medium is shipped as a product, or may be performed by downloading via a communication network.
• The present invention can include various embodiments not described herein, and various design changes and the like can be made within the scope of the technical idea specified by the claims.

Abstract

Provided are a steered object, a moving device, an imaging device, a movement control method, a movement assist method, a movement control program, and a movement assist program with which it is possible to perform an appropriate process even under a condition in which the movement route at imaging time is undetermined. Information necessary to determine whether or not movement is possible in correspondence to a change in the relative relation between a predesignated object to be imaged and an imaging device is detected, and a determination is made using the information as to whether or not movement corresponding to a change in the relative relation is possible, with control exercised corresponding to the determination result.

Description

Steered object, moving device, imaging device, movement control method, movement assist method, movement control program, and movement assist program
The present invention relates to a steered object, a moving device, an imaging device, a movement control method, a movement assist method, a movement control program, and a movement assist program.
Conventionally, a technique for performing aerial photography with a camera mounted on an unmanned air vehicle is known (see, for example, Patent Document 1). In this technique, the unmanned air vehicle measures the distance between the airframe and the object to be imaged while flying on a predetermined flight route, and determines the zoom magnification of the camera from the result to image the object to be imaged.
Patent Document 1: JP 2006-27448 A
In the technique described in Patent Document 1, since the shooting of an elevated power transmission line is assumed, the flight route is set in advance. For this reason, in a situation where the movement route at the time of shooting is not fixed, for example when the object to be imaged moves, processing such as movement and shooting cannot be performed appropriately.
The present invention has been made in view of the above, and an object thereof is to provide a steered object, a moving device, an imaging device, a movement control method, a movement assist method, a movement control program, and a movement assist program capable of performing appropriate processing even in a situation where the movement route at the time of shooting is not fixed.
In order to solve the above-described problems and achieve the object, a steered body according to the present invention is a steered body including an imaging device that images a subject and generates image data, and a moving device that can communicate with the imaging device and can hold the imaging device and move together with the imaging device, the steered body comprising a steered body control unit that detects information necessary for determining whether or not movement is possible in accordance with a change in the relative relationship between an imaging object designated in advance and the imaging device, determines using the information whether or not movement in accordance with the change in the relative relationship is possible, and performs control according to the determination result.
A moving device according to the present invention is a moving device that can communicate with an imaging device that images a subject and generates image data, and that can hold the imaging device and move together with the imaging device, the moving device comprising a first control unit that determines whether or not movement is possible in accordance with a change in the relative relationship between an imaging object designated in advance and the imaging device, and performs control according to the determination result.
An imaging device according to the present invention is an imaging device that can communicate with a moving device, is held by the moving device, and images a subject to generate image data, the imaging device comprising a second control unit that detects information necessary for the moving device to determine whether or not movement is possible in accordance with a change in the relative relationship between an imaging object designated in advance and the imaging device, and transmits the detected information to the moving device.
A movement control method according to the present invention is a movement control method performed by a moving device that can communicate with an imaging device that images a subject and generates image data, and that can hold the imaging device and move together with the imaging device, the method comprising: a determination step of determining whether or not movement is possible in accordance with a change in the relative relationship between an imaging object designated in advance and the imaging device; and a control step of reading the determination result of the determination step from a recording unit and performing control according to the determination result.
A movement assist method according to the present invention is a movement assist method performed by an imaging device that can communicate with a moving device, is held by the moving device, and images a subject to generate image data, the method comprising: a detection step of detecting information necessary for the moving device to determine whether or not movement is possible in accordance with a change in the relative relationship between an imaging object designated in advance and the imaging device; and a transmission step of reading the information detected in the detection step from a recording unit and transmitting the information to the moving device.
A movement control program according to the present invention causes a moving device, which can communicate with an imaging device that images a subject and generates image data and can hold the imaging device and move together with the imaging device, to execute: a determination step of determining whether or not movement is possible in accordance with a change in the relative relationship between an imaging object designated in advance and the imaging device; and a control step of reading the determination result of the determination step from a recording unit and performing control according to the determination result.
A movement assist program according to the present invention causes an imaging device, which can communicate with a moving device, is held by the moving device, and images a subject to generate image data, to execute: a detection step of detecting information necessary for the moving device to determine whether or not movement is possible in accordance with a change in the relative relationship between an imaging object designated in advance and the imaging device; and a transmission step of reading the information detected in the detection step from a recording unit and transmitting the information to the moving device.
According to the present invention, it is possible to perform appropriate processing even in a situation where the movement route at the time of shooting is not fixed.
FIG. 1 is a schematic diagram showing a schematic configuration of an imaging system according to Embodiment 1 of the present invention.
FIG. 2 is a block diagram showing a functional configuration of the imaging system according to Embodiment 1 of the present invention.
FIG. 3 is a diagram schematically showing the external configuration of the operation device.
FIG. 4 is a diagram schematically showing the operation assignment of the operation input unit of the operation device in the normal operation mode.
FIG. 5 is a diagram explaining the normal lock-on mode.
FIG. 6 is a diagram explaining the angle lock-on mode.
FIG. 7 is a diagram showing an outline of processing when shifting from the normal operation mode to the lock-on mode.
FIG. 8 is a diagram showing a display example on the display unit of the operation device when the mode has shifted to the lock-on mode.
FIG. 9 is a diagram showing a display example on the display unit of the operation device when the composition adjustment mode is set.
FIG. 10 is a diagram schematically showing an example of the operation assignment of the operation input unit when the composition adjustment mode is set.
FIG. 11 is a flowchart showing an outline of processing performed by the operation device.
FIG. 12A is a flowchart (part 1) showing an outline of processing performed by the moving device according to Embodiment 1 of the present invention.
FIG. 12B is a flowchart (part 2) showing an outline of processing performed by the moving device according to Embodiment 1 of the present invention.
FIG. 13 is a flowchart showing an outline of subject tracking control processing performed by the moving device according to Embodiment 1 of the present invention.
FIG. 14 is a diagram showing an outline of movement distance calculation processing performed by the movement determination unit of the moving device according to Embodiment 1 of the present invention.
FIG. 15 is a flowchart showing an outline of imaging direction tracking control processing performed by the moving device according to Embodiment 1 of the present invention.
FIG. 16 is a flowchart showing an outline of zoom control processing performed by the moving device according to Embodiment 1 of the present invention.
FIG. 17 is a diagram showing an outline of movement distance calculation processing performed by the movement determination unit of the moving device according to Embodiment 1 of the present invention during zoom control.
FIG. 18 is a flowchart showing an outline of slide control processing performed by the moving device according to Embodiment 1 of the present invention.
FIG. 19A is a flowchart (part 1) showing an outline of processing performed by the imaging device according to Embodiment 1 of the present invention.
FIG. 19B is a flowchart (part 2) showing an outline of processing performed by the imaging device according to Embodiment 1 of the present invention.
FIG. 20 is a flowchart showing an outline of zoom control processing performed by the imaging device according to Embodiment 1 of the present invention.
FIG. 21 is a diagram showing a schematic configuration of an imaging system according to Embodiment 2 of the present invention.
FIG. 22 is a block diagram showing a functional configuration of the imaging system according to Embodiment 2 of the present invention.
FIG. 23 is a flowchart explaining an outline of slide control processing performed by the moving device according to Embodiment 2 of the present invention.
FIG. 24 is a diagram schematically showing the relationship between the distal end portion of the endoscope and the subject when the size of the subject image changes.
FIG. 25 is a diagram schematically showing the change of the subject image in the case shown in FIG. 24.
FIG. 26 is a diagram schematically showing a situation in which the moving device has advanced from the situation shown in FIG. 24 and approached the subject.
FIG. 27 is a diagram schematically showing the change of the subject image in the case shown in FIG. 26.
FIG. 28 is a flowchart explaining an outline of slide control processing performed by the imaging device according to Embodiment 2 of the present invention.
Hereinafter, modes for carrying out the present invention (hereinafter referred to as "embodiments") will be described with reference to the accompanying drawings. An imaging system according to an embodiment of the present invention includes a steered body in which an imaging device is mounted on a moving device. The steered body detects information necessary for determining whether it can move in accordance with a change in the relative relationship between a pre-designated imaging target and the imaging device, uses that information to determine whether movement responding to the change in the relative relationship is possible, and performs control according to the determination result. In the following, cases in which the moving device constituting the steered body is an unmanned aerial vehicle (Embodiment 1) or an endoscope (Embodiment 2) are described as examples. These are merely examples, however, and a self-propelled robot may also be applied as the moving device.

(Embodiment 1)
In Embodiment 1 of the present invention, the moving device is an unmanned aerial vehicle, and the imaging device is mounted on the unmanned aerial vehicle. The moving device is wirelessly connected so as to be able to communicate with the operating device; it is driven in accordance with instructions from the operating device, and it also receives instruction signals addressed to the imaging device and forwards them to the imaging device. The steered body has a lock-on function and tracks the imaging target while sequentially determining whether it is possible to follow the movement of the imaging target.
FIG. 1 is a schematic diagram showing a schematic configuration of the imaging system according to Embodiment 1 of the present invention. FIG. 2 is a block diagram showing the functional configuration of the imaging system according to Embodiment 1. The imaging system 1 shown in FIGS. 1 and 2 includes a steered body 2 that has an imaging function and can move by flying, and an operating device 3 that steers the imaging and movement of the steered body 2.

The steered body 2 includes a moving device 4 and an imaging device 5 detachably attached to the moving device 4. The imaging device 5 may also be attached to the moving device 4 via a stabilizer, a vibration correction mechanism (for example, a gimbal), a rig, or the like. The moving device 4 and the imaging device 5 are connected so as to be capable of bidirectional communication via a cable 6 such as a USB (Universal Serial Bus) cable, and the moving device 4 and the operating device 3 can perform wireless communication in a predetermined frequency band. Although the moving device 4 and the imaging device 5 are connected by the cable 6 in Embodiment 1, the configuration is not limited to this, and a configuration capable of wireless communication may be adopted. Furthermore, the positional relationship between the moving device 4 and the imaging device 5 may be fixed, or the attitude of the imaging device 5 with respect to the moving device 4 may be changeable.
The steered body 2 includes a steered body control unit 21 that detects information necessary for determining whether movement is possible in accordance with a change in the relative relationship between the pre-designated imaging target and the imaging device 5, uses the detected information to determine whether movement responding to the change in the relative relationship is possible, and performs control according to the determination result. The steered body control unit 21 is configured using a CPU (Central Processing Unit) or the like and controls the steered body 2 in accordance with various instruction signals transmitted from the operating device 3. The steered body control unit 21 includes a first control unit 49 of the moving device 4 and a second control unit 56 of the imaging device 5, both described later. The steered body control unit 21 is a circuit unit in which dedicated circuits and programs cooperate to perform various kinds of control under specific sequence control; where necessary it also includes an artificial intelligence circuit unit and can perform control using the results of deep learning, machine learning, and the like based on big data (the same applies to the third control unit 36 of the operating device 3). In the following description, the configurations and circuits of such control units are classified by function. Note that some of the individual functions of the first control unit 49 and the second control unit 56 may instead be provided in the other control unit.
The functional configuration of the moving device 4 will now be described. The moving device 4 determines whether it can move in accordance with a change in the relative relationship between the pre-designated imaging target and the imaging device 5, and performs control according to the determination result. The moving device 4 is an unmanned aerial vehicle (UAV) configured as a rotary-wing drone having four rotors 4a. The number of rotors 4a is not limited to four and may be any other number. The moving device 4 is also not limited to a rotary-wing drone and may be configured as another unmanned aircraft such as a fixed-wing drone. Furthermore, the moving device 4 may be any device that can be steered wirelessly and is self-propelled, such as a self-propelled robot, an automobile, a model car, or a ship.

The moving device 4 includes a propulsion unit 41, a power source 42, a spatial information acquisition unit 43, a position/orientation detection unit 44, an altitude/attitude detection unit 45, a first recording unit 46, a first communication unit 47, a second communication unit 48, and a first control unit 49.
The propulsion unit 41 is configured using a plurality of rotors 4a (four in the example of FIG. 1) and a plurality of motors (not shown) for driving the rotors 4a, and flies the moving device 4 under the control of the first control unit 49.

The power source 42 is configured using a battery, a booster circuit, and the like, and supplies a predetermined voltage to each part of the moving device 4.

The spatial information acquisition unit 43 is configured using a laser radar or the like. The spatial information acquisition unit 43 emits pulsed laser light radially from the moving device 4 and measures the return time of the reflected pulses, thereby acquiring spatial information about objects around the moving device 4 (the direction of each irradiation point and the distance to each irradiation point), and outputs the acquired result to the first control unit 49. The spatial information acquisition unit 43 may acquire the spatial information by emitting ultrasonic waves instead of laser light, and may further include a non-contact proximity sensor.
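As a concrete illustration of the round-trip measurement described above, the following is a minimal Python sketch of converting a pulse return time into a distance; the function name and the sample reading are illustrative assumptions, not part of the embodiment.

# A minimal sketch of the time-of-flight distance calculation described above;
# the function name and sample value are illustrative assumptions.
C = 299_792_458.0  # speed of light in m/s

def distance_from_return_time(return_time_s: float) -> float:
    # The pulse travels to the irradiation point and back, so halve the path.
    return C * return_time_s / 2.0

# Example: a pulse returning after about 66.7 ns indicates a point ~10 m away.
print(distance_from_return_time(66.7e-9))  # ~10.0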
The position/orientation detection unit 44 is configured using a GPS (Global Positioning System) receiver, a magnetic azimuth sensor, and the like. The position/orientation detection unit 44 detects current position information regarding the current position of the moving device 4 and azimuth information regarding the angle (azimuth) that the nose direction of the moving device 4 (the direction in which the nose faces) makes with a predetermined reference azimuth (for example, north), and outputs the detection results to the first control unit 49.

The altitude/attitude detection unit 45 is configured using an atmospheric pressure sensor, a gyro sensor (angular velocity sensor), a tilt sensor (acceleration sensor), and the like. The altitude/attitude detection unit 45 detects altitude information regarding the altitude of the imaging device 5, tilt angle information regarding the tilt angle of the imaging device 5 (the tilt angle relative to the reference attitude in which the four rotors 4a lie in a horizontal plane), and rotation angle information regarding the rotation angle of the moving device 4 (the rotation angle about a vertical line passing through the center position of the four rotors 4a), and outputs the detection results to the first control unit 49.
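Where an atmospheric pressure sensor is used as above, altitude can be derived from the pressure reading. The following is a minimal sketch using the standard international barometric formula; the constants and the sample reading are illustrative assumptions, not values from the embodiment.

# A minimal sketch of barometric altitude estimation; the standard-atmosphere
# constants and the sample pressure are illustrative assumptions.
def altitude_from_pressure(pressure_hpa: float, sea_level_hpa: float = 1013.25) -> float:
    # International barometric formula, valid in the lower troposphere.
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

print(altitude_from_pressure(1000.0))  # roughly 110 m above sea level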
The first recording unit 46 is configured using a recording medium such as a flash memory or an SDRAM (Synchronous Dynamic Random Access Memory). The first recording unit 46 records various programs for driving the moving device 4 and temporarily records information being processed.

The first communication unit 47 is configured using a predetermined communication module. Under the control of the first control unit 49, the first communication unit 47 performs wireless communication with the operating device 3, which remotely controls the flight operation of the moving device 4, and transmits information input from the imaging device 5 to the operating device 3.

The second communication unit 48 is configured using a communication module. Under the control of the first control unit 49, the second communication unit 48 communicates with the imaging device 5 via the cable 6.
The first control unit 49 determines whether movement is possible in accordance with a change in the relative relationship between the pre-designated imaging target and the imaging device 5, and performs control according to the determination result. The first control unit 49 is configured using a CPU or the like and, in accordance with instruction signals from the operating device 3 input via the first communication unit 47 and data from the imaging device 5 input via the second communication unit 48, controls the operation of the propulsion unit 41 (the flight operation of the moving device 4) as well as the communication with the imaging device 5 and the operating device 3.

The first control unit 49 includes a power determination unit 491, a movement determination unit 492, an attitude determination unit 493, a propulsion control unit 494, a direction control unit 495, and a first communication control unit 496.

The power determination unit 491 detects the remaining capacity of the power source 42 (remaining power) and, based on this detection result, determines the flight time and flight distance available to the moving device 4.
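A rough sketch of the kind of estimate the power determination unit 491 could derive is shown below; the constant-draw discharge model and the sample figures are assumptions for illustration only.

# A rough sketch of estimating available flight time and distance from the
# remaining power; the constant-draw model and sample figures are assumptions.
def remaining_flight_time_s(remaining_wh: float, avg_draw_w: float) -> float:
    # Hours of stored energy divided by average power, converted to seconds.
    return 3600.0 * remaining_wh / avg_draw_w

def remaining_flight_distance_m(remaining_wh: float, avg_draw_w: float,
                                cruise_speed_mps: float) -> float:
    # Reachable distance at a constant cruise speed.
    return remaining_flight_time_s(remaining_wh, avg_draw_w) * cruise_speed_mps

print(remaining_flight_distance_m(20.0, 180.0, 8.0))  # 3200.0 m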
The movement determination unit 492 determines the movable distance of the moving device 4 based on the spatial information acquired by the spatial information acquisition unit 43, the current position information and azimuth information detected by the position/orientation detection unit 44, the altitude information detected by the altitude/attitude detection unit 45, the flight time information determined by the power determination unit 491, and the like. The function of the movement determination unit 492 may instead be provided in the second control unit 56 of the imaging device 5.

The attitude determination unit 493 determines the attitude of the imaging device 5 based on the tilt angle information and rotation angle information detected by the altitude/attitude detection unit 45.

The propulsion control unit 494 propels the moving device 4 based on the determination results of the power determination unit 491, the movement determination unit 492, and the attitude determination unit 493. Specifically, the propulsion control unit 494 raises, advances, hovers, retreats, and lowers the moving device 4 by independently controlling the rotational speeds of the four rotors 4a. For example, the propulsion control unit 494 changes the moving direction of the moving device 4 forward or backward (with the nose direction of the moving device 4 as forward) or left or right (as seen in the nose direction of the moving device 4).
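The following is a minimal sketch of how independently controlled rotor speeds can realize the climb, translation, and turn behavior described above, using a conventional X-configuration mixing rule; the rotor layout and sign conventions are assumptions, not taken from the embodiment.

# A minimal sketch of a conventional quadrotor mixing rule; the rotor layout
# and sign conventions are illustrative assumptions.
def mix_rotor_commands(throttle: float, pitch: float, roll: float, yaw: float):
    # Returns speed commands for (front-left, front-right, rear-left, rear-right).
    return (
        throttle + pitch + roll + yaw,  # front-left
        throttle + pitch - roll - yaw,  # front-right
        throttle - pitch + roll - yaw,  # rear-left
        throttle - pitch - roll + yaw,  # rear-right
    )

# Pure climb: all four rotors speed up equally.
print(mix_rotor_commands(0.6, 0.0, 0.0, 0.0))  # (0.6, 0.6, 0.6, 0.6)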
The direction control unit 495 rotates the moving device 4 based on the determination results of the power determination unit 491, the movement determination unit 492, and the attitude determination unit 493. Specifically, the direction control unit 495 rotates the moving device 4 by independently controlling the rotational speeds of the four rotors 4a.

The first communication control unit 496 controls the communication of the first communication unit 47 and the second communication unit 48. Specifically, the first communication control unit 496 transmits information input from the imaging device 5 via the second communication unit 48 to the operating device 3 via the first communication unit 47. The first communication control unit 496 also transmits information including the operation information input from the operating device 3 via the first communication unit 47 (hereinafter, "moving device information") to the imaging device 5 via the second communication unit 48. Here, the moving device information consists of the current position of the moving device 4; control information (operation information) relating to the control signals transmitted from the operating device 3 to control the movement of the moving device 4; and the moving direction, altitude, and state of the moving device 4 (for example, normal movement, return movement, or remaining battery level).
Next, the configuration of the imaging device 5 will be described. The imaging device 5 detects information necessary for the moving device 4 to determine whether movement is possible in accordance with a change in the relative relationship between the pre-designated imaging target and the imaging device 5 itself, and transmits the detected information to the moving device 4. The imaging device 5 is immovably fixed to the moving device 4 with its shooting direction aligned with the nose direction of the moving device 4, and can perform still image shooting and moving image shooting. The imaging device 5 may instead be fixed to the moving device 4 via a rotation mechanism that is rotatable with respect to the moving device 4, or via a stabilizer, a vibration correction mechanism (for example, a gimbal), a rig, or the like.

Next, the functional configuration of the imaging device 5 will be described. The imaging device 5 includes an imaging unit 51, an elevation/azimuth detection unit 52, an audio input unit 53, a third communication unit 54, a second recording unit 55, and a second control unit 56.
Under the control of the second control unit 56, the imaging unit 51 images the subject, that is, the imaging target, and generates image data. The imaging unit 51 includes an optical system 511 and an image sensor 512.

The optical system 511 includes a focus lens, a zoom lens, a shutter, and a diaphragm, and forms a subject image on the light receiving surface of the image sensor 512. Under the control of the second control unit 56, the optical system 511 changes shooting parameters such as zoom magnification, focus position, aperture value, and shutter speed.

The image sensor 512 receives the subject image formed by the optical system 511, generates image data by photoelectric conversion, and outputs the image data to the second control unit 56. The image sensor 512 is configured using a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor) sensor, or the like, and its image-data-generating pixels are arranged in a two-dimensional matrix. The image sensor 512 may further be provided with phase-difference pixels for detecting the distance to the subject, or the image-data-generating pixels described above may be used as phase-difference pixels.
The elevation/azimuth detection unit 52 is configured using an azimuth sensor, a three-axis acceleration sensor, a gyro sensor, and a magnetic azimuth sensor. The elevation/azimuth detection unit 52 detects elevation angle information regarding the tilt angle (elevation angle) of the imaging device 5 (the optical axis of the optical system 511) with respect to the horizontal plane, and azimuth information regarding the angle (azimuth) that the shooting direction of the imaging device 5 makes with a predetermined reference azimuth (for example, north), and outputs the detection results to the second control unit 56.

The audio input unit 53 is configured using a microphone or the like that collects external sound and generates audio data.

The third communication unit 54 communicates with the moving device 4 via the cable 6 under the control of the second control unit 56.

The second recording unit 55 is configured using a flash memory, an SDRAM, a memory card, and the like. The second recording unit 55 includes an image data recording unit 551 that records the image data generated by the imaging unit 51, and a program recording unit 552 that records various programs executed by the imaging device 5.

The second control unit 56 detects information necessary for the moving device 4 to determine whether movement is possible in accordance with a change in the relative relationship between the pre-designated imaging target and the imaging device 5 itself, and transmits the detected information to the moving device 4. The second control unit 56 is configured using a CPU or the like. In accordance with instruction signals from the operating device 3 received from the moving device 4 via the second communication unit 48, the second control unit 56 controls the shooting of the imaging unit 51 and transmits the image data generated by the imaging unit 51 to the moving device 4. In this case, instead of transmitting the image data as it is, the second control unit 56 may transmit data thinned out temporally or in pixels (resized image data) or compressed image data (reduced image data or compression-processed image data). The second control unit 56 may also transmit the image data after applying processing that improves visibility (for example, contrast enhancement processing or exposure value reduction processing), or may transmit the result of analyzing the image (the image feature amounts extracted by the subject detection unit 562, described later).
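The temporal and pixel thinning mentioned above can be illustrated with a minimal NumPy sketch; the frame shape and step values are illustrative assumptions.

# A minimal sketch of temporal and pixel thinning before transmission;
# frame shape and step values are illustrative assumptions.
import numpy as np

def thin_pixels(frame: np.ndarray, step: int = 2) -> np.ndarray:
    # Keep every `step`-th pixel along both axes (resize by decimation).
    return frame[::step, ::step]

def thin_frames(frames: list, step: int = 3) -> list:
    # Keep every `step`-th frame (temporal thinning).
    return frames[::step]

frame = np.zeros((480, 640, 3), dtype=np.uint8)
print(thin_pixels(frame).shape)  # (240, 320, 3)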
The second control unit 56 includes an image processing unit 561, a subject detection unit 562, a tracking processing unit 563, a distance calculation unit 564, a trimming unit 565, a second communication control unit 566, and an imaging control unit 567.

The image processing unit 561 performs predetermined image processing on the image data generated by the imaging unit 51. Specifically, the image processing unit 561 performs gain-up processing, white balance processing, gradation processing, synchronization (demosaicing) processing, and the like on the image data.

The subject detection unit 562 detects a subject included in the image data by extracting feature amounts from the image on which the image processing unit 561 has performed image processing. Examples of the subject include the pre-designated imaging target and objects (obstacles) present between the imaging device and the imaging target. Specifically, the subject detection unit 562 detects information such as the subject's face, the position, size, and orientation of the face, and the subject's gender, based on the luminance value of each pixel in the image data, the amount of movement of representative pixels from the previous frame, and the like. Where necessary, the subject detection unit 562 may also include an artificial intelligence circuit unit and perform detection using the results of deep learning based on big data, machine learning, and the like. Detection may also be performed with reference to the user's touch designation on the screen, voice designation, and so on; in such a case, it can be said that a designation input of the subject image to be treated as the specific subject image has been made. Here, the determination result becomes part of the image data, but the image data itself may be recorded as the determination result, the color distribution, contour, size, and the like may be converted into data, or the feature amounts of the image may be recorded. Furthermore, from such features, deep learning, machine learning, and the like may be performed to construct a system in which artificial intelligence can infer information about the object the user wants to shoot. This makes it possible to perform learning that uses frequently shot images as teacher images.
The tracking processing unit 563 tracks a preset subject in the image based on the information detected for that subject by the subject detection unit 562. At this time, the tracking processing unit 563 tracks the subject by, for example, performing pattern matching processing based on the information detected by the subject detection unit 562. Like the subject detection unit 562, the tracking processing unit 563 may also include an artificial intelligence circuit unit where necessary and perform detection using the results of deep learning based on big data, machine learning, and the like.
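As one concrete form of the pattern matching mentioned above, the following is a minimal sketch using OpenCV template matching; the file names are placeholders, and a practical tracker would also have to handle the scale and rotation changes this sketch ignores.

# A minimal sketch of pattern-matching-based tracking; file names are
# placeholders, and scale/rotation handling is omitted for brevity.
import cv2

def locate_subject(frame, template):
    # Slide the template over the frame and return the best match position.
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc, max_val  # top-left corner and similarity score

frame = cv2.imread("frame.png")       # current frame (placeholder path)
template = cv2.imread("subject.png")  # region selected at lock-on (placeholder)
location, score = locate_subject(frame, template)
print(location, score)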
The distance calculation unit 564 calculates the distance from the imaging device 5 to the subject based on, for example, the contrast change between two temporally successive frames of image data generated by the imaging unit 51, or phase-difference information from the phase-difference pixels provided in the image sensor 512. The distance calculation unit 564 may also determine the size of the subject and calculate the distance to the subject based on the change in the subject's size between frames, or may calculate the distance to the subject using the spatial information acquired by the spatial information acquisition unit 43 of the moving device 4. Besides being usable for focusing the optical system 511, the information on the distance to the subject is also important to the user steering the moving device 4, as distance information between the moving device 4 and the subject or target object. Based on the information on the distance from the imaging device 5 to the subject, the user 100 may operate the operating device 3 to designate stationary control such as hovering, or the first control unit 49 of the moving device 4 may perform the same stationary control automatically. The distance information to the subject can also be used effectively when, for example, shooting the subject in a predetermined order while changing the distance, or shooting the subject from various angles while keeping the distance constant, and it can likewise be used when the imaging device 5 performs shooting to acquire accurate 3D information.
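The size-change-based estimate mentioned above follows from the pinhole model, under which apparent image size is inversely proportional to distance. The following minimal sketch and its sample figures are illustrative assumptions.

# A minimal sketch of distance estimation from the change in subject size
# between frames (pinhole model); the sample figures are assumptions.
def distance_from_size_change(prev_distance_m: float,
                              prev_size_px: float,
                              curr_size_px: float) -> float:
    # Apparent size is inversely proportional to distance, so
    # d_new = d_old * s_old / s_new.
    return prev_distance_m * prev_size_px / curr_size_px

# The subject grew from 100 px to 125 px tall, so it is now closer:
print(distance_from_size_change(10.0, 100.0, 125.0))  # 8.0 m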
The functions of the subject detection unit 562, the tracking processing unit 563, and the distance calculation unit 564 may instead be provided in the first control unit 49 of the moving device 4.

The trimming unit 565 generates trimmed image data and thumbnail image data by performing trimming processing on the image data on which the image processing unit 561 has performed image processing.

The second communication control unit 566 performs communication control of the third communication unit 54. Specifically, the second communication control unit 566 transmits information that can be displayed on the display unit 33 of the operating device 3 to the moving device 4 via the third communication unit 54.

The imaging control unit 567 controls the shooting of the imaging unit 51. Specifically, when the imaging control unit 567 receives an instruction signal instructing shooting from the moving device 4 via the third communication unit 54, it causes the imaging unit 51 to execute shooting according to the instruction signal. In Embodiment 1, the imaging device 5 can execute moving image shooting and still image shooting.
Next, the functional configuration of the operating device 3 will be described. The operating device 3 includes a fourth communication unit 31, an operation input unit 32, a display unit 33, an audio output unit 34, a third recording unit 35, and a third control unit 36.

The fourth communication unit 31 is configured using a communication module. Under the control of the third control unit 36, the fourth communication unit 31 performs wireless communication with the first communication unit 47 of the moving device 4, transmitting the instruction signals whose input the operation input unit 32 has accepted, while receiving the image data and the like captured by the imaging device 5 and outputting them to the third control unit 36.

The operation input unit 32 is configured using buttons, switches, a cross key, and the like that accept operations by the user; it accepts the input of instruction signals corresponding to the user's operations and outputs these instruction signals to the third control unit 36. The operation input unit 32 has a touch panel that is provided over the display unit 33 and accepts the input of signals corresponding to the contact position of an external object. The operation input unit 32 may further have an audio input unit such as a microphone and accept the input of instruction signals corresponding to the user's voice input.

The display unit 33 displays various kinds of information about the moving device 4 under the control of the third control unit 36. The display unit 33 is configured using a display panel of liquid crystal, organic EL (Electro Luminescence), or the like, and, under the control of the third control unit 36, displays images corresponding to the image data generated by the imaging device 5 and various kinds of information about the imaging device 5.

The audio output unit 34 is configured using a speaker or the like that outputs sound, and outputs the sound recorded with data such as moving images captured by the imaging device 5.

FIG. 3 is a diagram schematically showing the external configuration of the operating device 3. The operating device 3 shown in FIG. 3 is provided with the display unit 33 and the operation input unit 32 on its surface. The operation input unit 32 has a left stick 32a and a right stick 32b arranged on the left and right, as well as a touch panel 32c.
FIG. 4 is a diagram schematically showing the operation assignment of the operation input unit 32 of the operating device 3 in the normal operation mode. Moving the left stick 32a forward or backward corresponds to forward or backward movement, respectively, and moving it left or right corresponds to left or right rotation, respectively. Moving the right stick 32b forward or backward corresponds to ascending or descending, respectively, and moving it left or right corresponds to a left or right dolly, respectively. Alternatively, after the user selects a desired operation on the touch panel 32c, the stick operation corresponding to that operation may be performed; in this case, it is more preferable that the display unit 33 illustrate the stick operation assignment after the selection input on the touch panel 32c. All operations may also be made performable on the touch panel 32c.
The third recording unit 35 records various programs relating to the operating device 3, and is configured using a flash memory or an SDRAM.

The third control unit 36 is configured using a CPU and controls each part of the operating device 3. The third control unit 36 includes a mode control unit 361 that controls the operation mode of the steered body 2, a third communication control unit 362 that controls communication between the fourth communication unit 31 and the first communication unit 47 of the moving device 4, and a first display control unit 363 that controls the display of the display unit 33.

The mode control unit 361 can set a normal operation mode and a lock-on mode as the operation mode of the steered body 2. The normal operation mode is a mode in which the steered body 2 operates based on the operation signals whose input the operating device 3 accepts.

The lock-on mode is a mode for tracking the subject to be locked on. In the lock-on mode, tracking processing is performed so that the locked-on subject stays at the same position in the screen and the distance to the locked-on subject is kept the same.
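In control terms, the lock-on tracking described above must drive two deviations to zero: the subject's offset from its locked screen position and the deviation from the locked distance. The following is a minimal sketch of computing those errors; the function name and sample figures are illustrative assumptions.

# A minimal sketch of the errors that lock-on tracking has to null out;
# names and sample figures are illustrative assumptions.
def lock_on_errors(locked_pos, locked_dist_m, current_pos, current_dist_m):
    # Pixel offset from the locked screen position and distance deviation.
    dx = current_pos[0] - locked_pos[0]
    dy = current_pos[1] - locked_pos[1]
    ddist = current_dist_m - locked_dist_m
    return dx, dy, ddist  # fed to the movement controller

# Subject drifted 12 px right and 0.5 m farther away than at lock-on:
print(lock_on_errors((320, 240), 10.0, (332, 240), 10.5))  # (12, 0, 0.5)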
There are two kinds of lock-on mode: the normal lock-on mode and the angle lock-on mode. The normal lock-on mode keeps the subject within the angle of view but, as shown in FIG. 5, does not constrain the orientation of the subject 200 within the screen. In contrast, the angle lock-on mode, as shown in FIG. 6, performs tracking processing so that the orientation of the subject 200 within the screen is also kept the same. Therefore, when the locked-on subject is a person, for example, the orientation of the person's face may change in the normal lock-on mode, whereas in the angle lock-on mode it does not.

When shifting from the normal operation mode to the lock-on mode, as shown in FIG. 7, the user first touches the subject 200 to be locked on in the screen of the display unit 33. The display unit 33 then displays a normal lock-on mode icon 33a, an angle lock-on mode icon 33b, and a return button 33c, as shown in FIG. 8. When a region displaying one of these icons or the button is touched, the mode control unit 361 of the operating device 3 performs the corresponding processing: it sets the lock-on mode corresponding to the touched icon, or resets the normal operation mode via the return button 33c.

When the operation mode is set to the normal lock-on mode, all operation inputs of the operation input unit 32 become invalid. When the operation mode is set to the angle lock-on mode, the operation inputs of the operation input unit 32 become invalid except, for example, the forward and backward instruction inputs of the left stick 32a. Alternatively, when the operation mode is set to the normal lock-on mode, only the forward and backward instructions of the left stick 32a may be invalidated relative to the normal operation mode.
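The mode-dependent masking of inputs described above can be illustrated as a simple lookup; the mode and input names below are placeholders, not terms from the embodiment.

# A minimal sketch of masking operation inputs by mode; the mode and input
# names are placeholders, not terms from the embodiment.
ALLOWED_INPUTS = {
    "normal": {"forward_back", "rotate", "up_down", "dolly"},
    "normal_lock_on": set(),            # all operation inputs disabled
    "angle_lock_on": {"forward_back"},  # only forward/backward remains valid
}

def accept_input(mode: str, operation: str) -> bool:
    # Whether an operation input is honored in the current mode.
    return operation in ALLOWED_INPUTS[mode]

print(accept_input("angle_lock_on", "rotate"))        # False
print(accept_input("angle_lock_on", "forward_back"))  # True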
In the lock-on mode, a composition adjustment mode can additionally be set. The composition adjustment mode is a mode in which the operating device 3 accepts only the input of instruction signals for adjusting the composition of the image displayed on the display unit 33 and does not accept the input of instruction signals relating to the movement of the moving device 4. When the composition adjustment mode is set, the user can concentrate on adjusting the composition without being distracted by steering the moving device 4.

FIG. 9 is a diagram showing a display example on the display unit 33 when the composition adjustment mode is set. The display unit 33 displays the return button 33c, a shooting icon 33d, a zoom icon 33e, an ISO icon 33f, an aperture icon 33g, a shutter speed icon 33h, and an exposure correction icon 33i. When the return button 33c is touched, the display unit 33 returns to the display shown in FIG. 8. When the shooting icon 33d is touched, the operating device 3 transmits to the moving device 4 a signal instructing the imaging device 5 to shoot. When any of the zoom icon 33e, the ISO icon 33f, the aperture icon 33g, the shutter speed icon 33h, and the exposure correction icon 33i is touched, the operating device 3 transitions to a state in which it can accept changes to the parameter corresponding to the touched icon.

FIG. 10 is a diagram schematically showing an example of the operation assignment in the operation input unit 32 when the composition adjustment mode is set. Moving the left stick 32a forward or backward corresponds to an angle change by rotation (pitch) about the left-right direction of the airframe, and moving it left or right corresponds to an angle change by rotation (roll) about the front-rear direction of the airframe. The forward and backward directions of the right stick 32b correspond to raising and lowering the parameter selected from among the zoom icon 33e, the ISO icon 33f, the aperture icon 33g, the shutter speed icon 33h, and the exposure correction icon 33i, respectively. The left and right directions of the right stick 32b correspond to a left slide and a right slide, respectively. The zoom up/down operation may instead be input by pinching out/in on the touch panel 32c, and the left and right slide operations (dolly) may be input by sliding left and right on the touch panel 32c.

The operating device 3 does not have to be configured with levers. For example, a portable terminal such as a smartphone may be realized as the operating device 3. The input of operation signals by the user may be realized by voice input, or by changing the orientation of the main body of the operating device 3 or shaking it.
The outline of the processing executed by the imaging system 1 will now be described separately for each device. First, the outline of the processing performed by the operating device 3 will be described with reference to the flowchart shown in FIG. 11. When the operating device 3 is powered on (step S101: Yes), the mode control unit 361 sets the operation mode to the normal operation mode (step S102). When the operating device 3 is not powered on (step S101: No), the operating device 3 repeats step S101.

Subsequently, the third communication control unit 362 establishes communication with the moving device 4 (step S103). The first display control unit 363 then receives image data from the moving device 4 and starts displaying the image corresponding to the received image data on the display unit 33 (step S104). After establishing communication with the moving device 4, the operating device 3 also transmits and receives necessary data other than the image data.

Thereafter, the operating device 3 performs different processing depending on whether the operation input unit 32 has accepted the input of a mode change instruction signal. First, the case where the operation input unit 32 does not accept the input of a mode change instruction signal (step S105: No) will be described. In this case, when the operation input unit 32 accepts the input of an operation signal (step S106: Yes), the operating device 3 transmits a normal control signal corresponding to the operation signal to the moving device 4 (step S107). The normal control signals here include a signal instructing the imaging device 5 to shoot a still image or moving image, a signal requesting information on the remaining level of the battery that powers the moving device 4, a signal steering the flight of the moving device 4, and the like. After step S107, and when the operation input unit 32 does not accept the input of an operation signal in step S106 (step S106: No), the operating device 3 proceeds to step S108.

In step S108, when the operating device 3 is powered off (step S108: Yes), the operating device 3 ends the processing. On the other hand, when the operating device 3 is not powered off (step S108: No), the operating device 3 returns to step S105 if the operation mode is the normal operation mode (step S109: Yes). If the operation mode is not the normal operation mode in step S109 (step S109: No), the operating device 3 returns to step S102.
Next, the case where the operation input unit 32 accepts the input of a mode change instruction signal in step S105 (step S105: Yes) will be described. The mode change here is a change to either the normal lock-on mode or the angle lock-on mode. In either mode, the subject to be locked on must be designated. The input of a signal designating the lock-on subject is realized, for example, by the touch panel provided on the screen of the display unit 33 (part of the operation input unit 32) detecting a touch on the subject to be locked on and accepting an input signal based on that subject's position. When the operation input unit 32 accepts the input of a mode change instruction signal, the mode control unit 361 changes the operation mode setting to the normal lock-on mode or the angle lock-on mode, and transmits to the moving device 4 a mode change signal instructing the change to the normal or angle lock-on mode together with the position information of the subject to be locked on (step S110).

Subsequently, when the operation input unit 32 accepts the input of an instruction signal for changing from one of the normal lock-on mode and the angle lock-on mode to the other (step S111: Yes), the mode control unit 361 changes the operation mode setting to the other lock-on mode and transmits to the moving device 4 a mode change signal instructing the change to the normal or angle lock-on mode together with the position information of the subject to be locked on (step S112).

Thereafter, when the operation input unit 32 accepts the input of a signal for ending the normal or angle lock-on mode (step S113: Yes), the operating device 3 proceeds to step S108. If the power is not turned off in step S108 (step S108: No), the operation mode is not the normal operation mode (step S109: No), so the operating device 3 returns to step S102, where the mode control unit 361 sets the operation mode to the normal operation mode.

Note that it may also be made possible to cause the imaging device 5 to shoot a still image or moving image while the operation mode is set to the normal lock-on mode or the angle lock-on mode.

In step S111, when the operation input unit 32 does not accept the input of an instruction signal for changing from one of the normal lock-on mode and the angle lock-on mode to the other (step S111: No), the operating device 3 proceeds to step S113.

In step S113, when the input of a signal for ending the normal or angle lock-on mode is not accepted (step S113: No), the operating device 3 proceeds to step S114. In step S114, when the operation input unit 32 accepts the input of an instruction signal for changing to the composition adjustment mode (step S114: Yes), the mode control unit 361 changes the operation mode setting to the composition adjustment mode (step S115).
Thereafter, when the operation input unit 32 accepts the input of a zoom operation signal (step S116: Yes), the third communication control unit 362 performs zoom control communication with the moving device 4 (step S117). When the operation input unit 32 does not accept the input of a zoom operation signal (step S116: No), the operating device 3 proceeds to step S118.

In step S118, when the operation input unit 32 accepts the input of a slide operation (step S118: Yes), the third communication control unit 362 performs slide control communication with the moving device 4 (step S119). On the other hand, when the operation input unit 32 does not accept the input of a slide operation (step S118: No), the operating device 3 proceeds to step S120.

In step S120, when the operation input unit 32 accepts the input of an angle change operation (step S120: Yes), the third communication control unit 362 performs angle change control communication with the moving device 4 (step S121). On the other hand, when the operation input unit 32 does not accept the input of an angle change operation (step S120: No), the operating device 3 proceeds to step S122.

In step S122, when the operation input unit 32 accepts the input of an exposure change operation (step S122: Yes), the third communication control unit 362 performs exposure change control communication with the moving device 4 (step S123). On the other hand, when the operation input unit 32 does not accept the input of an exposure change operation (step S122: No), the operating device 3 proceeds to step S124.

In step S124, when the operation input unit 32 accepts the input of a focus change operation (step S124: Yes), the third communication control unit 362 performs focus change control communication with the moving device 4 (step S125). On the other hand, when the operation input unit 32 does not accept the input of a focus change operation (step S124: No), the operating device 3 proceeds to step S126.

In step S126, when the operation input unit 32 accepts the input of a shooting operation (step S126: Yes), the third communication control unit 362 performs shooting control communication with the moving device 4 (step S127). When the operation input unit 32 does not accept the input of a shooting operation (step S126: No), the operating device 3 proceeds to step S128.

In step S128, when the operation input unit 32 accepts the input of a signal for ending the composition adjustment mode (step S128: Yes), the operating device 3 returns to step S108. If the power is not turned off in step S108 (step S108: No), the operation mode is not the normal operation mode (step S109: No), so the operating device 3 returns to step S102, where the mode control unit 361 sets the operation mode to the normal operation mode.

In step S128, when the operation input unit 32 does not accept the input of a signal for ending the composition adjustment mode (step S128: No), the operating device 3 returns to step S116.

In step S114, when the operation input unit 32 does not accept the input of an instruction signal for changing to the composition adjustment mode (step S114: No), the operating device 3 returns to step S111.
Next, an outline of the processing performed by the moving device 4 will be described. The moving device 4 determines whether it can move in accordance with a change in the relative relationship between an imaging target designated in advance and the imaging device 5, reads the determination result from the first recording unit 46, and performs control according to that result. The detailed processing of the moving device 4, including this movement control processing, will be described with reference to the flowcharts shown in FIGS. 12A and 12B.
First, the processing of steps S201 to S209 will be described with reference to FIG. 12A. When the moving device 4 is powered on (step S201: Yes), the first communication control unit 496 establishes communication with the imaging device 5 and the operation device 3 (step S202). If the moving device 4 is not powered on in step S201 (step S201: No), the moving device 4 repeats step S201.

Subsequently, the first communication control unit 496 starts receiving image data from the imaging device 5 and transmitting the image data to the operation device 3 (step S203). Thereafter, the moving device 4 receives image data from the imaging device 5 at predetermined intervals and transmits the image data to the operation device 3. After establishing communication with the imaging device 5 and the operation device 3, the moving device 4 also transmits and receives necessary data other than image data.

After step S203, when the moving device 4 receives a normal control signal from the operation device 3 (step S204: Yes), the first control unit 49 performs processing according to the normal control signal (step S205). The processing according to the normal control signal includes, for example, transmitting to the imaging device 5 an instruction signal for starting still image shooting or for starting or ending movie shooting, transmitting image data received from the imaging device 5 to the operation device 3, and checking the remaining battery level of the moving device 4. Thereafter, the moving device 4 proceeds to step S206.

In step S204, if the moving device 4 does not receive a normal control signal (step S204: No), the moving device 4 proceeds to step S206.
In step S206, when the moving device 4 receives a mode change signal to the normal or angle lock-on mode together with position information of the lock-on target subject (step S206: Yes), the first control unit 49 stores the lock-on reference position in the first recording unit 46 (step S207) and transmits to the imaging device 5 a signal instructing tracking of the lock-on target subject and the position information of that subject (step S208). In step S206, when the moving device 4 does not receive a mode change signal to the normal or angle lock-on mode and position information of the lock-on target subject (step S206: No), the moving device 4 proceeds to step S224, described later.

After step S208, the moving device 4 executes subject tracking control (step S209). FIG. 13 is a flowchart showing an outline of the subject tracking control processing performed by the moving device 4. First, the moving device 4 acquires from the imaging device 5 the distance to the lock-on target subject calculated by the imaging device 5, the amount of deviation of the subject from the reference position, and, when there is an obstacle, the distance to the obstacle (step S301). The "reference position" here is the center of the screen in the initial setting; when a slide operation has been performed, it is the position to which that center has been slid. The "deviation amount" here is a three-dimensional deviation amount. The detection of the distances to the subject and the obstacle by the imaging device 5 and the calculation of the deviation amount will be described in detail in the description of the processing of the imaging device 5.
After step S301, the movement determination unit 492 calculates, based on the acquired deviation amount, the movement distance of the moving device 4 required to maintain the position and size of the subject (step S302). As shown in FIG. 14, the movement determination unit 492 calculates the movement distance ΔL in the direction orthogonal to the screen as ΔL = {(a/b) − 1}L, where L is the distance to the subject 200, a is the size of the subject before the subject 200 moves, and b is the size of the subject after the subject 200 moves. Here, the movement distance ΔL is defined as positive in the direction approaching the subject 200. In other words, the movement distance is positive when the subject 200 moves and becomes smaller on the screen, and negative when the subject 200 moves and becomes larger on the screen. The movement distance in the direction parallel to the screen may be calculated in the same way.

Subsequently, when there is an obstacle on the movement path (step S303: Yes) and the movement distance of the moving device 4 is less than the distance to the obstacle (step S304: Yes), the movement determination unit 492 determines that the moving device 4 can move (step S305). The movement determination unit 492 also determines that the moving device 4 can move when there is no obstacle on the movement path (step S303: No) (step S305).

Thereafter, the propulsion control unit 494 performs movement tracking control in which it drives the propulsion unit 41 to follow the movement of the subject according to the movement distance calculated in step S302, so that the imaging device 5 can keep the subject within its angle of view (step S306). Thereafter, the moving device 4 proceeds to step S308.

In step S304, if the movement distance of the moving device 4 is equal to or greater than the distance to the obstacle (step S304: No), the movement determination unit 492 determines that the moving device 4 cannot move (step S307). Thereafter, the moving device 4 proceeds to step S308.
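The calculation and feasibility check of steps S302 to S307 can be summarized in a short sketch. The following Python fragment is a minimal illustration, not code from the patent: it assumes the subject sizes a and b are measured in pixels on the captured image, that all distances are in meters, and that a single obstacle distance along the movement path is known; all names are hypothetical.

```python
def follow_move(distance_l, size_before, size_after, obstacle_dist=None):
    """Steps S302-S307: depth movement needed to keep the subject's
    on-screen size constant, ΔL = {(a/b) - 1} * L, positive toward the
    subject (subject shrank -> move closer, subject grew -> back away)."""
    delta_l = (size_before / size_after - 1.0) * distance_l
    # Steps S303/S304: movement is possible if there is no obstacle on
    # the path, or if the required move stops short of the obstacle.
    movable = obstacle_dist is None or delta_l < obstacle_dist
    return delta_l, movable

# The subject moved away: its image shrank from 120 px to 100 px at 5 m.
delta, ok = follow_move(5.0, 120, 100, obstacle_dist=0.8)
print(delta, ok)  # 1.0 m toward the subject, blocked by an obstacle at 0.8 m
```

Under these assumptions the unit would report that movement is not possible (step S307), matching the flowchart's conservative handling of obstacles on the path.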
In step S308, when the moving device 4 is set to the angle lock-on mode (step S308: Yes), the first control unit 49 performs imaging direction tracking control (step S309). FIG. 15 is a flowchart showing an outline of the imaging direction tracking control processing performed by the moving device 4. When the angle lock-on mode is set, the imaging device 5 sends changes in the direction of the subject; the processing of the imaging device 5 at this time will be described later. When the moving device 4 acquires from the imaging device 5 information indicating that the direction of the lock-on target subject has changed (step S401: Yes), the movement determination unit 492 determines, based on that information, whether the moving device 4 can move (step S402). The movement determination unit 492 adopts, for example, a model that approximates the subject's face as a sphere, and determines whether the moving device 4 can rotate to the reference angle while keeping its distance from the rotation center, taken as the center of the sphere, constant.

When the movement determination unit 492 determines that movement is possible (step S403: Yes), the direction control unit 495 starts changing the imaging direction (step S404). For example, when the model that approximates the face as a sphere is adopted, the direction control unit 495 rotates the moving device 4 by the angle calculated in step S402 while keeping its distance from the rotation center, taken as the center of the sphere, constant. When the attitude of the imaging device 5 relative to the moving device 4 can be changed, the moving device 4 may instead move while maintaining its own attitude and change the imaging direction by changing the attitude of the imaging device 5 relative to the moving device 4. In this case, the movement of the moving device 4 and the attitude change of the imaging device 5 may be performed simultaneously, or the attitude of the imaging device 5 may be changed after the moving device 4 has moved.
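The rotation of steps S402 and S404 can be pictured as an orbit on a sphere around the subject. The sketch below is a simplified, hypothetical model that assumes the orbit stays in the horizontal plane and that the sphere center and radius are already known from the face-as-sphere approximation; none of the names come from the patent.

```python
import math

def orbit_position(center, radius, start_angle, delta_angle):
    """Step S404: rotate about the sphere center by delta_angle while
    keeping the distance to the center (the radius) constant.
    Angles are in radians; the orbit is horizontal in this sketch."""
    angle = start_angle + delta_angle
    x = center[0] + radius * math.cos(angle)
    y = center[1] + radius * math.sin(angle)
    return (x, y, center[2])  # altitude is left unchanged here

# The subject's face turned 30 degrees; orbit from directly in front.
new_pos = orbit_position(center=(0.0, 0.0, 1.5), radius=2.0,
                         start_angle=0.0, delta_angle=math.radians(30))
```

Keeping the radius constant is what preserves the subject's on-screen size while the imaging direction follows the change in the subject's orientation.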
After step S404, when lost information on the subject is received from the imaging device 5 while the imaging direction of the moving device 4 is being changed (step S405: Yes), the direction control unit 495 performs control to stop the rotation of the moving device 4 (step S406) and transmits the lost information to the operation device 3 (step S407). Here, "when the subject is lost" includes the case where part of the subject has disappeared from the image, in other words, "when the subject is about to be lost." After step S407, the moving device 4 ends the imaging direction tracking control and returns to the main routine.

In step S405, when lost information on the subject is not received from the imaging device 5 while the imaging direction of the moving device 4 is being changed (step S405: No) and the change of the imaging direction has been completed (step S408: Yes), the moving device 4 ends the imaging direction tracking control and returns to the main routine. When the change of the imaging direction has not been completed in step S408 (step S408: No), the moving device 4 returns to step S405.

In step S401, when the moving device 4 does not acquire from the imaging device 5 information indicating that the direction of the lock-on target subject has changed (step S401: No), and when the movement determination unit 492 determines in step S403 that movement is not possible (step S403: No), the moving device 4 ends the imaging direction tracking control and returns to the main routine.

In step S308, when the moving device 4 is not set to the angle lock-on mode (step S308: No), the moving device 4 ends the subject tracking control and returns to the main routine.
The processing after step S209 described above will be described with reference to FIG. 12B. The first control unit 49 determines whether a control signal instructing a change to the composition adjustment mode has been received (step S210). First, the case where the composition adjustment control signal has been received (step S210: Yes) will be described. In this case, when a zoom operation signal is received (step S211: Yes), the first control unit 49 performs zoom control (step S212).

FIG. 16 is a flowchart showing an outline of the zoom control processing performed by the moving device 4. As a precondition for the zoom processing, the power determination unit 491 first determines that the remaining battery level is equal to or greater than a predetermined value (for example, 50% or more of the full charge value). The first control unit 49 then transmits a zoom control instruction signal to the imaging device 5 (step S501).
Thereafter, when a movement request is received from the imaging device 5 (step S502: Yes), the movement determination unit 492 determines whether movement is possible (step S503). The movement request is sent from the imaging device 5 when the imaging device 5 determines that optical zoom cannot be performed. The movement determination unit 492 determines whether a movement corresponding to the zoom-in or zoom-out amount indicated by the zoom operation signal received from the operation device 3 is possible. Specifically, as shown in FIG. 17, the movement determination unit 492 calculates the movement distance ΔL′ in the direction orthogonal to the screen as ΔL′ = {1 − (c/d)}L, where L is the distance to the subject, c is the size of the subject before the imaging device 5 moves, and d is the size of the subject after the imaging device 5 moves. The determination of whether movement is possible based on the calculated movement distance ΔL′ is the same as in steps S303 to S307 described with reference to FIG. 13. Here too, the movement distance in the direction parallel to the screen may be calculated in the same way.
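As a hedged illustration of step S503, the fragment below expresses the requested zoom as a target on-screen size d for a current size c and reuses the obstacle rule of steps S303 to S307; the function and parameter names are invented for this sketch.

```python
def zoom_move(distance_l, size_now, size_target, obstacle_dist=None):
    """Step S503: ΔL' = {1 - (c/d)} * L is the movement that reproduces
    a zoom from on-screen size c to size d; positive values approach
    the subject (zoom in), negative values back away (zoom out)."""
    delta_l = (1.0 - size_now / size_target) * distance_l
    movable = obstacle_dist is None or delta_l < obstacle_dist
    return delta_l, movable

# Doubling the subject's on-screen size at 4 m means moving 2 m closer.
print(zoom_move(4.0, 100, 200, obstacle_dist=3.0))  # (2.0, True)
```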
As a result of the determination in step S503, when the movement determination unit 492 determines that movement is possible (step S504: Yes), the propulsion control unit 494 drives the propulsion unit 41 to control the movement of the moving device 4 (step S505). Here, when an obstacle exists in the movement direction, for example, the propulsion control unit 494 performs control to move the moving device 4 to a predetermined position short of the obstacle.

Subsequently, when lost information on the subject is received from the imaging device 5 while the moving device 4 is moving (step S506: Yes), the propulsion control unit 494 performs control to stop the movement of the moving device 4 (step S507) and transmits the lost information to the operation device 3 (step S508). Here too, "when the subject is lost" includes the case where part of the subject disappears from the image, that is, "when the subject is about to be lost." After step S508, the moving device 4 ends the zoom control and returns to the main routine.

If lost information on the subject is not received from the imaging device 5 while the moving device 4 is moving (step S506: No) and the movement has been completed (step S509: Yes), the first control unit 49 transmits movement completion information to the imaging device 5 (step S510). Thereafter, the moving device 4 ends the zoom control and returns to the main routine. When the movement has not been completed in step S509 (step S509: No), the moving device 4 returns to step S506.

When no movement request is received from the imaging device 5 in step S502 (step S502: No), and when the determination result in step S504 is that movement is not possible (step S504: No), the moving device 4 returns to the main routine.
Returning to the flowchart of FIG. 12B, the description continues with the case where no zoom operation signal is received in step S211 (step S211: No). In this case, when the first control unit 49 receives a slide operation signal (step S213: Yes), the first control unit 49 performs slide control (step S214).
FIG. 18 is a flowchart showing an outline of the slide control processing performed by the moving device 4. First, the movement determination unit 492 determines whether the moving device 4 can perform a slide operation (step S601). When the slide operation is possible (step S601: Yes), the propulsion control unit 494 starts the slide operation (step S602). When the slide operation starts, the first control unit 49 also starts transmitting information on the slide operation to the imaging device 5. The "information on the slide operation" here includes, for example, information on the amount and direction of the slide.

After step S602, when the moving device 4 receives lost information on the subject from the imaging device 5 during the slide operation (step S603: Yes), the propulsion control unit 494 performs control to stop the slide operation of the moving device 4 (step S604) and transmits the lost information to the operation device 3 (step S605). Here too, "when the subject is lost" includes the case where part of the subject disappears from the image, that is, "when the subject is about to be lost." After step S605, the moving device 4 returns to the main routine.

When the subject is not lost during the movement of the moving device 4 in step S603 (step S603: No) and the slide operation of the moving device 4 has been completed (step S606: Yes), the first control unit 49 stores the reference position as changed by the slide operation in the first recording unit 46 (step S607) and transmits that reference position to the imaging device 5 (step S608). Thereafter, the moving device 4 ends the slide control and returns to the main routine. When the slide operation of the moving device 4 has not been completed in step S606 (step S606: No), the moving device 4 returns to step S603.

In step S601, when the movement determination unit 492 determines that the slide operation of the moving device 4 is not possible (step S601: No), the first control unit 49 transmits a signal requesting image trimming to the imaging device 5 (step S609). At this time, the first control unit 49 transmits information on the slide amount to the imaging device 5 together with the request signal.

Thereafter, the moving device 4 receives trimmed image data or error information from the imaging device 5 (step S610). The error information is sent from the imaging device 5 when the imaging device 5 determines that trimming is impossible.

The moving device 4 transmits the trimmed image data or error information received from the imaging device 5 to the operation device 3 (step S611). Thereafter, the moving device 4 ends the slide control and returns to the main routine.
Returning to the flowchart of FIG. 12B, the case where the first control unit 49 does not receive a slide operation signal in step S213 (step S213: No) will be described. In this case, when the first control unit 49 receives an angle change operation signal (step S215: Yes), the direction control unit 495 performs angle change control (step S216). When performing the angle change control, the direction control unit 495 determines whether the processing can be executed and, if it can, executes the movement of the moving device 4 and the change of its tilt angle (attitude). Instead of executing the movement and the angle change of the moving device 4 simultaneously, the angle of the moving device 4 may be changed after the moving device 4 has moved.

The case where the first control unit 49 does not receive an angle change operation signal in step S215 (step S215: No) will be described. In this case, when the first control unit 49 receives a focus change operation signal (step S217: Yes), the first control unit 49 transmits a control signal instructing a focus change to the imaging device 5 (step S218).

The case where the first control unit 49 does not receive a focus change operation signal in step S217 (step S217: No) will be described. In this case, when the first control unit 49 receives an exposure change operation signal (step S219: Yes), the first control unit 49 transmits a control signal instructing an exposure change to the imaging device 5 (step S220).

The case where the first control unit 49 does not receive an exposure change operation signal in step S219 (step S219: No) will be described. In this case, when the first control unit 49 receives a shooting operation signal (step S221: Yes), the first control unit 49 transmits a control signal instructing shooting to the imaging device 5 (step S222).

In step S221, when the first control unit 49 does not receive a shooting operation signal (step S221: No), the moving device 4 proceeds to step S223. In step S223, when the first control unit 49 receives a composition adjustment mode end signal from the operation device 3 (step S223: Yes) and the moving device 4 is then powered off (step S224: Yes), the moving device 4 ends the processing. When the moving device 4 is not powered off in step S224 (step S224: No), the moving device 4 returns to step S204. On the other hand, when the first control unit 49 does not receive a composition adjustment mode end signal (step S223: No), the moving device 4 returns to step S209.
The case where no control signal instructing a change to the composition adjustment mode is received in step S210 (step S210: No) will be described. In this case, when the operation mode is the composition adjustment mode (step S225: Yes), the moving device 4 proceeds to step S211.

When the operation mode is not the composition adjustment mode in step S225 (step S225: No) and the first control unit 49 receives a lock-on mode end signal from the operation device 3 (step S226: Yes), the moving device 4 transmits a lock-on mode end signal to the imaging device 5 (step S227) and proceeds to step S224. On the other hand, when the first control unit 49 does not receive a lock-on mode end signal from the operation device 3 (step S226: No), the moving device 4 returns to step S209.

Of the processing described above, the processing of steps S211 to S222, which corresponds to the composition adjustment mode, can also be performed in parallel.
Next, an outline of the processing performed by the imaging device 5 will be described. The imaging device 5 detects the information the moving device 4 needs in order to determine whether it can move in accordance with a change in the relative relationship between the imaging target designated in advance and the imaging device 5, and assists the processing of the moving device 4 by reading the detected information from the second recording unit 55 and transmitting it to the moving device 4. The detailed processing of the imaging device 5, including this movement assist processing, will be described with reference to the flowcharts shown in FIGS. 19A and 19B.
First, the processing of steps S701 to S713 will be described with reference to FIG. 19A. When the imaging device 5 is powered on (step S701: Yes), the imaging control unit 567 performs control to start imaging (step S702). If the imaging device 5 is not powered on in step S701 (step S701: No), the imaging device 5 repeats step S701.

After step S702, the second communication control unit 566 establishes communication with the moving device 4 (step S703). Subsequently, the second communication control unit 566 starts transmitting the image data generated by the image processing unit 561 to the moving device 4 (step S704).

Subsequently, when a normal control signal is received from the moving device 4 (step S705: Yes), the second control unit 56 performs normal control according to the signal (step S706). The normal control here includes still image or movie shooting, transmission to the moving device 4 of information on the remaining battery level of the imaging device 5, and the like. In step S705, when no normal control signal is received from the moving device 4 (step S705: No), the imaging device 5 proceeds to step S707, described later.

In step S707, when a signal instructing tracking of the lock-on target subject and position information of the lock-on target subject are received from the moving device 4 (step S707: Yes), the subject detection unit 562 starts detecting the lock-on target subject (step S708). The subject detection unit 562 detects the subject based on, for example, its color, shape, and pattern. In step S707, when the signal instructing tracking of the lock-on target subject and the position information of the lock-on target subject are not received from the moving device 4 (step S707: No), the imaging device 5 returns to step S705.

The case where the lock-on target subject is lost after the subject detection unit 562 starts subject detection in step S708 (step S709: Yes) will be described. In this case, the second control unit 56 transmits lost information to the moving device 4 (step S710). Here, "when the subject is lost" includes the case where part of the subject has disappeared from the image, in other words, "when the subject is about to be lost."

Thereafter, when a lock-on mode end signal is received from the moving device 4 (step S711: Yes), the subject detection unit 562 ends the detection of the lock-on target subject (step S712). In step S711, when no lock-on mode end signal is received from the moving device 4 (step S711: No), the imaging device 5 returns to step S709.

After step S712, when the imaging device 5 is powered off (step S713: Yes), the imaging device 5 ends the processing. If the imaging device 5 is not powered off in step S713 (step S713: No), the imaging device 5 returns to step S705.
The processing from step S714 onward, performed when the lock-on target subject has not been lost in step S709 (step S709: No), will be described with reference to FIG. 19B. First, when the operation mode is not set to the angle lock-on mode but to the normal lock-on mode (step S714: Yes), the distance calculation unit 564 calculates the distance to the lock-on target subject and calculates the three-dimensional deviation amount from the reference position by obtaining the movement within the two-dimensional plane using motion vectors or the like (step S715). Thereafter, the imaging device 5 transmits the calculation results of step S715 to the moving device 4 (step S716).
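As a hedged sketch of step S715, assume the in-plane components come from a motion-vector estimate of the subject's shift from the reference position in pixels, and the depth component from the change in the measured subject distance; this is an assumption-laden illustration, not the patent's implementation, and every name is hypothetical.

```python
def deviation_3d(ref_xy, subject_xy, dist_ref, dist_now):
    """Step S715 (sketch): combine the 2D on-screen shift from the
    reference position with the change in subject distance into a
    three-dimensional deviation amount."""
    dx = subject_xy[0] - ref_xy[0]   # horizontal shift (pixels)
    dy = subject_xy[1] - ref_xy[1]   # vertical shift (pixels)
    dz = dist_now - dist_ref         # depth change (meters)
    return (dx, dy, dz)

# The subject drifted 40 px right, 10 px up, and 0.5 m farther away.
print(deviation_3d((320, 240), (360, 230), 5.0, 5.5))  # (40, -10, 0.5)
```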
In step S714, when the operation mode is set to the angle lock-on mode (step S714: No), the subject detection unit 562 detects the orientation of the subject (step S717). Thereafter, the imaging device 5 proceeds to step S715. In this case, the imaging device 5 also transmits the orientation of the subject to the moving device 4 in step S716.

After step S717, when the subject detection unit 562 detects an obstacle between the imaging device and the subject (step S718: Yes), the distance calculation unit 564 calculates the distance to the obstacle (step S719). Thereafter, the imaging device 5 transmits the distance to the obstacle to the moving device 4 (step S720). After step S720, and when the subject detection unit 562 does not detect an obstacle between the imaging device and the subject in step S718 (step S718: No), the imaging device 5 proceeds to step S721.

In step S721, when the imaging device 5 receives a signal instructing a change to the composition adjustment mode (step S721: Yes), the imaging device 5 proceeds to step S722. On the other hand, when the imaging device 5 does not receive a signal instructing a change to the composition adjustment mode (step S721: No), the imaging device 5 proceeds to step S711.

In step S722, when a zoom control signal is received from the moving device 4 (step S722: Yes), the imaging control unit 567 performs zoom control (step S723). In step S722, when no zoom control signal is received from the moving device 4 (step S722: No), the imaging device 5 proceeds to step S724, described later.
An outline of the zoom control processing performed by the imaging device 5 will be described with reference to the flowchart shown in FIG. 20. First, the imaging control unit 567 refers to the second recording unit 55 and calculates the change in the center of gravity of the steered object 2 based on the instructed zoom amount (step S801).
When the change in the center of gravity of the steered object 2 is equal to or less than a predetermined threshold (step S802: No), the influence of the zoom is small and there is no concern about losing balance, so the imaging control unit 567 determines whether optical zoom is possible (step S803). The determination of step S802 may also be made using a quantity other than the change in the center of gravity. Specifically, a threshold may be set, for example, for the amount of movement of the mechanism that provides the buoyancy of the steered object 2, for a value converted into the output of a gyro sensor or the like with which the steered object 2 keeps its balance, or for a value converted from the positional change of the weight in the optical axis direction, and step S803 may be performed when that value is equal to or less than the threshold. Alternatively, a threshold (for example, 10%) may be set for the ratio of the amount of movement of the center of gravity of the steered object 2 in the optical axis direction of the imaging device 5 to the overall length of the steered object 2, and step S803 may be performed when the ratio is equal to or less than this threshold. Although a determination based on a simple threshold has been described here, the imaging control unit 567 may determine the presence or absence of a change in the center of gravity using a plurality of parameters.

When optical zoom is possible (step S803: Yes), the imaging control unit 567 performs optical zoom control on the optical system 511 (step S804). Whether optical zoom is possible is determined based on conditions such as the type and characteristics of the lens and the required image quality. At the time of optical zoom, a signal indicating this may be sent to the moving device 4 before the zoom control so that the moving device 4 can perform a stabilization operation or the like in preparation for unexpected state fluctuations; such processing of the imaging device 5 may be performed during the processing of step S802 or S803, or as a separate process. Furthermore, instead of making the determination from the viewpoint of flight or floating using the change in the center of gravity of the steered object 2, a determination based on the influence of wind may be made by providing a wind speed sensor, or a determination based on the influence of vibration in the steered object 2 may be made by providing a vibration sensor, and determinations based on these influences may be combined with the determination described above to reach a comprehensive judgment. For example, a determination to zoom when the wind is strong and makes it difficult to approach the target may be used in combination.
Thereafter, when the imaging control unit 567 determines, based on the information from the imaging device 5, that the movement to the requested zoom position has been completed (step S805: Yes), the imaging control unit 567 ends the zoom control. On the other hand, when the imaging control unit 567 determines that the movement to the requested zoom position has not been completed (step S805: No), the imaging control unit 567 performs electronic zoom up to the requested zoom position (step S806). Thereafter, the imaging control unit 567 ends the zoom control.

When optical zoom is not possible in step S803 (step S803: No), the imaging device 5 proceeds to step S806.
Next, the case where the change in the center of gravity of the steered object 2 calculated in step S801 is greater than the threshold (step S802: Yes) will be described. In this case, the imaging control unit 567 compares the amount of motion of the subject between frames with a predetermined threshold (step S807). When the amount of motion of the subject is equal to or greater than the threshold (the motion is large) (step S807: Yes), processing according to the magnitude relationship between the zoom speed and the speed of approach toward the subject is performed. Specifically, when the enlargement speed of the subject image by zooming is greater than the enlargement speed by approaching (step S808: Yes), and the imaging device 5 can execute optical zoom and electronic zoom (step S809: Yes), the processing proceeds to step S804 and optical zoom control is performed. The reason optical zoom or the like is performed even when the movement of the center of gravity of the steered object 2 is determined to be large is that the steered object 2 may still be able to move safely while its posture is disturbed. In this way, the imaging device 5 may judge comprehensively whether optical zoom or the like can be executed, from the surrounding environment at the time of the determination, the stability of the steered object 2, and the performance information of the moving device 4.

Here too, instead of making the determination from the viewpoint of flight or floating using the change in the center of gravity of the steered object 2, a determination based on the influence of wind or of vibration in the steered object 2 may be made, and determinations based on these influences may be combined with the determination described above to reach a comprehensive judgment. For example, a determination to zoom when the wind is strong and makes it difficult to approach the target may be used in combination. When the influence of vibration in the steered object 2 is large, enlarging by approaching the target may be more advantageous in terms of blur and framing than enlarging by zooming toward telephoto, so a judgment based on the influence of wind and of vibration in the steered object 2 may be added to the processing of step S809.

On the other hand, when the amount of motion of the subject is smaller than the threshold (the motion is small) in step S807 (step S807: No), the subject is regarded as unlikely to escape, and the imaging device 5 proceeds to step S810, described later. The amount of motion of the subject may also be determined based on a change in the distance to the subject, a change in the size of the subject image, or the like, particularly in view of the fact that the optical axis direction of the imaging lens is what matters for focus.

Although the outline of the processing has been described here using several branches in order to simplify the description and clarify the features of the invention, the various determination processes at these branches preferably use further variables and are controlled so as to balance, or optimize, the optical, mechanical, and electrical control of the imaging device 5 against the movement control of the moving device 4. An application in which the portions described as "optical zoom" are replaced with "a change in the center of gravity due to focus position movement" or the like is also possible. In this case, when the influence of the change in the center of gravity affects shooting or movement control, control may be performed so as to reach the focus position by moving the steered object 2 rather than by focusing.
When the enlargement speed of the subject image by zooming is greater than the enlargement speed by approaching (step S808: Yes) but the imaging device 5 cannot execute optical zoom or electronic zoom (step S809: No), and when the enlargement speed of the subject image by zooming is equal to or less than the enlargement speed by approaching (step S808: No), the subject detection unit 562 detects obstacles in the image (step S810) and transmits the obstacle detection result and a movement request to the moving device 4 (step S811). This movement request is a request to move up to a point short of the obstacle. Here too, a determination based on the influence of wind or of vibration in the steered object 2 may be made, and determinations based on these influences may be combined with the determination described above to reach a comprehensive judgment. The motion of the subject can be detected from temporal changes in the image, but it can sometimes be predicted from the image or the like even when the subject is not actually moving, and such predictions may also be used. What kind of target moves in what way, and how it will move afterward, can also be determined by performing machine learning on image changes.

Although the features of the imaging device 5 are described here, there are also corresponding features of the moving device 4 that communicates with the imaging device 5, and of the steered object 2 that includes the imaging device 5 and the moving device 4. That is, in order for the steered object 2, which includes the moving device 4 that can communicate with the imaging device 5 and that holds the imaging device 5 and can move together with it, to determine whether it can move in accordance with a change in the relative relationship between the imaging target designated in advance and the imaging device 5, information such as changes in the center of gravity of the imaging device 5, motion such as the agility of the target, predictions of those changes and motions, the performance of the steered object 2, and the environment around the steered object 2 is required. It is one of the features of the first embodiment that the steered object control unit 21 of the steered object 2 detects such information, uses it to determine whether movement according to a change in the relative relationship between the imaging target and the imaging device 5 is possible, and performs control according to the determination result.
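The branching of steps S801 to S811 condenses into a small decision function. The sketch below assumes the center-of-gravity shift is expressed as a fraction of the body length (mirroring the 10% example above) and that the zoom capability flags are already known; the thresholds and names are illustrative, and the secondary wind and vibration judgments discussed above are omitted for brevity.

```python
def zoom_decision(cg_shift_ratio, subject_motion, zoom_speed, approach_speed,
                  optical_ok, electronic_ok,
                  cg_threshold=0.10, motion_threshold=5.0):
    """Steps S801-S811 (sketch): choose between optical zoom, electronic
    zoom, and asking the moving device to physically approach."""
    if cg_shift_ratio <= cg_threshold:        # S802: zoom barely upsets balance
        # S803/S804/S806: prefer optical zoom, fall back to electronic.
        return "optical_zoom" if optical_ok else "electronic_zoom"
    if subject_motion >= motion_threshold:    # S807: the subject moves a lot
        if zoom_speed > approach_speed and optical_ok and electronic_ok:
            return "optical_zoom"             # S808/S809 -> S804
    return "request_move"                     # S810/S811: detect obstacles,
                                              # then request a physical move

# A large center-of-gravity shift with a slow subject falls through to a move.
print(zoom_decision(0.25, 2.0, 1.5, 1.0, True, True))  # "request_move"
```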
Subsequently, when the subject detection unit 562 loses the subject (step S812: Yes), the imaging device 5 transmits lost information to the moving device 4 (step S813). Thereafter, the imaging device 5 ends the zoom control and returns to the main routine.

When the subject detection unit 562 does not lose the subject in step S812 (step S812: No) and the imaging device 5 receives a movement completion notification from the moving device 4 (step S814: Yes), the imaging device 5 ends the zoom control and returns to the main routine. In step S814, when the imaging device 5 does not receive a movement completion notification from the moving device 4 (step S814: No), the imaging device 5 returns to step S812.
The description continues with reference to FIG. 19B again. After step S723, when slide information is received from the moving device 4 (step S724: Yes), the tracking processing unit 563 performs a lost determination (step S725). When the subject has not been lost (step S725: No) and the post-slide reference position has been received (step S726: Yes), the second control unit 56 writes the reference position information into the second recording unit 55 (step S727).

When the subject has been lost in step S725 (step S725: Yes), the second control unit 56 transmits lost information to the moving device 4 (step S728).

After step S727 or S728, when a focus change operation signal is received from the moving device 4 (step S729: Yes), the imaging control unit 567 operates the optical system 511 to perform focus change processing (step S730). After step S730, and when no focus change operation signal is received from the moving device 4 in step S729 (step S729: No), the imaging device 5 proceeds to step S731.

In step S731, when an exposure change operation signal is received from the moving device 4 (step S731: Yes), the imaging control unit 567 performs processing to change the exposure (step S732). After step S732, and when no exposure change operation signal is received from the moving device 4 in step S731 (step S731: No), the imaging device 5 proceeds to step S733.

In step S733, when a shooting operation signal is received from the moving device 4 (step S733: Yes), the imaging control unit 567 causes shooting to be performed according to the shooting operation signal (step S734). After step S734, and when no shooting operation signal is received from the moving device 4 in step S733 (step S733: No), the imaging device 5 proceeds to step S711.
The case where no slide information is received from the moving device 4 in step S724 (step S724: No) will be described. In this case, when a trimming request is received (step S735: Yes), the trimming unit 565 determines whether trimming is possible (step S736). This determination is made based on whether the captured image contains a margin that can supply the portion of the image not initially displayed when the reference position of the lock-on target subject is slid. When the determination result is that trimming is possible (step S736: Yes), the trimming unit 565 performs trimming (step S737) and transmits the trimmed image data to the moving device 4 (step S738). On the other hand, when the determination result is that trimming is not possible (step S736: No), the second control unit 56 transmits error information to the moving device 4 (step S739). After step S738 or S739, the imaging device 5 proceeds to step S729.
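One way to picture the margin test of steps S736 and S737 is to treat the displayed image as a crop window inside a larger captured frame; sliding the reference position then means sliding the crop window, and trimming fails when the window would leave the frame. The sketch below is built on that assumption, with invented names and a top-left rectangle convention.

```python
def trim_after_slide(frame_w, frame_h, crop_w, crop_h, crop_x, crop_y, dx, dy):
    """Steps S736-S737 (sketch): slide the crop window by (dx, dy) and
    return the new window if the captured frame still contains it;
    None corresponds to the error case of step S739."""
    new_x, new_y = crop_x + dx, crop_y + dy
    in_frame = (0 <= new_x and new_x + crop_w <= frame_w and
                0 <= new_y and new_y + crop_h <= frame_h)
    return (new_x, new_y, crop_w, crop_h) if in_frame else None

# 4000x3000 captured frame, 1920x1080 crop at (1000, 900), slide 200 px right.
print(trim_after_slide(4000, 3000, 1920, 1080, 1000, 900, 200, 0))
# -> (1200, 900, 1920, 1080); sliding past the frame edge would return None
```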
In step S735, when no trimming request is received (step S735: No), the imaging device 5 proceeds to step S729.

In the above description, the imaging device 5 has been described as not changing its angle relative to the moving device 4, but the imaging device 5 may be configured so that its angle relative to the moving device 4 can be changed. In that case, it suffices for the imaging device 5 to receive an angle change instruction signal from the moving device 4 and perform control to change its angle relative to the moving device 4.
According to the first embodiment of the present invention described above, the moving device determines whether it can move in accordance with a change in the relative relationship between the imaging target designated in advance and the imaging device and performs control according to the determination result, while the imaging device detects the information the moving device needs in order to make that determination and transmits it to the moving device. Appropriate processing can therefore be performed even in a situation where the movement route at the time of shooting is not fixed.

Further, according to the first embodiment, when the composition adjustment mode is set, the user can leave the steering to the imaging system and concentrate on adjusting the composition. Therefore, even in an environment where a subject is being tracked, the user can concentrate on shooting without being distracted by the processing for tracking.
(Embodiment 2)
 In the second embodiment of the present invention, the steered object is an endoscope. The moving device extends and retracts the distal end of the endoscope and holds the imaging device inside it. The moving device is wirelessly connected to the operating device so that the two can communicate; it is driven in accordance with instructions from the operating device, and it receives instruction signals for the imaging device and forwards them to the imaging device. The steered object has a lock-on function: when the steered object (endoscope) is inserted into a subject body and an examination is performed, if the imaging object (subject) moves relative to the imaging device, the imaging object is tracked while it is sequentially determined whether the movement can be followed. During this tracking, the moving device approaches or retreats from the subject as necessary so that the size of the imaging object does not change.
 FIG. 21 is a diagram showing a schematic configuration of an imaging system according to the second embodiment of the present invention. FIG. 22 is a block diagram showing the functional configuration of the imaging system according to the second embodiment. The imaging system 1A shown in FIGS. 21 and 22 includes an endoscope 2A that is inserted into the body of a subject, which is the imaging object, and images the interior of the subject; an operating device 3A that receives input of operation instruction signals for the endoscope 2A; a processor 6A that is communicably connected to the endoscope 2A and performs overall control of the imaging system 1A; and a display device 7A that displays images captured by the endoscope 2A.
 The endoscope 2A includes an insertion portion 21A that has a flexible, elongated shape and is inserted into a body cavity of the subject; an operation portion 22A that is connected to the proximal end of the insertion portion 21A and receives input of various operation signals; and a universal cord 23A that extends from the operation portion 22A in a direction different from the direction in which the insertion portion 21A extends and houses various cables connected to the processor 6A.
 The insertion portion 21A has a distal end portion 24A that houses the moving device 4A and the imaging device 5A, and a long, flexible tube portion 25A connected to the proximal end of the distal end portion 24A. The distal end portion 24A is configured using an electrostatic actuator, a conductive polymer actuator, an ultrasonic motor, or the like, and has a flexible structure capable of bending, crank bending, and so on.
 The moving device 4A and the imaging device 5A have the same configurations as the moving device 4 and the imaging device 5 described in the first embodiment, respectively. The endoscope 2A therefore functions as the steered object 2. Hereinafter, components of the moving device 4A and the imaging device 5A that correspond to components of the moving device 4 and the imaging device 5 are denoted by appending A to the reference signs of the corresponding components. For example, the reference sign of the propulsion unit of the moving device 4A is "41A".
 The moving device 4A has a cylindrical shape and is attached to the insertion portion 21A so that it can advance and retract from the distal end of the insertion portion 21A; the imaging device 5A is held inside the cylinder. The propulsion unit 41A is configured using an actuator that advances and retracts the moving device 4A at the distal end of the insertion portion 21A in response to operation instruction signals sent from the operating device 3A.
 The processor 6A has an image processing unit 61A that acquires image data captured by the endoscope 2A and performs image processing; a light source unit 62A that generates illumination light to be emitted from the distal end of the insertion portion 21A of the endoscope 2A toward the subject; and a control unit 63A that performs overall control of the imaging system 1A, including the processor 6A itself.
 The image processing unit 61A and the control unit 63A are configured using one or more of, for example, a CPU, an FPGA (Field Programmable Gate Array), and an ASIC (Application Specific Integrated Circuit).
 The light source unit 62A outputs illumination light that is emitted from the distal end of the insertion portion 21A of the endoscope 2A toward the subject. The light source unit 62A is configured using, for example, an LED (Light Emitting Diode), a laser light source, a xenon lamp, or a halogen lamp.
 The display device 7A displays images corresponding to the imaging signals that have undergone image processing by the processor 6A, and also displays various information related to the imaging system 1A. The display device 7A is configured using a display panel such as a liquid crystal or organic EL panel.
 The functions of the first control unit 49A of the moving device 4A and of the second control unit 56A of the imaging device 5A may instead be provided by the processor 6A. Likewise, the functions of the operating device 3A may be provided by the operation portion 22A of the endoscope 2A.
 In the imaging system 1A having the above configuration, the outline of the processing performed by the operating device 3A, the moving device 4A, and the imaging device 5A is substantially the same as in the first embodiment. The following describes an outline of the slide control processing specific to the second embodiment.
 First, an outline of the slide control processing performed by the moving device 4A of the endoscope 2A will be described with reference to the flowchart shown in FIG. 23. Steps S901 to S908 correspond, in order, to steps S601 to S608 of the slide control described in the first embodiment (see FIG. 18). The determination in step S901 of whether the moving device 4A can slide is made, for example, based on the result of the spatial information acquisition unit 43A detecting whether a wall of the subject's internal organs is present in the slide direction.
 The processing from step S909 onward will now be described. In step S909, the movement determination unit 492A determines whether the size of the subject image has changed before and after the slide (step S909). FIG. 24 is a diagram schematically showing the relationship between the distal end of the endoscope and the subject when the size of the subject image has changed. As shown in FIG. 24, when the slide amount is large, crank bending leaves the distal end portion 24A about Δd farther from the subject 300 than its pre-movement position, indicated by the broken line. As a result, as shown in FIG. 25, the subject image 300b after the movement is smaller than the subject image 300a before the movement. If the size of the subject image has changed (step S909: Yes), the movement determination unit 492A determines, according to the change in the size of the subject image, whether it is possible to approach or retreat from the subject so as to restore the size before the slide (step S910). This determination is made based on the information acquired by the spatial information acquisition unit 43A and the like. If the movement determination unit 492A determines in step S909 that the size of the subject image has not changed (step S909: No), the moving device 4A returns to the main routine.
 If the movement determination unit 492A determines in step S910 that approaching or retreating from the subject is possible (step S910: Yes), the propulsion control unit 494A causes the propulsion unit 41A to perform the approach or retreat operation (step S911). FIG. 26 is a diagram schematically showing a situation in which the moving device 4A has advanced from the situation shown in FIG. 24 and approached the subject 300. In this situation, as shown in FIG. 27, the imaging device 5A captures a subject image 300c of approximately the same size as the subject image 300a before the movement. After step S911, the moving device 4A returns to the main routine.
 If it is determined in step S910 that approaching or retreating from the subject is impossible (step S910: No), the first control unit 49A transmits a zoom instruction signal to the imaging device 5A (step S912).
 Thereafter, when error information indicating a zoom failure is received from the imaging device 5A (step S913: Yes), the first control unit 49A transmits the error information to the operating device 3A (step S914), and the moving device 4A then ends the slide control processing. When no error information is received from the imaging device 5A in step S913 (step S913: No), the moving device 4A ends the slide control processing.
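 Steps S909 to S912 amount to converting an observed size change back into a distance correction. One way to picture this, under a pinhole assumption that the apparent size scales inversely with distance (the embodiment itself does not commit to a particular model, and all names and units below are illustrative), is the following sketch:

    # Hedged sketch of steps S909-S912. dist_m is the current (post-slide)
    # working distance; the target distance that restores the pre-slide size
    # is dist_m * size_after/size_before, so the required advance toward the
    # subject is the difference.

    def size_keep_step(size_before_px: float, size_after_px: float,
                       dist_m: float, free_toward_m: float,
                       free_away_m: float) -> str:
        if abs(size_after_px - size_before_px) < 1e-6:
            return "no size change -> return to main routine"   # S909: No
        advance_m = dist_m * (1.0 - size_after_px / size_before_px)
        feasible = (advance_m <= free_toward_m) if advance_m > 0 \
            else (-advance_m <= free_away_m)
        if feasible:                                             # S910: Yes
            return f"advance {advance_m:+.3f} m (approach/retreat, S911)"
        return "send zoom instruction to the imaging device"     # S910: No -> S912

    # After a large slide (cf. FIG. 24), the subject shrinks from 120 px to
    # 100 px at a 0.03 m working distance: advance 0.03*(1 - 100/120) = +0.005 m.
    print(size_keep_step(120.0, 100.0, 0.03, 0.01, 0.01))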
 When the first control unit 49A determines in step S901 that the moving device 4A cannot slide (step S901: No), the subsequent processing of steps S915 to S917 corresponds, in order, to steps S609 to S611 described for the slide control of the moving device 4 in the first embodiment (see FIG. 18). After step S917, the moving device 4A ends the slide control processing.
 Next, an outline of the slide control processing performed by the imaging device 5A of the endoscope 2A will be described with reference to the flowchart shown in FIG. 28. The processing of steps S1001 to S1005 corresponds, in order, to steps S724 to S728 described in the first embodiment (see FIG. 19B).
 The processing from step S1006 onward, performed after step S1004, will now be described. When a zoom instruction signal is received from the moving device 4A (step S1006: Yes), the shooting control unit 567A determines whether zooming to the requested zoom position is possible with the optical zoom (step S1007). If zooming to the requested zoom position with the optical zoom is possible (step S1007: Yes), the shooting control unit 567A performs optical zoom control on the optical system 511A (step S1008). The imaging device 5A then ends the zoom control processing.
 When no zoom instruction signal is received from the moving device 4A in step S1006 (step S1006: No), the imaging device 5A ends the zoom control processing.
 If zooming to the requested zoom position with the optical zoom is impossible (step S1007: No), the shooting control unit 567A determines whether zooming to the requested zoom position is possible with the electronic zoom (step S1009). If zooming to the requested zoom position with the electronic zoom is possible (step S1009: Yes), the shooting control unit 567A performs the electronic zoom to the requested zoom position (step S1010). The imaging device 5A then ends the zoom control processing.
 If zooming to the requested zoom position with the electronic zoom is impossible in step S1009 (step S1009: No), the imaging device 5A transmits error information to the moving device 4A (step S1011). The imaging device 5A then ends the slide control.
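 The zoom selection in steps S1007 to S1011 is thus a simple fallback chain: optical zoom first, electronic zoom second, error last. A minimal sketch, with hypothetical magnification limits (the embodiment gives no concrete values), is:

    # Minimal sketch of the fallback in steps S1007-S1011. The limits are
    # assumptions; optical_max and electronic_max denote the maximum
    # magnification reachable by each zoom mode.

    def zoom_to(requested: float, optical_max: float, electronic_max: float) -> str:
        if requested <= optical_max:        # S1007: Yes
            return f"optical zoom to {requested:.1f}x (S1008)"
        if requested <= electronic_max:     # S1009: Yes
            return f"electronic zoom to {requested:.1f}x (S1010)"
        return "transmit error information to the moving device (S1011)"

    print(zoom_to(3.0, 4.0, 8.0))   # within the optical range
    print(zoom_to(6.0, 4.0, 8.0))   # falls back to the electronic zoom
    print(zoom_to(12.0, 4.0, 8.0))  # impossible -> error information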
 The processing of steps S1012 to S1015, performed when no slide information is received from the moving device 4A in step S1001 (step S1001: No), corresponds, in order, to the processing of steps S735 to S738 described in the first embodiment. After step S1015, the imaging device 5A ends the slide control. If trimming cannot be performed in step S1013 (step S1013: No), the imaging device 5A proceeds to step S1011 and transmits error information to the moving device 4A.
 According to the second embodiment of the present invention described above, as in the first embodiment, appropriate processing can be performed even in a situation where the movement route at the time of shooting is not fixed.
 Furthermore, according to the second embodiment, the angle of view is adjusted automatically on the device side when the user simply performs a slide operation, so the subject can be observed without any sense of incongruity.
 Embodiments for carrying out the present invention have been described above, but the present invention should not be limited only to the embodiments described. For example, although an unmanned aerial vehicle and an endoscope have been given as examples of the moving device, the invention can also be applied to a self-propelled robot, an industrial endoscope, a capsule endoscope, and the like. The lens barrel portion that holds the imaging device of a microscope may also serve as the moving device.
 In the descriptions of the flowcharts in this specification, the order of processing between steps has been indicated using expressions such as "first", "thereafter", and "subsequently", but the order of the processing necessary to carry out the embodiments described above is not uniquely determined by those expressions. That is, the order of the processing in the flowcharts described in this specification can be changed as long as no contradiction arises.
 The processing algorithms described in this specification with reference to the flowcharts can be written as programs. Such a program may be recorded in a recording unit inside a computer, or may be recorded on a computer-readable recording medium. The program may be recorded in the recording unit or on the recording medium when the computer or the recording medium is shipped as a product, or may be recorded by downloading via a communication network.
 As described above, the present invention can include various embodiments not described herein, and various design changes and the like can be made within the scope of the technical idea specified by the claims.
 1, 1A: imaging system; 2: steered object; 2A: endoscope; 3, 3A: operating device; 4, 4A: moving device; 5, 5A: imaging device; 21: steered object control unit; 36: third control unit; 49: first control unit; 56: second control unit; 492: movement determination unit; 562: subject detection unit; 563: tracking processing unit; 564: distance calculation unit; 565: trimming unit

Claims (15)

  1.  A steered object comprising: an imaging device that images a subject and generates image data; and a moving device that is communicable with the imaging device and that holds the imaging device and is movable together with the imaging device,
     wherein the steered object comprises a steered object control unit that detects information necessary for determining whether movement is possible in accordance with a change in a relative relationship between an imaging object specified in advance and the imaging device, determines, using the information, whether movement in accordance with the change in the relative relationship is possible, and performs control according to a result of the determination.
  2.  A moving device that is communicable with an imaging device that images a subject and generates image data, and that holds the imaging device and is movable together with the imaging device,
     the moving device comprising a first control unit that determines whether movement is possible in accordance with a change in a relative relationship between an imaging object specified in advance and the imaging device, and performs control according to a result of the determination.
  3.  The moving device according to claim 2, wherein the relative relationship is a distance between the imaging object and the imaging device,
     the first control unit has a movement determination unit that calculates, based on a change in the size of the imaging object captured by the imaging device caused by a change in the distance, a movement distance of the moving device for maintaining the distance, and
     the first control unit performs control for tracking the imaging object using the movement distance.
  4.  The moving device according to claim 3, wherein, when there is an obstacle on the movement path of the moving device, the movement determination unit determines whether movement is possible according to the magnitude relationship between the movement distance and the distance to the obstacle.
  5.  The moving device according to claim 2, wherein the relative relationship is an orientation of the imaging object with respect to the imaging device,
     the first control unit has a movement determination unit that determines whether the moving device can move when the orientation changes, and
     when the moving device is movable, the first control unit performs control to change an imaging direction of the imaging device so as to maintain the orientation.
  6.  The moving device according to any one of claims 3 to 5, wherein the moving device is communicable with an operating device that receives input of an operation instruction signal for the moving device,
     the first control unit transmits a zoom control instruction signal to the imaging device when a zoom operation signal for the imaging device is received from the operating device, and
     the movement determination unit determines, when a movement request is received from the imaging device after the zoom control instruction signal has been transmitted, whether movement corresponding to the zoom operation signal is possible.
  7.  The moving device according to any one of claims 3 to 6, wherein the moving device is communicable with an operating device that receives input of an operation instruction signal for the moving device,
     the movement determination unit determines, when a slide operation signal is received from the operating device, whether a slide corresponding to the slide operation signal is possible, and
     the first control unit performs control to start a slide operation when the slide is possible, and transmits slide information and a trimming request for the image data to the imaging device when the slide is not possible.
  8.  An imaging device that is communicable with a moving device, is held by the moving device, and images a subject to generate image data,
     the imaging device comprising a second control unit that detects information necessary for the moving device to determine whether movement is possible in accordance with a change in a relative relationship between an imaging object specified in advance and the imaging device, and transmits the detected information to the moving device.
  9.  The imaging device according to claim 8, wherein the second control unit has:
     a subject detection unit that detects the imaging object as a subject from the image data;
     a tracking processing unit that tracks the imaging object detected by the subject detection unit across a plurality of different sets of image data; and
     a distance calculation unit that calculates a distance between the imaging device and the imaging object using the image data and calculates an amount of deviation of the imaging object from a reference position.
  10.  The imaging device according to claim 8 or 9, wherein the second control unit has a shooting control unit that determines whether zooming is possible in accordance with a zoom instruction signal received from the moving device, performs zoom control when zooming is possible, and transmits a movement request to the moving device when zooming is not possible.
  11.  The imaging device according to any one of claims 8 to 10, wherein the second control unit has a trimming unit that, when a trimming request and slide information are received from the moving device, determines whether trimming of the image data is possible based on the trimming request and the slide information, and generates trimmed image data based on the slide information when it determines that trimming is possible.
  12.  A movement control method performed by a moving device that is communicable with an imaging device that images a subject and generates image data, and that holds the imaging device and is movable together with the imaging device, the method comprising:
     a determination step of determining whether movement is possible in accordance with a change in a relative relationship between an imaging object specified in advance and the imaging device; and
     a control step of reading a determination result of the determination step from a recording unit and performing control according to the determination result.
  13.  A movement assist method performed by an imaging device that is communicable with a moving device, is held by the moving device, and images a subject to generate image data, the method comprising:
     a detection step of detecting information necessary for the moving device to determine whether movement is possible in accordance with a change in a relative relationship between an imaging object specified in advance and the imaging device; and
     a transmission step of reading the information detected in the detection step from a recording unit and transmitting the information to the moving device.
  14.  A movement control program that causes a moving device that is communicable with an imaging device that images a subject and generates image data, and that holds the imaging device and is movable together with the imaging device, to execute:
     a determination step of determining whether movement is possible in accordance with a change in a relative relationship between an imaging object specified in advance and the imaging device; and
     a control step of reading a determination result of the determination step from a recording unit and performing control according to the determination result.
  15.  A movement assist program that causes an imaging device that is communicable with a moving device, is held by the moving device, and images a subject to generate image data, to execute:
     a detection step of detecting information necessary for the moving device to determine whether movement is possible in accordance with a change in a relative relationship between an imaging object specified in advance and the imaging device; and
     a transmission step of reading the information detected in the detection step from a recording unit and transmitting the information to the moving device.
PCT/JP2018/003690 2017-02-16 2018-02-02 Steered object, moving device, imaging device, movement control method, movement assist method, movement control program, and movement assist program WO2018150917A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-027316 2017-02-16
JP2017027316A JP2018133749A (en) 2017-02-16 2017-02-16 Controlled object, moving device, imaging apparatus, movement control method, movement assisting method, movement control program, and movement assisting program

Publications (1)

Publication Number Publication Date
WO2018150917A1 true WO2018150917A1 (en) 2018-08-23

Family

ID=63169390

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/003690 WO2018150917A1 (en) 2017-02-16 2018-02-02 Steered object, moving device, imaging device, movement control method, movement assist method, movement control program, and movement assist program

Country Status (2)

Country Link
JP (1) JP2018133749A (en)
WO (1) WO2018150917A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020100321A1 (en) * 2018-11-12 2020-05-22 Hiroyuki Nakanishi Capsule endoscope
CN113424515A (en) * 2019-02-21 2021-09-21 索尼集团公司 Information processing apparatus, information processing method, and program
CN114007938A (en) * 2019-06-18 2022-02-01 日本电气方案创新株式会社 Manipulation support device, manipulation support method, and computer-readable recording medium

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI820194B (en) * 2018-08-31 2023-11-01 日商索尼半導體解決方案公司 Electronic equipment and solid-state imaging devices
JP2020102817A (en) * 2018-12-25 2020-07-02 凸版印刷株式会社 Monitor target identification apparatus, monitor target identification system, and monitor target identification method
US20220413518A1 (en) * 2019-10-24 2022-12-29 Sony Group Corporation Movable object, information processing method, program, and information processing system
JP7219204B2 (en) * 2019-11-26 2023-02-07 弘幸 中西 unmanned aerial vehicle

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015207149A (en) * 2014-04-21 2015-11-19 薫 渡部 monitoring system and monitoring method
JP2016212465A (en) * 2015-04-28 2016-12-15 株式会社ニコン Electronic device and imaging system
JP2017021445A (en) * 2015-07-07 2017-01-26 キヤノン株式会社 Communication device, control method thereof, and program

Also Published As

Publication number Publication date
JP2018133749A (en) 2018-08-23

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18754012; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18754012; Country of ref document: EP; Kind code of ref document: A1)