WO2018150917A1 - Steered object, moving device, imaging device, movement control method, movement assist method, movement control program, and movement assist program - Google Patents

Steered object, moving device, imaging device, movement control method, movement assist method, movement control program, and movement assist program

Info

Publication number
WO2018150917A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
movement
imaging device
unit
subject
Prior art date
Application number
PCT/JP2018/003690
Other languages
English (en)
Japanese (ja)
Inventor
聡司 原
浩一 新谷
真琴 尾崎
Original Assignee
オリンパス株式会社 (Olympus Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オリンパス株式会社 (Olympus Corporation)
Publication of WO2018150917A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00: Constructional aspects of UAVs
    • B64U20/80: Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87: Mounting of imaging devices, e.g. mounting of gimbals
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C13/00: Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C13/02: Initiating means
    • B64C13/16: Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C13/18: Initiating means actuated automatically, e.g. responsive to gust detectors using automatic pilot
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00: Type of UAV
    • B64U10/10: Rotorcrafts
    • B64U10/13: Flying platforms
    • B64U10/14: Flying platforms with four distinct rotor axes, e.g. quadcopters

Definitions

  • the present invention relates to a steered object, a moving device, an imaging device, a movement control method, a movement assist method, a movement control program, and a movement assist program.
  • In one known technique, the unmanned aerial vehicle measures the distance between the airframe and the object to be imaged while flying along a predetermined flight route, and determines the zoom magnification of the camera from the result so as to image the object.
  • The present invention has been made in view of the above, and an object thereof is to provide a steered object, a moving device, an imaging device, a movement control method, a movement assist method, a movement control program, and a movement assist program that can perform appropriate processing even in a situation where the moving route at the time of shooting is not fixed.
  • In order to solve the above problems and achieve the object, a steered body according to the present invention includes: an imaging device that captures an image of a subject and generates image data; a moving device that can communicate with the imaging device and can move together with the imaging device while holding it; and a steered body control unit that detects information necessary for determining whether the steered body can move in accordance with a change in the relative relationship between a previously designated imaging target and the imaging device, determines, using the detected information, whether movement according to the change in the relative relationship is possible, and performs control according to the determination result.
  • A moving device according to the present invention is a moving device that can communicate with an imaging device that captures an image of a subject and generates image data, and that can move together with the imaging device while holding it.
  • The moving device includes a first control unit that determines whether the moving device can move in accordance with a change in the relative relationship between a previously designated imaging target and the imaging device, and performs control according to the determination result.
  • An imaging device according to the present invention is an imaging device that can communicate with a moving device, is held by the moving device, and captures an image of a subject to generate image data.
  • The imaging device includes a second control unit that detects information necessary for the moving device to determine whether it can move in accordance with a change in the relative relationship between a previously designated imaging target and the imaging device, and transmits the detected information to the moving device.
  • A movement control method according to the present invention is a movement control method performed by a moving device that can communicate with an imaging device that captures an image of a subject and generates image data, and that can move together with the imaging device.
  • The method includes a determination step of determining whether the moving device can move in accordance with a change in the relative relationship between a previously designated imaging target and the imaging device, and a step of performing control according to the determination result read from the recording unit.
  • a movement assistance method is a movement assistance method performed by an imaging apparatus that is communicable with a mobile device and is held by the mobile device and that captures an image of a subject and generates image data.
  • A movement control program according to the present invention is executed by a moving device that can communicate with an imaging device that captures an image of a subject and generates image data, and that can move together with the imaging device.
  • A movement assistance program according to the present invention is executed by an imaging device that can communicate with a moving device, is held by the moving device, and captures an image of a subject to generate image data.
  • FIG. 1 is a schematic diagram showing a schematic configuration of an imaging system according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram showing a functional configuration of the imaging system according to Embodiment 1 of the present invention.
  • FIG. 3 is a diagram schematically showing the external configuration of the operating device.
  • FIG. 4 is a diagram schematically showing operation assignment of the operation input unit of the operation device in the normal operation mode.
  • FIG. 5 is a diagram for explaining the normal lock-on mode.
  • FIG. 6 is a diagram for explaining the angle lock-on mode.
  • FIG. 7 is a diagram showing an outline of processing when shifting from the normal operation mode to the lock-on mode.
  • FIG. 8 is a diagram illustrating a display example on the display unit of the controller device when the mode is shifted to the lock-on mode.
  • FIG. 9 is a diagram illustrating a display example on the display unit of the controller device when the composition adjustment mode is set.
  • FIG. 10 is a diagram schematically illustrating an example of operation assignment in the operation input unit when the composition adjustment mode is set.
  • FIG. 11 is a flowchart illustrating an outline of processing performed by the controller device.
  • FIG. 12A is a flowchart (part 1) showing an overview of processing performed by the mobile device according to Embodiment 1 of the present invention.
  • FIG. 12B is a flowchart (part 2) illustrating an overview of the process performed by the mobile device according to Embodiment 1 of the present invention.
  • FIG. 13 is a flowchart showing an outline of subject tracking control processing performed by the mobile device according to Embodiment 1 of the present invention.
  • FIG. 14 is a diagram showing an outline of the movement distance calculation process performed by the movement determination unit of the mobile device according to Embodiment 1 of the present invention.
  • FIG. 15 is a flowchart showing an overview of processing of shooting direction tracking control performed by the mobile device according to Embodiment 1 of the present invention.
  • FIG. 16 is a flowchart showing an overview of zoom control processing performed by the mobile device according to Embodiment 1 of the present invention.
  • FIG. 17 is a diagram illustrating an outline of a moving distance calculation process performed by the movement determination unit of the moving device according to the first embodiment of the present invention during the zoom control process.
  • FIG. 18 is a flowchart showing an overview of slide control processing performed by the mobile device according to Embodiment 1 of the present invention.
  • FIG. 19A is a flowchart (part 1) illustrating an overview of processing performed by the imaging apparatus according to Embodiment 1 of the present invention.
  • FIG. 19B is a flowchart (part 2) illustrating an overview of the process performed by the imaging device according to Embodiment 1 of the present invention.
  • FIG. 20 is a flowchart illustrating an overview of zoom control processing performed by the imaging apparatus according to Embodiment 1 of the present invention.
  • FIG. 21 is a diagram showing a schematic configuration of an imaging system according to Embodiment 2 of the present invention.
  • FIG. 22 is a block diagram showing a functional configuration of the imaging system according to Embodiment 2 of the present invention.
  • FIG. 23 is a flowchart illustrating an outline of a slide control process performed by the mobile device according to the second embodiment of the present invention.
  • FIG. 24 is a diagram schematically illustrating the relationship between the distal end portion of the endoscope and the subject when the size of the subject image changes.
  • FIG. 25 is a diagram schematically showing changes in the subject image in the case shown in FIG. 24.
  • FIG. 26 is a diagram schematically illustrating a situation in which the moving device has advanced from the situation illustrated in FIG. 24 and has approached the subject.
  • FIG. 27 is a diagram schematically showing changes in the subject image in the case shown in FIG. 26.
  • FIG. 28 is a flowchart illustrating an outline of slide control processing performed by the imaging apparatus according to Embodiment 2 of the present invention.
  • An imaging system includes a steered body in which an imaging device is mounted on a moving device.
  • The steered object detects information necessary for determining whether it can move in accordance with a change in the relative relationship between a previously designated imaging target and the imaging device, determines, using the detected information, whether movement according to the change in the relative relationship is possible, and performs control according to the determination result. The sketch below illustrates this cycle.
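Expressed as pseudocode, the detect-determine-act cycle described above might look like the following minimal sketch. All function names are hypothetical placeholders; the patent does not prescribe any particular implementation.

```python
# Minimal sketch of the steered object's decide-then-act cycle; all names are
# hypothetical placeholders, not the patent's implementation.

def control_step(detect_relative_change, can_move, follow, hold_position):
    """One iteration: detect, determine movability, act on the result."""
    change = detect_relative_change()   # e.g. shift or size change of the target
    if change is None:                  # no change in the relative relationship
        return
    if can_move(change):                # movability determination
        follow(change)                  # move so as to keep tracking the target
    else:
        hold_position()                 # e.g. hover in place instead of moving
```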
  • In the following, cases where the moving device constituting the steered body is an unmanned aerial vehicle (Embodiment 1) or an endoscope (Embodiment 2) will be described by way of example.
  • These are merely examples, however, and the moving device may also be, for example, a self-propelled robot.
  • In Embodiment 1, the moving device is an unmanned aerial vehicle, and the imaging device is mounted on the unmanned aerial vehicle.
  • The moving device is wirelessly connected to the operating device so as to be communicable with it, is driven according to instructions from the operating device, and receives instruction signals intended for the imaging device and relays them to the imaging device.
  • The steered object has a lock-on function, and tracks the shooting target while sequentially determining whether it can follow the shooting target's movement.
  • FIG. 1 is a schematic diagram showing a schematic configuration of an imaging system according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram illustrating a functional configuration of the imaging system according to the first embodiment.
  • An imaging system 1 shown in FIGS. 1 and 2 includes a steered body 2 that has an imaging function and can be moved by flying, and an operating device 3 that steers imaging and movement of the steered body 2.
  • the steered body 2 includes a moving device 4 and an imaging device 5 that is detachably attached to the moving device 4.
  • the imaging device 5 may be attached to the moving device 4 via a stabilizer, a vibration correction mechanism (for example, a gimbal), a rig, or the like.
  • The moving device 4 and the imaging device 5 are connected to each other via a cable 6, such as a USB (Universal Serial Bus) cable, so as to be capable of bidirectional communication.
  • the mobile device 4 and the controller device 3 can perform wireless communication in a predetermined frequency band.
  • the mobile device 4 and the imaging device 5 are connected by the cable 6.
  • the present invention is not limited to this, and a configuration capable of wireless communication may be adopted.
  • the positional relationship between the moving device 4 and the imaging device 5 may be fixed, or the posture of the imaging device 5 with respect to the moving device 4 may be changed.
  • The steered body 2 includes a steered body control unit 21 that detects information necessary for determining whether the steered body 2 can move in accordance with a change in the relative relationship between a previously designated imaging target and the imaging device 5, determines, using the detected information, whether movement according to the change in the relative relationship is possible, and performs control according to the determination result.
  • the steered body control unit 21 is configured using a CPU (Central Processing Unit) or the like, and controls the steered body 2 according to various instruction signals transmitted from the operation device 3.
  • the steered body control unit 21 includes a first control unit 49 of the moving device 4 and a second control unit 56 of the imaging device 5 which will be described later.
  • The steered body control unit 21 is a circuit unit that performs various types of control by specific sequence control in cooperation with dedicated circuits or programs, and may include an artificial-intelligence circuit unit as necessary so that control can make use of results of deep learning, machine learning, and the like (the same applies to the third control unit 36 of the controller device 3).
  • Parts of the configuration and circuits of this control unit are classified according to function. Note that some of the individual functions of the first control unit 49 and the second control unit 56 may be included in the other control unit.
  • the moving device 4 determines whether or not the moving device 4 can move according to a change in the relative relationship between the imaging object specified in advance and the imaging device 5, and performs control according to the determination result.
  • The moving device 4 is an unmanned aerial vehicle (UAV: Unmanned Aerial Vehicle), and is configured as a rotary-wing drone having four rotors 4a.
  • the number of rotors 4a is not limited to four and may be other numbers.
  • The moving device 4 is not limited to a rotary-wing drone, and may be constituted by another unmanned aircraft such as a fixed-wing drone.
  • the mobile device 4 may be any device that can be operated wirelessly and can be self-propelled, and may be, for example, a self-propelled robot, a car, a model car, a ship, or the like.
  • The moving device 4 includes a propulsion unit 41, a power source 42, a spatial information acquisition unit 43, a position/orientation detection unit 44, an altitude/attitude detection unit 45, a first recording unit 46, a first communication unit 47, a second communication unit 48, and a first control unit 49.
  • The propulsion unit 41 causes the moving device 4 to fly by using the plurality of rotors 4a (four in the example of FIG. 1) and a plurality of motors (not shown) that drive the rotors 4a under the control of the first control unit 49.
  • the power source 42 is configured using a battery and a booster circuit.
  • the power source 42 supplies a predetermined voltage to each part of the moving device 4.
  • the spatial information acquisition unit 43 is configured using a laser radar or the like.
  • The spatial information acquisition unit 43 emits pulsed laser light radially from the moving device 4 and measures the return time of the reflected pulses, thereby acquiring spatial information about objects around the moving device 4 (the direction of each irradiation point and the distance to each irradiation point), and outputs the acquired result to the first control unit 49.
  • the spatial information acquisition unit 43 may acquire the spatial information by irradiating ultrasonic waves instead of the laser.
  • the spatial information acquisition unit 43 may further include a non-contact type proximity sensor.
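For reference, a minimal sketch of the time-of-flight relation such a unit relies on: the distance to an irradiation point is half the measured round-trip time of the pulse multiplied by the propagation speed. The names below are illustrative, not part of the patent.

```python
# Time-of-flight distance: half the round-trip time times the speed of light.
# For the ultrasonic variant, substitute the speed of sound (~343 m/s in air).
SPEED_OF_LIGHT_M_S = 299_792_458.0

def distance_from_round_trip(round_trip_s: float) -> float:
    """Distance (m) to a reflecting point from the pulse round-trip time (s)."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

print(distance_from_round_trip(200e-9))  # a 200 ns echo is ~30 m away
```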
  • the position / orientation detection unit 44 is configured using a GPS (Global Positioning System) receiver, a magnetic direction sensor, or the like.
  • The position/orientation detection unit 44 detects current position information regarding the current position of the moving device 4, and direction information regarding the angle (azimuth) that the nose direction of the moving device 4 (the direction in which the nose faces) forms with a predetermined reference orientation (for example, north), and outputs the detection results to the first control unit 49.
  • the altitude posture detection unit 45 is configured using an atmospheric pressure sensor, a gyro sensor (angular velocity sensor), an inclination sensor (acceleration sensor), and the like.
  • The altitude/attitude detection unit 45 detects altitude information regarding the altitude of the moving device 4, tilt angle information regarding the tilt angle of the moving device 4 (the tilt with respect to a reference posture in which the four rotors 4a lie in a horizontal plane), and rotation angle information regarding the rotation angle of the moving device 4 (rotation about a vertical line passing through the center of the four rotors 4a), and outputs the detection results to the first control unit 49.
  • the first recording unit 46 is configured using a recording medium such as a flash memory or an SDRAM (Synchronous Dynamic Random Access Memory).
  • the first recording unit 46 records various programs for driving the mobile device 4 and temporarily records information being processed.
  • the first communication unit 47 is configured using a predetermined communication module.
  • the first communication unit 47 performs wireless communication with the operation device 3 that remotely controls the flight operation of the mobile device 4 under the control of the first control unit 49. Further, the first communication unit 47 transmits information input from the imaging device 5 to the operation device 3 under the control of the first control unit 49.
  • the second communication unit 48 is configured using a communication module.
  • the second communication unit 48 communicates with the imaging device 5 via the cable 6 under the control of the first control unit 49.
  • The first control unit 49 determines whether the moving device 4 can move in accordance with a change in the relative relationship between a previously designated imaging target and the imaging device 5, and performs control according to the determination result.
  • the first control unit 49 is configured using a CPU or the like.
  • In accordance with instruction signals input from the operation device 3 via the first communication unit 47 and data input from the imaging device 5 via the second communication unit 48, the first control unit 49 controls the operation of the propulsion unit 41 (the flight operation of the moving device 4) and the communication with the imaging device 5 and the operation device 3.
  • the first control unit 49 includes a power supply determination unit 491, a movement determination unit 492, an attitude determination unit 493, a propulsion control unit 494, a direction control unit 495, and a first communication control unit 496.
  • The power source determination unit 491 detects the remaining amount of the power source 42 (remaining power) and, based on the detection result, determines the flight time and flight distance of which the moving device 4 is capable.
  • The movement determination unit 492 determines the movement distance of the moving device 4 based on the spatial information acquired by the spatial information acquisition unit 43, the current position information and direction information detected by the position/orientation detection unit 44, the altitude information detected by the altitude/attitude detection unit 45, the flight time information determined by the power source determination unit 491, and the like. Note that the function of the movement determination unit 492 may be included in the second control unit 56 of the imaging device 5.
  • the posture determination unit 493 determines the posture of the imaging device 5 based on the tilt angle information and the rotation angle information detected by the altitude posture detection unit 45.
  • The propulsion control unit 494 propels the moving device 4 based on the determination results of the power source determination unit 491, the movement determination unit 492, and the posture determination unit 493. Specifically, the propulsion control unit 494 raises, advances, stops, retreats, and lowers the moving device 4 by independently controlling the rotational speeds of the four rotors 4a. For example, the propulsion control unit 494 changes the moving direction of the moving device 4 forward and backward (the nose direction of the moving device 4 being the front) or left and right (left and right as viewed in the nose direction of the moving device 4).
  • the direction control unit 495 rotates the moving device 4 based on the determination results of the power source determination unit 491, the movement determination unit 492, and the posture determination unit 493. Specifically, the direction control unit 495 rotates the moving device 4 by independently controlling the number of rotations of the four rotors 4a.
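As an illustration of how independently controlled rotor speeds yield ascent/descent, translation, and rotation, here is a generic quadcopter mixer in a "+" layout. This is a textbook sketch with hypothetical sign conventions and gains, not the patent's control law.

```python
# Generic "+"-layout quadcopter mixer: each motion command is a different
# signed combination of the four rotor speeds. Names and signs are illustrative.

def mix(thrust: float, pitch: float, roll: float, yaw: float) -> list[float]:
    """Return speed commands for rotors [front, right, back, left]."""
    return [
        thrust + pitch - yaw,   # front rotor
        thrust - roll + yaw,    # right rotor
        thrust - pitch - yaw,   # back rotor
        thrust + roll + yaw,    # left rotor
    ]

# Equal speeds -> vertical motion only; a front/back difference -> pitch
# (forward/backward travel); a left/right difference -> roll; alternating
# signs -> yaw (the rotation used by the direction control unit 495).
print(mix(thrust=1.0, pitch=0.0, roll=0.0, yaw=0.1))
```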
  • the first communication control unit 496 controls communication between the first communication unit 47 and the second communication unit 48. Specifically, the first communication control unit 496 transmits information input from the imaging device 5 via the second communication unit 48 to the controller device 3 via the first communication unit 47. In addition, the first communication control unit 496 receives information including operation information input from the operation device 3 via the first communication unit 47 (hereinafter referred to as “mobile device information”) via the second communication unit 48. It transmits to the imaging device 5.
  • The moving device information includes the current position of the moving device 4, control information related to the control signals transmitted from the operating device 3 for controlling the movement of the moving device 4, the moving direction and moving speed of the moving device 4, the altitude of the moving device 4, and the state of the moving device 4 (for example, normal movement, return movement, or remaining battery level). One possible grouping of this information is sketched below.
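A minimal sketch of how the moving device information above might be grouped, purely for illustration; the patent lists only the categories of information, so the field names and types here are hypothetical.

```python
# Hypothetical container for the "moving device information" relayed from the
# moving device 4 to the imaging device 5; field names are illustrative only.
from dataclasses import dataclass

@dataclass
class MovingDeviceInfo:
    position: tuple[float, float]  # current position of the moving device 4
    control_info: bytes            # control-signal content from the operating device 3
    moving_direction_deg: float    # heading of the moving device 4
    moving_speed_m_s: float
    altitude_m: float
    state: str                     # e.g. "normal_movement" or "return_movement"
    battery_remaining_pct: float
```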
  • The imaging device 5 detects information necessary for the moving device 4 to determine whether it can move in accordance with a change in the relative relationship between a previously designated imaging target and the imaging device 5 itself, and transmits the detected information to the moving device 4.
  • the imaging device 5 is fixed so as not to move with respect to the moving device 4 in a state where the shooting direction is aligned with the nose direction of the moving device 4.
  • the imaging device 5 can perform still image shooting and moving image shooting.
  • the imaging device 5 may be fixed to the moving device 4 via a rotation mechanism that can rotate with respect to the moving device 4.
  • the imaging device 5 may be fixed to the moving device 4 via a stabilizer, a vibration correction mechanism (for example, a gimbal), a rig, or the like.
  • the imaging device 5 includes an imaging unit 51, an elevation angle direction detection unit 52, a voice input unit 53, a third communication unit 54, a second recording unit 55, and a second control unit 56.
  • the imaging unit 51 captures an image of a subject that is an object to be imaged under the control of the second control unit 56 and generates image data.
  • the imaging unit 51 includes an optical system 511 and an imaging element 512.
  • the optical system 511 includes a focus lens, a zoom lens, a shutter, and a diaphragm.
  • the optical system 511 forms a subject image on the light receiving surface of the image sensor 512.
  • the optical system 511 changes each shooting parameter such as zoom magnification, focus position, aperture value, and shutter speed under the control of the second control unit 56.
  • the image sensor 512 receives the subject image formed by the optical system 511 and performs photoelectric conversion to generate image data, and outputs the image data to the second control unit 56.
  • the imaging element 512 is configured using a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor), or the like.
  • the imaging element 512 includes pixels that generate image data arranged in a two-dimensional matrix.
  • the image sensor 512 may be further provided with a phase difference pixel for detecting the distance of the subject, or the image data generation pixel described above may be used as the phase difference pixel.
  • the elevation angle azimuth detecting unit 52 is configured using an azimuth sensor, a triaxial acceleration sensor, a gyro sensor, and a magnetic azimuth sensor.
  • The elevation angle/azimuth detection unit 52 detects elevation angle information regarding the inclination angle (elevation angle) of the imaging device 5 (the optical axis of the optical system 511) with respect to the horizontal plane, and azimuth information regarding the shooting direction of the imaging device 5 with respect to a predetermined reference orientation (for example, north).
  • the audio input unit 53 is configured using a microphone or the like that collects external sounds and generates audio data.
  • the third communication unit 54 communicates with the mobile device 4 via the cable 6 under the control of the second control unit 56.
  • the second recording unit 55 is configured using a flash memory, an SDRAM, a memory card, and the like.
  • the second recording unit 55 includes an image data recording unit 551 that records image data generated by the imaging unit 51, and a program recording unit 552 that records various programs executed by the imaging device 5.
  • the second control unit 56 detects information necessary for the moving device 4 to determine whether or not it can move in accordance with a change in the relative relationship between the photographing object specified in advance and itself, The detected information is transmitted to the mobile device 4.
  • the second control unit 56 is configured using a CPU or the like.
  • The second control unit 56 controls the imaging of the imaging unit 51 and transmits the image data generated by the imaging unit 51 to the moving device 4.
  • Instead of transmitting the image data as it is, the second control unit 56 may transmit data thinned out temporally or spatially (resized image data) or compressed image data.
  • The second control unit 56 may also transmit the image data after performing processing that increases its visibility (for example, contrast enhancement or exposure value reduction), or may transmit the result of analyzing the image (the feature amount of the image extracted by the subject detection unit 562 described later).
  • The second control unit 56 includes an image processing unit 561, a subject detection unit 562, a tracking processing unit 563, a distance calculation unit 564, a trimming unit 565, a second communication control unit 566, and an imaging control unit 567.
  • the image processing unit 561 performs predetermined image processing on the image data generated by the imaging unit 51. Specifically, the image processing unit 561 performs gain-up processing, white balance processing, gradation processing, synchronization processing, and the like on the image data.
  • the subject detection unit 562 detects a subject included in the image data by extracting the feature amount of the image subjected to the image processing by the image processing unit 561.
  • Examples of the subject include a previously designated imaging target and an object (obstacle) existing between the imaging device and the imaging target.
  • The subject detection unit 562 detects information such as the position and size of the subject based on the luminance value of each pixel in the image data, the amount of movement of representative pixels from the previous frame, and the like.
  • the subject detection unit 562 may include an artificial intelligence circuit unit as necessary to perform detection using results such as deep learning using big data and machine learning. Note that detection may be performed with reference to touch designation on the user's screen or voice designation.
  • For example, a subject image to be treated as the specific subject is designated and input.
  • The determination result may be recorded as a part of the image data, the image data itself may be recorded as the determination result, or the color distribution, outline, size, and the like may be converted into data and recorded as image feature amounts.
  • A system may also be constructed in which artificial intelligence infers, through deep learning, machine learning, or the like, the target the user wants to photograph; this makes it possible to perform learning using frequently photographed images as teacher images.
  • the tracking processing unit 563 tracks the subject in the image based on information detected by the subject detection unit 562 for a preset subject. At this time, the tracking processing unit 563 tracks the subject by performing pattern matching processing based on information detected by the subject detection unit 562, for example. Similarly to the subject detection unit 562, the tracking processing unit 563 may include an artificial intelligence circuit unit as necessary to perform detection using the results of deep learning using big data, machine learning, or the like.
  • The distance calculation unit 564 calculates the distance from the imaging device 5 to the subject based on the contrast change between two frames of image data generated by the imaging unit 51 at different times, or based on phase difference information from the phase difference pixels provided in the imaging element 512.
  • the distance calculation unit 564 may determine the size of the subject and calculate the distance to the subject based on a change in the size of the subject between frames.
  • the distance calculation unit 564 may calculate the distance to the subject using the spatial information acquired by the spatial information acquisition unit 43 of the mobile device 4.
  • The information on the distance to the subject can be used for focusing the optical system 511, and is also important to the user who operates the moving device 4 as distance information between the moving device 4 and the subject, and between the moving device 4 and other objects.
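A minimal sketch of the size-based distance estimate mentioned above: under a pinhole-camera assumption the on-screen size of a subject is inversely proportional to its distance, so a known previous distance plus the size ratio between frames gives the new distance. The names are illustrative, not the patent's computation.

```python
# Pinhole model: apparent size is inversely proportional to distance, so
# new_distance = prev_distance * prev_size / curr_size.

def distance_from_size_change(prev_distance_m: float,
                              prev_size_px: float,
                              curr_size_px: float) -> float:
    """New subject distance from the change in apparent size between frames."""
    return prev_distance_m * prev_size_px / curr_size_px

# A subject at 10 m shrinking from 200 px to 160 px has moved to about 12.5 m.
print(distance_from_size_change(10.0, 200.0, 160.0))  # 12.5
```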
  • The user 100 may perform an operation designating stationary control such as hovering by operating the operation device 3, or the first control unit 49 of the moving device 4 may automatically perform the same stationary control.
  • the distance information to the subject can be effectively used when, for example, the subject is photographed in a predetermined order while changing the distance, or when the subject is photographed from various angles with the distance kept constant.
  • the distance information can also be used when the imaging device 5 performs shooting to acquire accurate 3D information.
  • the functions of the subject detection unit 562, the tracking processing unit 563, and the distance calculation unit 564 may be included in the first control unit 49 of the moving device 4.
  • the trimming unit 565 generates trimming image data and thumbnail image data by performing trimming processing on the image data subjected to image processing by the image processing unit 561.
  • the second communication control unit 566 performs communication control of the third communication unit 54. Specifically, the second communication control unit 566 transmits information that can be displayed on the display unit 33 of the controller device 3 to the mobile device 4 via the third communication unit 54.
  • the imaging control unit 567 controls imaging of the imaging unit 51. Specifically, when the imaging control unit 567 receives an instruction signal instructing imaging from the mobile device 4 via the third communication unit 54, the imaging control unit 567 causes the imaging unit 51 to perform imaging according to the instruction signal. In the first embodiment, the imaging device 5 can perform moving image shooting and still image shooting.
  • the controller device 3 includes a fourth communication unit 31, an operation input unit 32, a display unit 33, an audio output unit 34, a third recording unit 35, and a third control unit 36.
  • the fourth communication unit 31 is configured using a communication module.
  • Under the control of the third control unit 36, the fourth communication unit 31 performs wireless communication with the first communication unit 47 of the moving device 4 to transmit instruction signals received by the operation input unit 32, and receives image data captured by the imaging device 5 and outputs it to the third control unit 36.
  • the operation input unit 32 is configured by using a button, a switch, a cross key, or the like that receives an operation by the user, receives an input of an instruction signal corresponding to the operation by the user, and outputs the instruction signal to the third control unit 36.
  • the operation input unit 32 includes a touch panel that is provided so as to overlap the display unit 33 and receives an input of a signal corresponding to the contact position of an object from the outside.
  • the operation input unit 32 may further include a voice input unit such as a microphone, and may receive an input of an instruction signal corresponding to the user's voice input.
  • the display unit 33 displays various information related to the moving device 4 under the control of the third control unit 36.
  • the display unit 33 is configured using a display panel such as liquid crystal or organic EL (Electro Luminescence).
  • the display unit 33 displays an image corresponding to the image data generated by the imaging device 5 and various types of information regarding the imaging device 5 under the control of the third control unit 36.
  • the audio output unit 34 is configured using a speaker or the like that outputs audio, and outputs sound recorded corresponding to data such as a moving image captured by the imaging device 5.
  • FIG. 3 is a diagram schematically showing the external configuration of the operating device 3.
  • the operation device 3 shown in the figure is provided with a display unit 33 and an operation input unit 32 on the surface.
  • the operation input unit 32 includes a left stick 32a, a right stick 32b, and a touch panel 32c arranged on the left and right.
  • FIG. 4 is a diagram schematically showing operation assignment of the operation input unit 32 of the operation device 3 in the normal operation mode.
  • The forward/backward movement of the left stick 32a corresponds to forward and backward movement, respectively.
  • the left and right movements of the left stick 32a correspond to left and right rotations, respectively.
  • the forward / backward movement of the right stick 32b corresponds to ascending and descending, respectively.
  • the right and left movements of the right stick 32b correspond to the left and right dolly, respectively.
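For clarity, the same normal-operation-mode assignment can be written as a lookup table; this is purely illustrative, and the key and command names are hypothetical rather than an API of the operating device 3.

```python
# Illustrative table of the normal-operation-mode stick assignment above;
# (stick, axis, direction) keys and command names are hypothetical.
NORMAL_MODE_ASSIGNMENT = {
    ("left_stick",  "vertical",   "+"): "move_forward",
    ("left_stick",  "vertical",   "-"): "move_backward",
    ("left_stick",  "horizontal", "-"): "rotate_left",
    ("left_stick",  "horizontal", "+"): "rotate_right",
    ("right_stick", "vertical",   "+"): "ascend",
    ("right_stick", "vertical",   "-"): "descend",
    ("right_stick", "horizontal", "-"): "dolly_left",
    ("right_stick", "horizontal", "+"): "dolly_right",
}
```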
  • a stick operation corresponding to the operation may be performed after the user selects a desired operation on the touch panel 32c.
  • the display unit 33 shows the assignment of the stick operation after the selection input on the touch panel 32c. Further, all operations may be performed on the touch panel 32c.
  • the third recording unit 35 records various programs related to the operation device 3.
  • the third recording unit 35 is configured using a flash memory or an SDRAM.
  • the third control unit 36 is configured using a CPU and controls each unit of the controller device 3.
  • The third control unit 36 includes a mode control unit 361 that controls the operation mode of the steered object 2, a third communication control unit 362 that controls communication between the fourth communication unit 31 and the first communication unit 47 of the moving device 4, and a first display control unit 363 that controls the display of the display unit 33.
  • The mode control unit 361 can set a normal operation mode and a lock-on mode as the operation mode of the steered object 2.
  • The normal operation mode is a mode in which the steered object 2 operates based on operation signals whose input is received by the controller device 3.
  • the lock on mode is a mode for tracking a subject to be locked on.
  • In the lock-on mode, tracking processing is performed so that the lock-on target subject stays at the same position in the screen and the distance to the lock-on target subject remains the same.
  • The normal lock-on mode is a mode in which, as shown in FIG. 5, the subject is kept within the angle of view but the orientation of the subject 200 in the screen does not matter.
  • the angle lock-on mode is a mode in which the tracking process is performed so that the orientation of the subject 200 in the screen remains the same as shown in FIG. Therefore, for example, when the subject to be locked on is a person, the face direction of the person may change in the normal lock-on mode, while the face direction of the person does not change in the angle lock-on mode.
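The difference between the two lock-on modes can be summarized as a tracking-goal check, sketched below with hypothetical field names and tolerance values; the patent does not specify concrete thresholds.

```python
# Both modes keep the subject's screen position and distance; angle lock-on
# additionally keeps the subject's orientation. Tolerances are illustrative.
from types import SimpleNamespace

def tracking_satisfied(subject, reference, angle_lock_on: bool) -> bool:
    """True if the tracking goal of the selected lock-on mode is met."""
    ok = (abs(subject.screen_x - reference.screen_x) < 10          # pixels
          and abs(subject.screen_y - reference.screen_y) < 10
          and abs(subject.distance_m - reference.distance_m) < 0.5)
    if angle_lock_on:
        # e.g. a person's face direction must stay the same relative to the camera
        ok = ok and abs(subject.orientation_deg - reference.orientation_deg) < 5
    return ok

ref = SimpleNamespace(screen_x=320, screen_y=240, distance_m=10.0, orientation_deg=0.0)
cur = SimpleNamespace(screen_x=322, screen_y=238, distance_m=10.2, orientation_deg=20.0)
print(tracking_satisfied(cur, ref, angle_lock_on=False))  # True
print(tracking_satisfied(cur, ref, angle_lock_on=True))   # False: orientation changed
```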
  • the user When shifting from the normal operation mode to the lock-on mode, the user first touches the subject 200 to be locked-on on the screen of the display unit 33 as shown in FIG. Thereafter, as shown in FIG. 8, the display unit 33 displays a normal lock-on mode icon 33a, an angle lock-on mode icon 33b, and a return button 33c.
  • the mode control unit 361 of the controller device 3 performs processing according to each icon or button. Specifically, the mode control unit 361 performs a process of setting a lock-on mode corresponding to the touched icon or resetting the normal operation mode with the return button 33c.
  • the composition adjustment mode is a mode in which the operation device 3 accepts input of only an instruction signal for composition adjustment for an image displayed on the display unit 33 and does not accept input of an instruction signal related to movement of the moving device 4.
  • When the composition adjustment mode is set, the user can concentrate on composition adjustment without being distracted by the operation of the moving device 4.
  • FIG. 9 is a diagram showing a display example on the display unit 33 when the composition adjustment mode is set.
  • the display unit 33 displays a return button 33c, a shooting icon 33d, a zoom icon 33e, an ISO icon 33f, an aperture icon 33g, a shutter speed icon 33h, and an exposure correction icon 33i.
  • When the return button 33c is touched, the display unit 33 returns to the display shown in FIG. 8.
  • When the shooting icon 33d is touched, the controller device 3 transmits to the moving device 4 a signal instructing the imaging device 5 to perform shooting.
  • When any one of the zoom icon 33e, the ISO icon 33f, the aperture icon 33g, the shutter speed icon 33h, and the exposure correction icon 33i is touched, the controller device 3 transitions to a state in which it can accept changes to the parameter corresponding to the touched icon.
  • FIG. 10 is a diagram schematically illustrating an example of operation assignment in the operation input unit 32 when the composition adjustment mode is set.
  • The forward/backward movement of the left stick 32a corresponds to an angle change by rotation (pitch) about the lateral axis of the airframe.
  • The left/right movement of the left stick 32a corresponds to an angle change by rotation (roll) about the front-rear axis of the airframe.
  • The forward and backward directions of the right stick 32b correspond to increasing and decreasing, respectively, the parameter selected from among the zoom icon 33e, ISO icon 33f, aperture icon 33g, shutter speed icon 33h, and exposure correction icon 33i.
  • the left direction of the right stick 32b corresponds to the left slide, and the right direction corresponds to the right slide.
  • the zoom up / down operation may be received by pinch out / pinch in on the touch panel 32c.
  • the left and right slide operations (dolly) may be received by left and right slides on the touch panel 32c.
  • The operating device 3 need not have a stick-based configuration.
  • For example, the operation device 3 may be realized by a mobile terminal such as a smartphone.
  • the input of the operation signal by the user may be realized by voice input, or may be realized by changing the direction of the main body of the operation device 3 or shaking the main body of the operation device 3.
  • When the controller device 3 is powered on (step S101: Yes), the mode control unit 361 sets the operation mode to the normal operation mode (step S102).
  • When the controller device 3 is not powered on (step S101: No), it repeats step S101.
  • the third communication control unit 362 establishes communication with the mobile device 4 (step S103).
  • the first display control unit 363 receives the image data from the mobile device 4 and starts displaying the image corresponding to the received image data on the display unit 33 (step S104).
  • the controller device 3 transmits and receives necessary data in addition to the image data after establishing communication with the mobile device 4.
  • First, the case where the operation input unit 32 does not accept an input of a mode change instruction signal (step S105: No) will be described.
  • In this case, when the operation input unit 32 receives an input of an operation signal (step S106: Yes), a normal control signal corresponding to the operation signal is transmitted to the moving device 4 (step S107).
  • the normal control signal here includes a signal for instructing the imaging device 5 to capture a still image or a moving image, a signal for requesting information on the remaining battery level to supply power to the moving device 4, and a flight of the moving device 4. It includes signals to steer.
  • When the operation input unit 32 does not accept an input of an operation signal (step S106: No), the controller device 3 proceeds to step S108.
  • step S108 when the power of the controller device 3 is cut off (step S108: Yes), the controller device 3 ends the process. On the other hand, when the power source of the controller device 3 is not cut off (step S108: No), the controller device 3 returns to step S105 if the operation mode is the normal control mode (step S109: Yes). If the operation mode is not the normal operation mode in step S109 (step S109: No), the controller device 3 returns to step S102.
  • Next, the case where the operation input unit 32 accepts an input of a mode change instruction signal (step S105: Yes) will be described.
  • The change of the operation mode is a change to either the normal lock-on mode or the angle lock-on mode. In either mode, the subject to be locked on must be designated.
  • The designation is realized by the touch panel (a part of the operation input unit 32) provided over the screen of the display unit 33 detecting a touch on the subject to be locked on and receiving an input signal based on the touched position.
  • The mode control unit 361 changes the operation mode setting to the normal lock-on mode or the angle lock-on mode, and transmits to the moving device 4 a mode change signal instructing the change to the normal or angle lock-on mode together with the position information of the subject to be locked on (step S110).
  • When the operation input unit 32 accepts an input of an instruction signal for changing from one of the normal lock-on mode and the angle lock-on mode to the other (step S111: Yes), the mode control unit 361 sets the operation mode to the other mode, and transmits to the moving device 4 a mode change signal instructing the change to the normal or angle lock-on mode together with the position information of the subject to be locked on (step S112).
  • When the operation input unit 32 receives an input of a signal for ending the normal or angle lock-on mode (step S113: Yes), the controller device 3 proceeds to step S108. If the power is not cut off in step S108 (step S108: No) and the operation mode is not the normal operation mode (step S109: No), the controller device 3 returns to step S102, where the mode control unit 361 sets the operation mode to the normal operation mode.
  • Note that the imaging device 5 may be allowed to capture a still image or a moving image in the meantime.
  • When the operation input unit 32 does not accept an input of an instruction signal for changing from one of the normal lock-on mode and the angle lock-on mode to the other (step S111: No), the controller device 3 proceeds to step S113.
  • When an input of a signal for ending the normal or angle lock-on mode is not accepted (step S113: No), the controller device 3 proceeds to step S114.
  • When the operation input unit 32 receives an input of an instruction signal for changing to the composition adjustment mode (step S114: Yes), the mode control unit 361 changes the operation mode setting to the composition adjustment mode (step S115).
  • When the operation input unit 32 receives an input of a zoom operation signal (step S116: Yes), the third communication control unit 362 performs zoom control communication with the moving device 4 (step S117).
  • When the operation input unit 32 does not accept an input of a zoom operation signal (step S116: No), the controller device 3 proceeds to step S118.
  • When the operation input unit 32 receives an input of a slide operation (step S118: Yes), the third communication control unit 362 performs slide control communication with the moving device 4 (step S119). On the other hand, when the operation input unit 32 does not accept an input of a slide operation (step S118: No), the controller device 3 proceeds to step S120.
  • When the operation input unit 32 receives an input of an angle change operation (step S120: Yes), the third communication control unit 362 performs angle change control communication with the moving device 4 (step S121). On the other hand, when the operation input unit 32 does not accept an input of an angle change operation (step S120: No), the controller device 3 proceeds to step S122.
  • When the operation input unit 32 receives an input of an exposure change operation (step S122: Yes), the third communication control unit 362 performs exposure change control communication with the moving device 4 (step S123). On the other hand, when the operation input unit 32 does not accept an input of an exposure change operation (step S122: No), the controller device 3 proceeds to step S124.
  • When the operation input unit 32 receives an input of a focus change operation (step S124: Yes), the third communication control unit 362 performs focus change control communication with the moving device 4 (step S125). On the other hand, when the operation input unit 32 does not accept an input of a focus change operation (step S124: No), the controller device 3 proceeds to step S126.
  • When the operation input unit 32 receives an input of a shooting operation (step S126: Yes), the third communication control unit 362 performs shooting control communication with the moving device 4 (step S127).
  • When the operation input unit 32 does not accept an input of a shooting operation (step S126: No), the controller device 3 proceeds to step S128.
  • When the operation input unit 32 receives an input of a signal for ending the composition adjustment mode (step S128: Yes), the controller device 3 returns to step S108. If the power is not cut off in step S108 (step S108: No) and the operation mode is not the normal operation mode (step S109: No), the controller device 3 returns to step S102, where the mode control unit 361 sets the operation mode to the normal operation mode.
  • When the operation input unit 32 does not accept an input of the signal for ending the composition adjustment mode (step S128: No), the controller device 3 returns to step S116.
  • When the operation input unit 32 does not accept an input of the instruction signal for changing to the composition adjustment mode (step S114: No), the controller device 3 returns to step S111.
  • The moving device 4 determines whether it can move in accordance with a change in the relative relationship between a previously designated imaging target and the imaging device 5, reads the determination result from the first recording unit 46, and performs control according to the determination result. Detailed processing of the moving device 4, including this movement control processing, will be described with reference to the flowcharts shown in FIGS. 12A and 12B.
  • When the moving device 4 is powered on (step S201: Yes), the first communication control unit 496 establishes communication with the imaging device 5 and the operation device 3 (step S202). If the moving device 4 is not powered on (step S201: No), the moving device 4 repeats step S201.
  • the first communication control unit 496 starts receiving image data from the imaging device 5 and transmitting image data to the operation device 3 (step S203). Thereafter, the moving device 4 receives the image data from the imaging device 5 at a predetermined interval, and transmits the image data to the operation device 3. The mobile device 4 transmits / receives necessary data in addition to the image data after establishing communication with the imaging device 5 and the operation device 3.
  • When the moving device 4 receives a normal control signal from the controller device 3 (step S204: Yes), the first control unit 49 performs processing according to the normal control signal (step S205).
  • The processing according to the normal control signal here refers to, for example, transmitting a still image shooting start instruction signal or a moving image shooting start or end instruction signal to the imaging device 5, handling the image data received from the imaging device 5, and confirming the remaining battery level of the moving device 4. Thereafter, the moving device 4 proceeds to step S206.
  • When the moving device 4 does not receive a normal control signal (step S204: No), the moving device 4 proceeds to step S206.
  • When the moving device 4 receives a mode change signal to the normal or angle lock-on mode together with the position information of the subject to be locked on (step S206: Yes), the first control unit 49 stores the lock-on reference position in the first recording unit 46 (step S207), and transmits a signal instructing tracking of the lock-on target subject and the position information of the lock-on target subject to the imaging device 5 (step S208).
  • When the moving device 4 does not receive the mode change signal to the normal or angle lock-on mode and the position information of the subject to be locked on (step S206: No), the moving device 4 proceeds to step S224 described later.
  • FIG. 13 is a flowchart showing an outline of subject tracking control processing performed by the moving device 4.
  • The moving device 4 acquires the distance from the imaging device 5 to the lock-on target subject calculated by the imaging device 5, the amount of deviation of the subject from the reference position, and, when there is an obstacle, the distance to the obstacle (step S301).
  • The "reference position" here is the center of the screen in the initial setting; when a slide operation has been performed, it is the position to which the center has been slid.
  • The "deviation amount" here is a three-dimensional deviation amount. The detection of the distances to the subject and to obstacles by the imaging device 5 and the calculation of the deviation amount will be described in detail together with the processing of the imaging device 5.
  • The movement determination unit 492 calculates the movement distance of the moving device 4 needed to maintain the position and size of the subject based on the acquired deviation amount (step S302).
  • For example, the movement determination unit 492 calculates the movement distance ΔL in the direction orthogonal to the screen from the distance L to the subject 200, the size of the subject image before the subject 200 moves, and the size of the subject image after the subject 200 moves.
  • The movement distance ΔL takes a positive value in the direction approaching the subject 200. In other words, when the subject 200 moves and becomes smaller on the screen, the movement distance is positive, and when the subject 200 moves and becomes larger on the screen, the movement distance is negative.
  • The movement distance in the direction parallel to the screen may be calculated in the same way.
  • When there is an obstacle on the movement route (step S303: Yes) and the movement distance of the moving device 4 is less than the distance to the obstacle (step S304: Yes), the movement determination unit 492 determines that the moving device 4 can move (step S305). The movement determination unit 492 also determines that the moving device 4 can move when there is no obstacle on the movement route (step S303: No) (step S305).
  • In this case, the propulsion control unit 494 performs control to follow the movement of the subject according to the movement distance calculated in step S302 so that the imaging device 5 can continue to capture the subject within the angle of view (step S306). Thereafter, the moving device 4 proceeds to step S308.
  • On the other hand, if the movement distance of the moving device 4 is equal to or greater than the distance to the obstacle (step S304: No), the movement determination unit 492 determines that the moving device 4 cannot move (step S307). Thereafter, the moving device 4 proceeds to step S308. (A sketch of this determination follows.)
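A compact sketch of steps S302 through S307 under a pinhole-camera assumption: the movement distance that restores the subject's on-screen size follows from the size ratio between frames, and the move is allowed only if it stays short of any obstacle on the route. All names and values are illustrative, not the patent's exact computation.

```python
# dL = L * (size_before / size_after - 1): positive (approach) when the
# subject shrank on screen, negative (retreat) when it grew, matching the
# sign convention described above.

def movement_distance(L: float, size_before: float, size_after: float) -> float:
    """Movement distance dL (m) that restores the subject's on-screen size."""
    return L * (size_before / size_after - 1.0)

def decide_move(dL: float, obstacle_distance: float | None) -> bool:
    """S303-S305/S307: movable unless an obstacle is closer than the move."""
    if obstacle_distance is None:        # no obstacle on the route (S303: No)
        return True
    return abs(dL) < obstacle_distance   # S304

# Subject at 10 m shrank from 200 px to 160 px: approach by 2.5 m if clear.
dL = movement_distance(10.0, 200.0, 160.0)          # 2.5
print(dL, decide_move(dL, obstacle_distance=4.0))   # 2.5 True
```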
  • When the moving device 4 is set to the angle lock-on mode (step S308: Yes), the first control unit 49 performs shooting direction tracking control (step S309).
  • FIG. 15 is a flowchart illustrating an outline of processing of imaging direction tracking control performed by the moving device 4.
  • First, the imaging device 5 sends information on a change in the orientation of the subject; the processing of the imaging device 5 at this time will be described later. When the moving device 4 acquires this information (step S401: Yes), the movement determination unit 492 calculates the angle through which the moving device 4 should be rotated (step S402) and determines, based on the information, whether or not the moving device 4 can move accordingly (step S403).
  • For example, the movement determination unit 492 employs a model that approximates the subject's face as a sphere, and determines whether or not the moving device 4 can be rotated to the reference angle while keeping the distance from the rotation center constant, with the center of the sphere as the rotation center.
  • When movement is possible (step S403: Yes), the direction control unit 495 starts changing the imaging direction (step S404). For example, when the model that approximates the face as a sphere is adopted, the direction control unit 495 rotates the moving device 4 by the angle calculated in step S402 while keeping the distance from the rotation center constant, with the center of the sphere as the rotation center.
  • When the posture of the imaging device 5 with respect to the moving device 4 can be changed, the moving device 4 may move while maintaining its own posture and change the imaging direction by changing the posture of the imaging device 5 with respect to the moving device 4. In this case, the movement of the moving device 4 and the posture change of the imaging device 5 may be performed simultaneously, or the posture of the imaging device 5 may be changed after the moving device 4 has moved.
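  • For the sphere model, the rotation about the face center at a constant radius can be computed as below. This is a minimal sketch that assumes rotation in a horizontal plane and two-dimensional coordinates; all names are illustrative.

```python
import math

def orbit_position(center_xy, radius, current_angle_rad, delta_angle_rad):
    """Move along a circle around the rotation center (the center of the
    sphere approximating the face), keeping the distance constant as in
    step S404, and keep the camera heading toward that center."""
    angle = current_angle_rad + delta_angle_rad
    x = center_xy[0] + radius * math.cos(angle)
    y = center_xy[1] + radius * math.sin(angle)
    # Heading points from the new device position back to the rotation center.
    heading = math.atan2(center_xy[1] - y, center_xy[0] - x)
    return (x, y), heading
```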
  • After step S404, when lost information for the subject is received from the imaging device 5 while the imaging direction of the moving device 4 is being changed (step S405: Yes), the direction control unit 495 performs control to cancel the rotation operation of the moving device 4 (step S406), and the lost information is transmitted to the controller device 3 (step S407).
  • “when the subject is lost” includes a case where a part of the subject disappears from the image, in other words, “when the subject is likely to be lost”.
  • After step S407, the moving device 4 ends the imaging direction tracking control and returns to the main routine.
  • When lost information for the subject is not received from the imaging device 5 while the imaging direction of the moving device 4 is being changed (step S405: No) and the change of the imaging direction is completed (step S408: Yes), the moving device 4 ends the imaging direction tracking control and returns to the main routine.
  • When the change of the imaging direction is not completed in step S408 (step S408: No), the moving device 4 returns to step S405.
  • When the moving device 4 does not acquire information on the change in the orientation of the lock-on subject from the imaging device 5 (step S401: No), and when the movement determination unit 492 determines in step S403 that movement is not possible (step S403: No), the moving device 4 ends the imaging direction tracking control and returns to the main routine.
  • When the moving device 4 is not set to the angle lock-on mode in step S308 (step S308: No), the moving device 4 ends the subject tracking control and returns to the main routine.
  • the processing after step S209 described above will be described with reference to FIG. 12B.
  • the first control unit 49 determines whether or not a control signal for instructing the change to the composition adjustment mode has been received (step S210). First, the case where the composition adjustment control signal is received (step S210: Yes) will be described. In this case, when a zoom operation signal is received (step S211: Yes), the first control unit 49 performs zoom control (step S212).
  • FIG. 16 is a flowchart showing an overview of zoom control processing performed by the moving device 4.
  • First, the power source determination unit 491 determines that the remaining battery level is equal to or greater than a predetermined value (for example, 50% of the full charge value) before the subsequent processing is performed.
  • When a movement request is received from the imaging device 5 (step S502: Yes), the movement determination unit 492 determines whether or not movement is possible (step S503).
  • the movement request is sent from the imaging device 5 when the imaging device 5 determines that optical zoom cannot be performed.
  • In step S503, the movement determination unit 492 determines whether or not the movement corresponding to the zoom-up or zoom-down indicated by the zoom operation signal received from the controller device 3 is possible. Specifically, as illustrated in FIG. 17, the movement determination unit 492 calculates the movement distance ΔL′ in the direction orthogonal to the screen as ΔL′ = (1 − (c/d)) × L, where L is the distance to the subject, c is the size of the subject image before the imaging device 5 moves, and d is the size of the subject image after the imaging device 5 moves.
  • The determination of whether movement is possible based on the calculated movement distance ΔL′ is the same as in steps S303 to S307 described with reference to FIG. 13.
  • the movement distance in the direction parallel to the screen may be calculated in the same manner.
  • When the movement determination unit 492 determines in step S503 that movement is possible (step S504: Yes), the propulsion control unit 494 drives the propulsion unit 41 to control the movement of the moving device 4 (step S505).
  • In this case, when there is an obstacle, the propulsion control unit 494 performs control to move the moving device 4 to a predetermined position short of the obstacle.
  • When lost information for the subject is received from the imaging device 5 while the moving device 4 is moving (step S506: Yes), the propulsion control unit 494 performs control to stop the movement of the moving device 4 (step S507), and the lost information is transmitted to the controller device 3 (step S508).
  • “when the subject is lost” also includes a case where a part of the subject disappears from the image, that is, “when the subject is likely to be lost”.
  • After step S508, the moving device 4 ends the zoom control and returns to the main routine.
  • If lost information for the subject is not received from the imaging device 5 while the moving device 4 is moving (step S506: No) and the movement is completed (step S509: Yes), the first control unit 49 transmits information on the end of the movement to the imaging device 5 (step S510). Thereafter, the moving device 4 ends the zoom control and returns to the main routine. When the movement is not completed in step S509 (step S509: No), the moving device 4 returns to step S506.
  • If the movement request is not received from the imaging device 5 in step S502 (step S502: No), or if the determination result in step S504 is that movement is not possible (step S504: No), the moving device 4 returns to the main routine.
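  • Putting the branches of FIG. 16 together, the moving-device side of zoom control can be outlined as follows. A non-authoritative sketch: the 50% battery threshold comes from the example above, the callables and return values are assumptions, and the lost-subject handling of steps S506 to S508 is omitted for brevity.

```python
def zoom_control(battery_ratio, movement_request, can_move, move, notify_end,
                 battery_min=0.5):
    """Outline of steps S501-S510: check the battery, accept the movement
    request sent when optical zoom is impossible, verify movability, then
    move and report the end of movement to the imaging device."""
    if battery_ratio < battery_min:      # battery check (50% example value)
        return "abort: low battery"
    if movement_request is None:         # S502: No
        return "no request"
    if not can_move(movement_request):   # S503 / S504: No
        return "cannot move"
    move(movement_request)               # S505
    notify_end()                         # S510
    return "moved"
```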
  • Next, the case where the zoom operation signal is not received in step S211 (step S211: No) will be described.
  • When the first control unit 49 receives a slide operation signal (step S213: Yes), it performs slide control (step S214).
  • FIG. 18 is a flowchart showing an outline of the slide control process performed by the moving device 4.
  • First, the movement determination unit 492 determines whether or not the moving device 4 can perform a slide operation (step S601). When the slide operation is possible (step S601: Yes), the propulsion control unit 494 starts the slide operation (step S602).
  • the first control unit 49 also starts transmitting information regarding the slide operation to the imaging device 5.
  • the “information relating to the slide operation” here includes, for example, information relating to the amount and direction of the slide.
  • After step S602, when the moving device 4 receives lost information for the subject from the imaging device 5 during the slide operation (step S603: Yes), the propulsion control unit 494 performs control to stop the slide operation of the moving device 4 (step S604), and the lost information is transmitted to the controller device 3 (step S605).
  • Here, “when the subject is lost” includes a case where a part of the subject disappears from the image, that is, “when the subject is likely to be lost”.
  • After step S605, the moving device 4 returns to the main routine.
  • When lost information is not received (step S603: No) and the slide operation of the moving device 4 is completed (step S606: Yes), the first control unit 49 stores the reference position changed by the slide operation in the first recording unit 46 (step S607) and transmits the reference position to the imaging device 5 (step S608). Thereafter, the moving device 4 ends the slide control and returns to the main routine.
  • When the slide operation is not completed (step S606: No), the moving device 4 returns to step S603.
  • When the movement determination unit 492 determines that the slide operation of the moving device 4 is not possible (step S601: No), the first control unit 49 transmits a signal requesting image trimming to the imaging device 5 (step S609). At this time, the first control unit 49 transmits information regarding the slide amount to the imaging device 5 together with the request signal.
  • Thereafter, the moving device 4 receives trimmed image data or error information from the imaging device 5 (step S610).
  • the error information is information sent from the imaging device 5 when the imaging device 5 determines that trimming is impossible.
  • the moving device 4 transmits the trimmed image data or error information received from the imaging device 5 to the controller device 3 (step S611). Thereafter, the moving device 4 ends the slide control and returns to the main routine.
  • When the first control unit 49 does not receive the slide operation signal in step S213 (step S213: No) and receives an angle change operation signal (step S215: Yes), the direction control unit 495 performs angle change control (step S216).
  • In the angle change control, the direction control unit 495 determines whether or not the process can be executed and, when it can be executed, carries out the movement of the moving device 4 and the change of the tilt angle (posture). Instead of executing the movement of the moving device 4 and the angle change simultaneously, the angle may be changed after the moving device 4 has moved.
  • Next, the case where the first control unit 49 does not receive the angle change operation signal (step S215: No) will be described.
  • When the first control unit 49 receives a focus change operation signal (step S217: Yes), it transmits a control signal instructing the focus change to the imaging device 5 (step S218).
  • Next, the case where the first control unit 49 does not receive the focus change operation signal in step S217 (step S217: No) will be described.
  • When the first control unit 49 receives an exposure change operation signal (step S219: Yes), it transmits a control signal instructing the exposure change to the imaging device 5 (step S220).
  • Next, the case where the first control unit 49 does not receive an exposure change operation signal (step S219: No) will be described.
  • When the first control unit 49 receives a shooting operation signal (step S221: Yes), it transmits a control signal instructing shooting to the imaging device 5 (step S222).
  • When the first control unit 49 does not receive the shooting operation signal (step S221: No), the moving device 4 proceeds to step S223.
  • When the first control unit 49 receives an end signal for the composition adjustment mode from the controller device 3 (step S223: Yes) and the moving device 4 is then powered off (step S224: Yes), the moving device 4 ends the process. If the moving device 4 is not powered off in step S224 (step S224: No), the moving device 4 returns to step S204.
  • When the first control unit 49 does not receive the composition adjustment mode end signal (step S223: No), the moving device 4 returns to step S209.
  • When the control signal instructing the change to the composition adjustment mode is not received in step S210 (step S210: No) and the operation mode is already the composition adjustment mode (step S225: Yes), the moving device 4 proceeds to step S211.
  • When the operation mode is not the composition adjustment mode in step S225 (step S225: No) and the first control unit 49 receives the lock-on mode end signal from the controller device 3 (step S226: Yes), the moving device 4 transmits a lock-on mode end signal to the imaging device 5 (step S227) and proceeds to step S224. On the other hand, when the first control unit 49 does not receive the lock-on mode end signal from the controller device 3 (step S226: No), the moving device 4 returns to step S209.
  • The processes of steps S211 to S222, which correspond to the composition adjustment mode, can be performed in parallel.
  • The imaging device 5 detects information necessary for the moving device 4 to determine whether it can move in accordance with a change in the relative relationship between the object to be imaged, which is designated in advance, and the imaging device 5; the detected information is read from the second recording unit 55 and transmitted to the moving device 4 to assist the processing of the moving device 4. Detailed processing of the imaging device 5, including such movement assistance processing, will be described with reference to the flowcharts shown in FIGS. 19A and 19B.
  • When the power of the imaging device 5 is turned on (step S701: Yes), the imaging control unit 567 performs control to start imaging (step S702). If the power of the imaging device 5 is not turned on in step S701 (step S701: No), the imaging device 5 repeats step S701.
  • After step S702, the second communication control unit 566 establishes communication with the moving device 4 (step S703). Subsequently, the second communication control unit 566 starts transmitting the image data generated by the image processing unit 561 to the moving device 4 (step S704).
  • When a normal control signal is received from the moving device 4 (step S705: Yes), the second control unit 56 performs normal control according to the signal (step S706).
  • the normal control here includes still image or moving image shooting, transmission of information related to the remaining battery level of the imaging device 5 to the moving device 4, and the like.
  • When the normal control signal is not received from the moving device 4 (step S705: No), the imaging device 5 proceeds to step S707, described later.
  • When the signal instructing tracking of the lock-on target subject and the position information of the lock-on target subject are received from the moving device 4 (step S707: Yes), the subject detection unit 562 starts detection of the lock-on target subject (step S708). The subject detection unit 562 detects the subject based on, for example, the color, shape, and pattern of the subject.
  • When the signal instructing tracking of the lock-on target and the position information of the lock-on target are not received from the moving device 4 (step S707: No), the imaging device 5 returns to step S705.
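  • As one concrete reading of detection "based on the color, shape, and pattern" of the subject, a simple color-mask detector is sketched below. The embodiment does not fix an algorithm, so the RGB thresholding and the numpy image layout are assumptions made here.

```python
import numpy as np

def detect_by_color(frame_rgb: np.ndarray, target_rgb, tol=40):
    """Return the centroid (x, y) of pixels whose color is close to the
    target color, or None when nothing matches (a stand-in for the
    'subject lost' case)."""
    # L1 color distance per pixel against the target color.
    diff = np.abs(frame_rgb.astype(int) - np.array(target_rgb)).sum(axis=2)
    ys, xs = np.nonzero(diff < tol)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())
```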
  • Next, a case will be described in which the subject detection unit 562 starts subject detection in step S708 and then loses the lock-on target subject (step S709: Yes).
  • In this case, the second control unit 56 transmits lost information to the moving device 4 (step S710).
  • “when the subject is lost” includes a case where a part of the subject disappears from the image, in other words, “when the subject is likely to be lost”.
  • When a lock-on mode end signal is received from the moving device 4 (step S711: Yes), the subject detection unit 562 ends detection of the lock-on target subject (step S712).
  • When the lock-on mode end signal is not received from the moving device 4 (step S711: No), the imaging device 5 returns to step S709.
  • After step S712, when the power of the imaging device 5 is turned off (step S713: Yes), the imaging device 5 ends the process. If the power of the imaging device 5 is not turned off in step S713 (step S713: No), the imaging device 5 returns to step S705.
  • Next, the processing from step S714 onward, performed when the lock-on target subject has not been lost in step S709 (step S709: No), will be described.
  • When the operation mode is set to the normal lock-on mode (step S714: Yes), the distance calculation unit 564 calculates the distance to the lock-on target subject, and a three-dimensional deviation amount from the reference position is calculated by obtaining the movement in the two-dimensional plane using a motion vector or the like (step S715).
  • the imaging device 5 transmits the calculation result of step S715 to the moving device 4 (step S716).
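  • Step S715 combines the in-plane shift obtained from motion vectors with the depth change from the distance calculation. A minimal sketch follows; converting pixels to metres through a single scale factor is an assumed pinhole-camera simplification.

```python
def deviation_3d(shift_px, distance_now_m, distance_ref_m, metres_per_px):
    """Three-dimensional deviation from the reference position:
    (dx, dy) from the motion vector in the image plane, dz from the
    change of the measured distance to the subject."""
    dx = shift_px[0] * metres_per_px
    dy = shift_px[1] * metres_per_px
    dz = distance_now_m - distance_ref_m
    return dx, dy, dz
```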
  • When the operation mode is set to the angle lock-on mode (step S714: No), the subject detection unit 562 detects the orientation of the subject (step S717). Thereafter, the imaging device 5 proceeds to step S715. In this case, the imaging device 5 also transmits the orientation of the subject to the moving device 4 in step S716.
  • Next, when the subject detection unit 562 detects an obstacle between the imaging device 5 and the subject (step S718: Yes), the distance calculation unit 564 calculates the distance to the obstacle (step S719). Thereafter, the imaging device 5 transmits the distance to the obstacle to the moving device 4 (step S720). After step S720, and when the subject detection unit 562 does not detect an obstacle between the imaging device 5 and the subject in step S718 (step S718: No), the imaging device 5 proceeds to step S721.
  • When the imaging device 5 receives a signal instructing the change to the composition adjustment mode (step S721: Yes), it proceeds to step S722. On the other hand, when the imaging device 5 does not receive such a signal (step S721: No), it proceeds to step S711.
  • When a zoom control signal is received from the moving device 4 (step S722: Yes), the imaging control unit 567 performs zoom control (step S723). When a zoom control signal is not received from the moving device 4 in step S722 (step S722: No), the imaging device 5 proceeds to step S724, described later.
  • In the zoom control, the imaging control unit 567 refers to the second recording unit 55 and calculates the change in the center of gravity of the steered object 2 based on the zoom instruction amount (step S801).
  • Next, the imaging control unit 567 determines in step S802 whether or not the calculated change in the center of gravity is within an allowable range; when it is, the imaging control unit 567 determines whether or not optical zoom is possible (step S803).
  • the determination in step S802 may be performed using an amount other than the change in the center of gravity.
  • For example, a threshold may be set for the amount of movement of the mechanism that provides the buoyancy of the steered object 2, for a value converted from the output of a gyro sensor or the like used for balancing the steered object 2, or for a value converted from a change in the position of a weight in the optical axis direction, and step S803 may be performed when the value is equal to or less than the threshold. Further, a threshold (for example, 10%) may be set for the ratio of the center-of-gravity movement amount of the steered object 2 in the optical axis direction of the imaging device 5 to the entire length of the steered object 2, and step S803 may be performed when the ratio is equal to or less than this threshold.
  • The imaging control unit 567 may also determine whether or not there is a change in the center of gravity using a plurality of parameters. If optical zoom is possible (step S803: Yes), the imaging control unit 567 performs optical zoom control on the optical system 511 (step S804). Whether or not optical zoom is possible is determined based on conditions such as the type and nature of the lens and the required image quality. At the time of optical zoom, a signal indicating that optical zoom will be performed may be transmitted to the moving device 4 before the zoom control, and the moving device 4 may perform a stabilization operation in preparation for unexpected state fluctuations. Such processing of the imaging device 5 may be performed during the processing of step S802 or S803, or may be performed as a separate process.
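  • The 10% example given above translates directly into a guard of the following form; this is a sketch only, and reducing the centre-of-gravity shift to a single length along the optical axis is an assumption.

```python
def optical_zoom_allowed(cg_shift_m: float, body_length_m: float,
                         max_ratio: float = 0.10) -> bool:
    """Step S802 with the example threshold: proceed to the optical zoom
    feasibility check (step S803) only while the centre-of-gravity
    movement along the optical axis stays within 10% of the overall
    length of the steered object."""
    return abs(cg_shift_m) / body_length_m <= max_ratio
```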
  • A determination based on the influence of vibration may also be performed, or such a determination may be combined with the determination described above to make a comprehensive determination. For example, when the wind is strong and it is difficult to approach the object because of the wind, this may be used together with the determination to perform zooming instead.
  • When the imaging control unit 567 determines, based on information from the moving device 4, that the movement to the requested zoom position has been completed (step S805: Yes), the imaging control unit 567 ends the zoom control.
  • When the imaging control unit 567 determines that the movement to the requested zoom position has not been completed (step S805: No), it performs electronic zoom to the requested zoom position (step S806). Thereafter, the imaging control unit 567 ends the zoom control.
  • If optical zoom is not possible in step S803 (step S803: No), the imaging device 5 proceeds to step S806.
  • If the change in the center of gravity is not within the allowable range in step S802 (step S802: No), the imaging control unit 567 compares the amount of motion of the subject between frames with a predetermined threshold (step S807). If the amount of motion of the subject is equal to or greater than the threshold (the motion is large) (step S807: Yes), processing according to the magnitude relationship between the enlargement speed obtained by zooming and the enlargement speed obtained by approaching the subject is performed.
  • When the enlargement speed obtained by zooming on the subject image is larger than the enlargement speed obtained by approaching (step S808: Yes), and the imaging device 5 can execute optical zoom or electronic zoom (step S809: Yes), the process proceeds to step S804 and optical zoom control is performed.
  • Optical zoom or the like is performed when it is determined that the movement of the center of gravity of the steered object 2 is large, because in that case the steered object 2 might lose its posture and be unable to move safely.
  • For this reason, the imaging device 5 may comprehensively determine whether or not optical zoom or the like can be performed from the surrounding environment at the time of the determination, the stability of the steered object 2, and the performance information of the moving device 4.
  • A determination based on the influence of wind or of vibration of the steered object 2 may also be performed, or it may be combined with the determination described above to make a comprehensive determination. For example, when the wind is strong and it is difficult to approach the object because of the wind, this may be used together with the determination to perform zooming. Further, when the influence of the vibration of the steered object 2 is large, it may be more advantageous for blur and framing to approach the object rather than zooming toward the telephoto side, and step S809 may be performed taking this into account.
  • A determination based on the influence of wind and the influence of vibration of the steered object 2 may also be added to the processing of step S807.
  • If the amount of motion of the subject is smaller than the threshold in step S807 (step S807: No), the imaging device 5 assumes that the subject will not escape, and the process proceeds to step S810, described later.
  • The amount of motion of the subject may be determined based on a change in the distance to the subject, a change in the size of the subject image, or the like, particularly for the optical axis direction of the photographing lens, taking the focus state into account. Here, in order to simplify the description and make the features of the invention easier to understand, the outline of the process has been described using several branches.
  • When the enlargement speed obtained by zooming on the subject image is larger than that obtained by approaching (step S808: Yes) but the imaging device 5 cannot execute optical zoom or electronic zoom (step S809: No), and when the enlargement speed obtained by zooming is not larger than that obtained by approaching (step S808: No), the subject detection unit 562 detects an obstacle in the image (step S810), and the obstacle detection result and a movement request are transmitted to the moving device 4 (step S811).
  • This movement request is a request to move to a position short of the obstacle.
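  • The branches of steps S807 to S811 condense into one decision; the sketch below assumes that the speed comparison and the zoom availability are computed beforehand.

```python
def zoom_or_move(motion_large: bool, zoom_faster_than_approach: bool,
                 zoom_available: bool) -> str:
    """Outline of steps S807-S811: a fast-moving subject with usable zoom
    leads to zooming (S808: Yes, S809: Yes -> S804); every other branch
    asks the moving device to approach, stopping short of any obstacle
    (S810, S811)."""
    if motion_large and zoom_faster_than_approach and zoom_available:
        return "optical_or_electronic_zoom"
    return "request_movement_before_obstacle"
```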
  • Here too, a determination based on the influence of wind or of vibration of the steered object 2 may be performed, or such a determination may be combined with the above-described determination to make a comprehensive determination.
  • Although the movement of the subject can be detected from the temporal change of the image, the movement may also be predicted from an image or the like even when the subject is not actually moving, and such a prediction result may be used. The determination can also be made by machine learning of image changes, learning what kinds of objects move and how they move afterwards.
  • Although the features of the imaging device 5 are described here, there are also features of the moving device 4 that communicates with the imaging device 5, and of the steered object 2 that includes the imaging device 5 and the moving device 4.
  • For the steered object 2, which includes the moving device 4 that can communicate with the imaging device 5 and can hold the imaging device 5 and move together with it, to respond to a change in the relative relationship between the object to be imaged designated in advance and the imaging device 5, information is required such as changes in the center of gravity of the imaging device 5 and the like, movements such as the agility of the object, predictions of those changes and movements, the performance of the steered object 2, and the environment around the steered object 2.
  • The steered object control unit 21 of the steered object 2 detects such information, uses the detected information to determine whether or not movement is possible in accordance with a change in the relative relationship between the object to be imaged and the imaging device 5, and performs control according to the determination result; this is one of the features of the first embodiment.
  • When the subject detection unit 562 loses the subject during the movement (step S812: Yes), the imaging device 5 transmits lost information to the moving device 4 (step S813). Thereafter, the imaging device 5 ends the zoom control and returns to the main routine.
  • If the subject detection unit 562 does not lose the subject in step S812 (step S812: No), the imaging device 5 ends the zoom control and returns to the main routine when it receives a movement end notification from the moving device 4 (step S814: Yes).
  • When the imaging device 5 does not receive the movement end notification from the moving device 4 (step S814: No), it returns to step S812.
  • When slide information is received from the moving device 4 (step S724: Yes), the tracking processing unit 563 performs lost determination (step S725).
  • When the subject is not lost (step S725: No) and the reference position after the slide is received (step S726: Yes), the second control unit 56 stores information on the reference position in the second recording unit 55 (step S727).
  • When the subject is lost in step S725 (step S725: Yes), the second control unit 56 transmits lost information to the moving device 4 (step S728).
  • When an exposure change operation signal is received from the moving device 4 (step S731: Yes), the imaging control unit 567 performs a process of changing the exposure (step S732). After step S732, and when no exposure change operation signal is received from the moving device 4 in step S731 (step S731: No), the imaging device 5 proceeds to step S733.
  • When a shooting operation signal is received from the moving device 4 (step S733: Yes), the imaging control unit 567 performs shooting according to the shooting operation signal (step S734). After step S734, and when no shooting operation signal is received from the moving device 4 in step S733 (step S733: No), the imaging device 5 proceeds to step S711.
  • Next, the case where slide information is not received from the moving device 4 in step S724 (step S724: No) will be described.
  • When a trimming request is received (step S735: Yes), the trimming unit 565 determines whether trimming is possible (step S736). This determination is based on the presence or absence of a margin area that can supply the image of the portion that was not displayed in the captured image when the reference position of the lock-on target subject was slid.
  • When trimming is possible (step S736: Yes), the trimming unit 565 performs trimming (step S737) and transmits the trimmed image data to the moving device 4 (step S738).
  • If the result of the determination is that trimming cannot be performed (step S736: No), the second control unit 56 transmits error information to the moving device 4 (step S739). After step S738 or S739, the imaging device 5 proceeds to step S729.
  • When the trimming request is not received (step S735: No), the imaging device 5 proceeds to step S729.
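  • The margin test of step S736 and the cut of step S737 can be sketched as below; expressing the slide amount in pixels and using a numpy image are assumptions introduced for the example.

```python
import numpy as np

def trim_after_slide(full_frame: np.ndarray, view_x: int, view_y: int,
                     view_w: int, view_h: int, slide_dx: int, slide_dy: int):
    """Outline of steps S736-S739: trimming is possible only if the full
    sensor image still contains the display window shifted by the slide
    amount, i.e. a margin exists (S736: Yes -> S737); otherwise None is
    returned, corresponding to the error report (S739)."""
    h, w = full_frame.shape[:2]
    x, y = view_x + slide_dx, view_y + slide_dy
    if x < 0 or y < 0 or x + view_w > w or y + view_h > h:  # S736: No
        return None
    return full_frame[y:y + view_h, x:x + view_w]           # S737
```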
  • In the above description, the imaging device 5 does not change its angle with respect to the moving device 4, but the imaging device 5 may be configured so that its angle with respect to the moving device 4 can be changed.
  • In this case, an angle change instruction signal may be received from the moving device 4, and control for changing the angle with respect to the moving device 4 may be performed.
  • As described above, according to the first embodiment, the moving device determines whether or not it can move in accordance with a change in the relative relationship between the object to be imaged designated in advance and the imaging device, and performs control according to the result, while the imaging device detects the information necessary for the moving device to make this determination. Therefore, appropriate processing can be performed even in a situation where the movement route at the time of shooting is not fixed.
  • Further, when the composition adjustment mode is set, the user can concentrate on composition adjustment by leaving the tracking operation to the imaging system. Therefore, even in an environment where the subject is tracked, the user can concentrate on shooting without being distracted by the processing required for tracking.
  • In the second embodiment, the steered object is an endoscope.
  • The moving device advances and retracts at the distal end of the endoscope and holds the imaging device inside.
  • The moving device is wirelessly connected so as to be communicable with the operation device, is driven according to instructions from the operation device, and receives instruction signals intended for the imaging device and transmits them to the imaging device.
  • The steered object has a lock-on function; when the steered object (endoscope) is inserted into the subject and an examination is performed, the object to be photographed may move relative to the imaging device. In this case, the object to be imaged is tracked while it is sequentially determined whether or not the movement can be followed. When performing this tracking, the moving device approaches or separates from the object as necessary so that the size of the object to be imaged does not change.
  • FIG. 21 is a diagram showing a schematic configuration of an imaging system according to Embodiment 2 of the present invention.
  • FIG. 22 is a block diagram illustrating a functional configuration of the imaging system according to the second embodiment.
  • An imaging system 1A shown in FIGS. 21 and 22 includes an endoscope 2A that is inserted into the body of a subject to be imaged and images the inside of the subject, an operation device 3A that receives operation instruction signals for the endoscope 2A, a processor 6A that is communicably connected to the endoscope 2A and controls the imaging system 1A in an integrated manner, and a display device 7A that displays an image captured by the endoscope 2A.
  • The endoscope 2A includes an insertion portion 21A that has an elongated flexible shape and is inserted into the body cavity of the subject, an operation portion 22A that is connected to the proximal end side of the insertion portion 21A and receives operation signals, and a universal cord 23A that extends from the operation portion 22A in a direction different from the direction in which the insertion portion 21A extends and incorporates various cables connected to the processor 6A.
  • the insertion portion 21A includes a distal end portion 24A that houses the moving device 4A and the imaging device 5A, and a long flexible tube portion 25A that is connected to the proximal end side of the distal end portion 24A and has flexibility.
  • the distal end portion 24A is configured by using any one of an electrostatic actuator, a conductive polymer actuator, an ultrasonic motor, and the like, and has a flexible structure capable of bending, crank bending, and the like.
  • The moving device 4A and the imaging device 5A have the same configurations as the moving device 4 and the imaging device 5 described in the first embodiment. For this reason, the endoscope 2A has a function as the steered object 2.
  • Hereinafter, components of the moving device 4A and the imaging device 5A that correspond to components of the moving device 4 and the imaging device 5 are denoted by adding "A" to the reference signs of the corresponding components.
  • For example, the reference sign of the propulsion unit of the moving device 4A is "41A".
  • The moving device 4A has a cylindrical shape, is attached to the insertion portion 21A so as to be able to advance and retract from the distal end of the insertion portion 21A, and holds the imaging device 5A inside the cylinder.
  • the propulsion unit 41A is configured using an actuator that advances and retracts the moving device 4A at the distal end of the insertion unit 21A in response to an operation instruction signal sent from the operation device 3A.
  • The processor 6A includes an image processing unit 61A that acquires image data captured by the endoscope 2A and performs image processing, a light source unit 62A that generates illumination light to irradiate the subject from the distal end of the insertion portion 21A of the endoscope 2A, and a control unit 63A that controls the entire imaging system 1A including the processor 6A itself.
  • the image processing unit 61A and the control unit 63A are configured using one or more of, for example, a CPU, an FPGA (Field Programmable Gate Array), an ASIC (Application Specific Integrated Circuit), and the like.
  • The light source unit 62A outputs illumination light that irradiates the subject from the distal end of the insertion portion 21A of the endoscope 2A.
  • the light source unit 62A is configured using, for example, an LED (Light Emitting Diode), a laser light source, a xenon lamp, a halogen lamp, or the like.
  • the processor 6A controls the entire imaging system 1A.
  • the display device 7A displays an image corresponding to the imaging signal subjected to image processing by the processor 6A.
  • the display device 7A displays various information related to the imaging system 1A.
  • the display device 7A is configured using a display panel such as a liquid crystal or an organic EL.
  • the processor 6A may include the functions of the first control unit 49A of the moving device 4A and the second control unit 56A of the imaging device 5A. Further, the operation unit 22A of the endoscope 2A may include the function of the operation device 3A.
  • the outline of the processing performed by the operation device 3A, the moving device 4A, and the imaging device 5A is substantially the same as in the first embodiment.
  • an outline of a slide control process unique to the second embodiment will be described.
  • Steps S901 to S908 sequentially correspond to steps S601 to S608 (see FIG. 18) in the slide control described in the first embodiment.
  • the determination of the slidability of the moving device 4A in step S901 is performed based on, for example, a result of the spatial information acquisition unit 43A detecting the presence or absence of the internal wall of the subject in the sliding direction.
  • Subsequently, in step S909, the movement determination unit 492A determines whether or not the size of the subject image has changed before and after the slide.
  • FIG. 24 is a diagram schematically illustrating the relationship between the distal end portion of the endoscope and the subject when the size of the subject image changes. As shown in FIG. 24, when the slide amount is large, the crank bending leaves the distal end portion about Δd farther from the subject 300 than the distal end portion 24A before the movement, indicated by the broken line. For this reason, as shown in FIG. 25, the subject image 300b after the movement becomes smaller than the subject image 300a before the movement.
  • If the size of the subject image has changed as a result of the determination (step S909: Yes), the movement determination unit 492A determines whether or not it is possible to approach or separate from the subject so as to return the subject image to its size before the slide, according to the change in the size of the subject image (step S910). This determination is made based on the information acquired by the spatial information acquisition unit 43A. If the movement determination unit 492A determines in step S909 that the size of the subject image has not changed (step S909: No), the moving device 4A returns to the main routine.
  • When the movement determination unit 492A determines that approach to or separation from the subject is possible (step S910: Yes), the propulsion control unit 494A causes the propulsion unit 41A to perform the approach or separation operation (step S911).
  • FIG. 26 is a diagram schematically showing a situation in which the moving device 4A moves forward from the situation shown in FIG. 24. In this situation, as illustrated in FIG. 27, the imaging device 5A captures a subject image 300c having approximately the same size as the subject image 300a before the movement. After step S911, the moving device 4A returns to the main routine.
  • If it is determined in step S910 that approach or separation from the subject is impossible (step S910: No), the first control unit 49A transmits a zoom instruction signal to the imaging device 5A (step S912).
  • When error information indicating that zooming is impossible is received from the imaging device 5A (step S913: Yes), the first control unit 49A transmits the error information to the controller device 3A (step S914). Thereafter, the moving device 4A ends the slide control process. When error information is not received from the imaging device 5A in step S913 (step S913: No), the moving device 4A ends the slide control process.
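  • The fallback chain of steps S909 to S914, in which the device moves if possible, otherwise requests a zoom, and otherwise reports an error, can be summarized as follows; all callables in this sketch are assumptions.

```python
def compensate_size_change(size_changed, can_translate, translate,
                           request_zoom, report_error):
    """Outline of steps S909-S914: restore the subject image size after a
    slide by approaching or separating (S911), falling back to a zoom
    request (S912) and finally to an error report (S914)."""
    if not size_changed:     # S909: No
        return "nothing to do"
    if can_translate():      # S910: Yes
        translate()          # S911
        return "moved"
    if request_zoom():       # S912; True when zooming succeeded
        return "zoomed"
    report_error()           # S913: Yes -> S914
    return "error"
```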
  • When the first control unit 49A determines that sliding of the moving device 4A is impossible (step S901: No), the subsequent processing of steps S915 to S917 corresponds to steps S609 to S611 (see FIG. 18) described for the slide control of the moving device 4 of the first embodiment. After step S917, the moving device 4A ends the slide control process.
  • The processing of steps S1001 to S1005 sequentially corresponds to steps S724 to S728 (see FIG. 19B) described in the first embodiment.
  • step S1006 performed after step S1004 will be described.
  • When a zoom instruction signal is received from the moving device 4A (step S1006: Yes), the imaging control unit 567A determines whether or not zooming to the requested zoom position is possible by optical zoom (step S1007).
  • When optical zoom is possible (step S1007: Yes), the imaging control unit 567A performs optical zoom control on the optical system 511A (step S1008). Thereafter, the imaging device 5A ends the zoom control process.
  • When the zoom instruction signal is not received from the moving device 4A (step S1006: No), the imaging device 5A ends the zoom control process.
  • If zooming to the requested zoom position by optical zoom is impossible (step S1007: No), the imaging control unit 567A determines whether zooming to the requested zoom position is possible by electronic zoom (step S1009). When it is possible (step S1009: Yes), the imaging control unit 567A performs electronic zoom to the requested zoom position (step S1010). Thereafter, the imaging device 5A ends the zoom control process.
  • When zooming to the requested zoom position by electronic zoom is impossible (step S1009: No), the imaging device 5A transmits error information to the moving device 4A (step S1011). Thereafter, the imaging device 5A ends the zoom control process.
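  • On the imaging-device side, the zoom fallback of steps S1007 to S1011 amounts to trying optical zoom first, then electronic zoom, then signalling failure; the predicate and action names in this sketch are hypothetical.

```python
def zoom_to(requested_pos, optical_ok, do_optical,
            electronic_ok, do_electronic, send_error):
    """Outline of steps S1007-S1011."""
    if optical_ok(requested_pos):        # S1007: Yes
        do_optical(requested_pos)        # S1008
    elif electronic_ok(requested_pos):   # S1009: Yes
        do_electronic(requested_pos)     # S1010
    else:
        send_error()                     # S1011
```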
  • When the slide information is not received from the moving device 4A (step S1001: No), the processing of steps S1012 to S1015 sequentially corresponds to the processing of steps S735 to S738 described in the first embodiment. After step S1015, the imaging device 5A ends the slide control. If trimming cannot be performed in step S1013 (step S1013: No), the imaging device 5A proceeds to step S1011 and transmits error information to the moving device 4A.
  • According to the second embodiment described above, the user can observe the subject without a sense of incongruity because, simply by performing a slide operation, the angle of view is automatically adjusted on the apparatus side.
  • Although an unmanned aerial vehicle and an endoscope have been exemplified as the moving device, the invention can also be applied to a self-propelled robot, an industrial endoscope, a capsule endoscope, and the like.
  • In addition, the lens barrel portion that holds the imaging device of a microscope may serve as the moving device.
  • The processing algorithms described using the flowcharts in this specification can be written as programs.
  • Such a program may be recorded in a recording unit inside a computer or on a computer-readable recording medium. Recording of the program in the recording unit or on the recording medium may be performed when the computer or the recording medium is shipped as a product, or may be performed by downloading via a communication network.
  • The present invention can include various embodiments not described herein, and various design changes can be made within the scope of the technical idea specified by the claims.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Mechanical Engineering (AREA)
  • Remote Sensing (AREA)
  • Studio Devices (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention relates to a steered object, a moving device, an imaging device, a movement control method, a movement assist method, a movement control program, and a movement assist program with which it is possible to perform an appropriate process even in a situation in which a movement route is undetermined at the time of imaging. Information necessary for determining whether or not movement is possible in correspondence with a change in the relative relationship between an object to be imaged designated in advance and an imaging device is detected, the determination as to whether or not movement is possible in correspondence with the change in the relative relationship is made using the information, and control corresponding to the determination result is exercised.
PCT/JP2018/003690 2017-02-16 2018-02-02 Objet dirigé, dispositif mobile, dispositif d'imagerie, procédé de commande de mouvement, procédé d'aide au mouvement, programme de commande de mouvement et programme d'aide au mouvement WO2018150917A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-027316 2017-02-16
JP2017027316A JP2018133749A (ja) 2017-02-16 2017-02-16 被操縦体、移動装置、撮像装置、移動制御方法、移動補助方法、移動制御プログラムおよび移動補助プログラム

Publications (1)

Publication Number Publication Date
WO2018150917A1 true WO2018150917A1 (fr) 2018-08-23

Family

ID=63169390

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/003690 WO2018150917A1 (fr) 2017-02-16 2018-02-02 Objet dirigé, dispositif mobile, dispositif d'imagerie, procédé de commande de mouvement, procédé d'aide au mouvement, programme de commande de mouvement et programme d'aide au mouvement

Country Status (2)

Country Link
JP (1) JP2018133749A (fr)
WO (1) WO2018150917A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020100321A1 (fr) * 2018-11-12 2020-05-22 Hiroyuki Nakanishi Endoscope à capsule
CN113424515A (zh) * 2019-02-21 2021-09-21 索尼集团公司 信息处理设备、信息处理方法和程序
CN114007938A (zh) * 2019-06-18 2022-02-01 日本电气方案创新株式会社 操纵支持装置、操纵支持方法和计算机可读记录介质

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI820194B (zh) * 2018-08-31 2023-11-01 日商索尼半導體解決方案公司 電子機器及固體攝像裝置
JP2020102817A (ja) * 2018-12-25 2020-07-02 凸版印刷株式会社 監視対象識別装置、監視対象識別システム、および、監視対象識別方法
US20220413518A1 (en) * 2019-10-24 2022-12-29 Sony Group Corporation Movable object, information processing method, program, and information processing system
JP7219204B2 (ja) * 2019-11-26 2023-02-07 弘幸 中西 無人飛行体

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015207149A (ja) * 2014-04-21 2015-11-19 薫 渡部 監視システム及び監視方法
JP2016212465A (ja) * 2015-04-28 2016-12-15 株式会社ニコン 電子機器および撮像システム
JP2017021445A (ja) * 2015-07-07 2017-01-26 キヤノン株式会社 通信装置、その制御方法、プログラム


Also Published As

Publication number Publication date
JP2018133749A (ja) 2018-08-23

Similar Documents

Publication Publication Date Title
WO2018150917A1 (fr) Objet dirigé, dispositif mobile, dispositif d'imagerie, procédé de commande de mouvement, procédé d'aide au mouvement, programme de commande de mouvement et programme d'aide au mouvement
US11649052B2 (en) System and method for providing autonomous photography and videography
US10447912B2 (en) Systems, methods, and devices for setting camera parameters
EP3071482B1 (fr) Imagerie panoramique de véhicule aérien sans pilote
JP2014062789A (ja) 写真計測用カメラ及び航空写真装置
CN107205111B (zh) 摄像装置、移动装置、摄像系统、摄像方法和记录介质
US10356294B2 (en) Photographing device, moving body for photographing, and photographing control apparatus for moving body
CN111356954B (zh) 控制装置、移动体、控制方法以及程序
WO2020172800A1 (fr) Procédé de commande de patrouille pour plate-forme mobile et plate-forme mobile
JP7391053B2 (ja) 情報処理装置、情報処理方法およびプログラム
CN111417836A (zh) 环境取得系统
WO2021217371A1 (fr) Procédé et appareil de commande pour plateforme mobile
WO2019183789A1 (fr) Procédé et appareil de commande de véhicule aérien sans pilote, et véhicule aérien sans pilote
WO2020062089A1 (fr) Procédé d'étalonnage de capteur magnétique et plateforme mobile
JP6910785B2 (ja) 移動撮像装置およびその制御方法、ならびに撮像装置およびその制御方法、無人機、プログラム、記憶媒体
WO2021168821A1 (fr) Procédé de commande de plateforme mobile et dispositif
JP2021113005A (ja) 無人航空機システムおよび飛行制御方法
JP2020068426A (ja) カメラ装置、画像処理装置、およびミラー可動機構
CN106060357B (zh) 成像设备、无人机及机器人
CN111357271B (zh) 控制装置、移动体、控制方法
CN111373735A (zh) 拍摄控制方法、可移动平台与存储介质
WO2021059684A1 (fr) Système de traitement d'informations, procédé de traitement d'informations et programme de traitement d'informations
WO2021217372A1 (fr) Procédé et dispositif de commande pour plateforme mobile
KR20230115042A (ko) 충돌회피 드론 및 제어방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18754012

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18754012

Country of ref document: EP

Kind code of ref document: A1