WO2020063770A1 - Control device, imaging device, moving body, control method, and program - Google Patents

Control device, imaging device, moving body, control method, and program

Info

Publication number: WO2020063770A1
Application number: PCT/CN2019/108224 (CN2019108224W)
Authority: WIPO (PCT)
Prior art keywords: target, imaging device, time, coordinate system, distance
Other languages: English (en), French (fr)
Inventor: 高宫诚
Original assignee: 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date: (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Application filed by 深圳市大疆创新科技有限公司
Priority to CN201980005027.0A (patent CN111213369B)
Publication of WO2020063770A1
Priority to US17/198,233 (publication US20210218879A1)

Classifications

    • H: ELECTRICITY
      • H04: ELECTRIC COMMUNICATION TECHNIQUE
        • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
            • H04N 23/60: Control of cameras or camera modules
              • H04N 23/61: Control of cameras or camera modules based on recognised objects
              • H04N 23/66: Remote control of cameras or camera parts, e.g. by remote control devices
              • H04N 23/67: Focus control based on electronic image sensor signals
                • H04N 23/671: Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
                • H04N 23/672: Focus control based on electronic image sensor signals based on the phase difference signals
                • H04N 23/673: Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
              • H04N 23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • G: PHYSICS
      • G02: OPTICS
        • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
          • G02B 7/00: Mountings, adjusting means, or light-tight connections, for optical elements
            • G02B 7/28: Systems for automatic generation of focusing signals
      • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
        • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
          • G03B 13/00: Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
            • G03B 13/32: Means for focusing
              • G03B 13/34: Power focusing
                • G03B 13/36: Autofocus systems
          • G03B 15/00: Special procedures for taking photographs; Apparatus therefor
            • G03B 15/006: Apparatus mounted on flying objects
    • B: PERFORMING OPERATIONS; TRANSPORTING
      • B64: AIRCRAFT; AVIATION; COSMONAUTICS
        • B64C: AEROPLANES; HELICOPTERS
          • B64C 39/00: Aircraft not otherwise provided for
            • B64C 39/02: Aircraft not otherwise provided for characterised by special use
              • B64C 39/024: Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
        • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
          • B64U 10/00: Type of UAV
            • B64U 10/10: Rotorcrafts
              • B64U 10/13: Flying platforms
          • B64U 20/00: Constructional aspects of UAVs
            • B64U 20/80: Arrangement of on-board electronics, e.g. avionics systems or wiring
              • B64U 20/87: Mounting of imaging devices, e.g. mounting of gimbals
          • B64U 2101/00: UAVs specially adapted for particular uses or applications
            • B64U 2101/30: UAVs specially adapted for particular uses or applications for imaging, photography or videography
          • B64U 2201/00: UAVs characterised by their flight controls
            • B64U 2201/10: UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]

Definitions

  • the present invention relates to a control device, an imaging device, a moving body, a control method, and a program.
  • Patent Document 1 discloses a method of controlling a lens focus driving unit based on the moving path and moving speed of a subject, the subject position expected at a certain time, and the distance to the subject, thereby performing focus driving of an imaging lens.
  • Patent Document 1: Japanese Patent Application Laid-Open No. 10-142486
  • the predictive ranging device of Patent Document 1, however, takes no account of movement of the imaging device itself, only that of the subject.
  • the control device according to one aspect of the present invention may include a derivation unit that derives the distance from the imaging device to a target photographed by the imaging device at a second time after a first time, based on the positional relationship between the imaging device and the target at the first time, the speed and moving direction of the imaging device at the first time, and the speed and moving direction of the target at the first time.
  • the control device may include a first control unit that controls the position of the focus lens of the imaging device based on the distance at the second time.
  • the deriving unit may determine the distance between the imaging device and the target and the direction from the imaging device to the target as the positional relationship.
  • the deriving unit may determine the position of the imaging device and the position of the target on a preset three-dimensional coordinate system at the first time based on the positional relationship.
  • the deriving unit may determine the position of the imaging device and the position of the target on the coordinate system at the second time, based on the position of the imaging device on the coordinate system at the first time, the position of the target on the coordinate system, the speed and moving direction of the imaging device at the first time, and the speed and moving direction of the target at the first time.
  • the deriving unit may derive the distance at the second time based on the position of the imaging device and the position of the target at the second time.
  • the deriving unit may set a coordinate system based on the moving direction of the target at the first time.
  • the deriving unit may set the first axis of the coordinate system along the moving direction of the target.
  • the deriving unit may set the position of the target at the first time as the origin of the coordinate system.
  • when it is determined from an image captured by the imaging device that the target is a vehicle, the deriving unit may assume that the target does not move in the direction of the second axis of the coordinate system, which runs along the vertical direction of the vehicle, and determine the position of the imaging device and the position of the target on the coordinate system at the second time.
  • when it is determined from an image captured by the imaging device that the target is a vehicle and the vehicle is traveling on a straight road, the deriving unit may assume that the target moves neither in the direction of the second axis of the coordinate system along the vertical direction of the vehicle nor in the direction of the third axis of the coordinate system along the lateral direction of the vehicle, and determine the position of the imaging device and the position of the target on the coordinate system at the second time.
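  • as a concrete illustration of the derivation above, the following is a minimal sketch (not taken from the patent; all names and numbers are illustrative) that advances both positions under a constant-velocity assumption and computes the distance at the second time; the vehicle assumptions in the preceding two items would simply zero components of the target's velocity before this step:

```python
import numpy as np

def distance_at_second_time(p_imaging, v_imaging, p_target, v_target, dt):
    """Derive the imaging-device-to-target distance dt seconds after the
    first time, assuming both keep their first-time speed and direction.

    p_*: positions on the preset three-dimensional coordinate system.
    v_*: velocity vectors (speed and moving direction) at the first time.
    """
    p_imaging_2 = np.asarray(p_imaging, float) + np.asarray(v_imaging, float) * dt
    p_target_2 = np.asarray(p_target, float) + np.asarray(v_target, float) * dt
    return float(np.linalg.norm(p_imaging_2 - p_target_2))

# Target at the origin moving 20 m/s along the first axis; imaging device
# 30 m behind and 10 m above, closing at 18 m/s along the same axis.
d2 = distance_at_second_time([-30, 0, 10], [18, 0, 0], [0, 0, 0], [20, 0, 0], dt=5.0)
print(round(d2, 1))  # 41.2
```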
  • the imaging device may include the control device described above.
  • the imaging device may include a focusing lens.
  • the imaging device may include an image sensor.
  • the moving body according to one aspect of the present invention may be a moving body that carries the imaging device and moves.
  • the control device may include a second control unit that controls the movement of the moving body so that the distance from the imaging device to the target is within a predetermined distance range, the predetermined distance range being one in which the amount of change in focus lens position per unit of focusing distance is less than or equal to a predetermined threshold.
  • the control device may include a determination unit that changes the position of the focus lens to obtain a plurality of distances to subjects in focus and the corresponding focus lens positions, derives the relationship between focus lens position and focusing distance based on the obtained distances and positions, and determines the predetermined distance range according to that relationship.
  • the control method according to one aspect of the present invention may include a step of deriving the distance from the imaging device to a target photographed by the imaging device at a second time after a first time, based on the positional relationship between the imaging device and the target at the first time, the speed and moving direction of the imaging device at the first time, and the speed and moving direction of the target at the first time.
  • the control method may include a step of controlling the position of the focus lens of the imaging device based on the distance at the second time.
  • the program according to one aspect of the present invention may be a program for causing a computer to function as the control device described above.
  • according to one aspect of the present invention, the in-focus state of the target can be maintained more reliably even if the target and the imaging device move.
  • FIG. 1 is a diagram showing an example of the appearance of an unmanned aircraft and a remote operation device.
  • FIG. 2 is a diagram showing an example of functional blocks of an unmanned aircraft.
  • FIG. 3 is a diagram showing a drone tracking a target.
  • FIG. 4 is a diagram showing an example of a coordinate system representing a positional relationship between an unmanned aircraft and a target.
  • FIG. 5 is a diagram showing an example of a coordinate system representing a positional relationship between an unmanned aircraft and a target.
  • FIG. 6 is a diagram illustrating an example of a relationship between a focusing distance and a position of a focusing lens.
  • FIG. 7 is a diagram for describing a method for determining a focus stabilization range.
  • FIG. 8 is a diagram for describing a method of determining a focus stabilization range.
  • FIG. 9 is a diagram for describing a method for determining a focus stabilization range.
  • FIG. 10 is a flowchart showing an example of a procedure for determining a focus stabilization range.
  • FIG. 11 is a diagram showing an example of a hardware configuration.
  • the blocks may represent (1) a stage of a process in which an operation is performed or (2) a "part" of a device having the role of performing an operation.
  • specific stages and "parts" may be implemented by programmable circuits and/or processors.
  • dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • programmable circuits may include reconfigurable hardware circuits.
  • reconfigurable hardware circuits can include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGA), and programmable logic arrays (PLA).
  • the computer-readable medium may include any tangible device that can store instructions executed by a suitable device.
  • a computer-readable medium having instructions stored thereon includes a product including instructions that can be executed to create a means for performing the operations specified by the flowchart or block diagram.
  • examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • more specifically, computer-readable media may include floppy disk (registered trademark), hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile disc (DVD), Blu-ray (registered trademark) disc, memory stick, integrated circuit card, and the like.
  • computer-readable instructions may include either source code or object code described in any combination of one or more programming languages.
  • source or object code includes traditional procedural programming languages.
  • traditional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk, JAVA (registered trademark), and C++, and the "C" programming language or similar programming languages.
  • the computer-readable instructions may be provided, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or other programmable data processing device.
  • a processor or programmable circuit can execute computer-readable instructions to create a means for performing the operations specified in the flowchart or block diagram.
  • Examples of the processor include a computer processor, a processing unit, a microprocessor, a digital signal processor, a controller, a microcontroller, and the like.
  • FIG. 1 illustrates an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300.
  • the UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100.
  • the gimbal 50 and the imaging device 100 are examples of an imaging system.
  • the UAV 10 is an example of a moving body.
  • a moving body is a concept including a flying body moving in the air, a vehicle moving on the ground, and a ship moving on the water.
  • a flying body moving in the air refers to a concept that includes not only UAV but also other aircraft, airships, and helicopters moving in the air.
  • the UAV body 20 includes a plurality of rotors. Multiple rotors are an example of a propulsion part.
  • the UAV body 20 controls the rotation of a plurality of rotors to fly the UAV 10.
  • the UAV body 20 uses, for example, four rotors to fly the UAV 10.
  • the number of rotors is not limited to four.
  • UAV 10 can also be a fixed-wing aircraft without rotors.
  • the imaging device 100 is an imaging camera that captures an object included in a desired imaging range.
  • the gimbal 50 rotatably supports the imaging device 100.
  • the gimbal 50 is an example of a support mechanism.
  • for example, the gimbal 50 supports the imaging device 100 so that it can rotate about the pitch axis using an actuator.
  • the gimbal 50 supports the imaging device 100 so that it can also rotate around the roll axis and the yaw axis using actuators, respectively.
  • the gimbal 50 can change the posture of the imaging device 100 by rotating the imaging device 100 around at least one of a yaw axis, a pitch axis, and a roll axis.
  • the plurality of imaging devices 60 are sensing cameras that capture the surroundings of the UAV 10 in order to control the flight of the UAV 10.
  • the two imaging devices 60 can be installed on the nose of the UAV 10, that is, on the front side.
  • the other two imaging devices 60 may be installed on the bottom surface of the UAV 10.
  • the two imaging devices 60 on the front side may be paired and function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom surface side may be paired to function as a stereo camera.
  • the imaging device 60 can measure the presence of an object included in the imaging range of the imaging device 60 and the distance to the object.
  • the imaging device 60 is an example of a measurement device for measuring an object existing in the imaging direction of the imaging device 100.
  • the measurement device may be another sensor such as an infrared sensor or an ultrasonic sensor that measures an object existing in the imaging direction of the imaging device 100.
  • the three-dimensional space data around the UAV 10 can be generated based on the images captured by the plurality of imaging devices 60.
  • the number of imaging devices 60 included in UAV 10 is not limited to four.
  • it suffices for the UAV 10 to include at least one imaging device 60.
  • the UAV 10 may also include at least one imaging device 60 on each of the nose, tail, sides, bottom, and top of the UAV 10.
  • the angle of view settable in the imaging device 60 may be greater than the angle of view settable in the imaging device 100.
  • the imaging device 60 may include a single focus lens or a fisheye lens.
  • the remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10.
  • the remote operation device 300 can perform wireless communication with the UAV 10.
  • the remote operation device 300 transmits to the UAV 10 instruction information indicating various instructions related to the movement of the UAV 10 such as ascent, descent, acceleration, deceleration, forward, backward, and rotation.
  • the instruction information includes, for example, instruction information for raising the height of the UAV 10.
  • the instruction information may show the height at which the UAV 10 should be located.
  • the UAV 10 moves to a height indicated by the instruction information received from the remote operation device 300.
  • the instruction information may include an ascent instruction for causing the UAV 10 to ascend. The UAV 10 ascends while it is receiving the ascent instruction. When the altitude of the UAV 10 has reached its upper limit, the UAV 10 may be restricted from ascending even if an ascent instruction is received.
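  • a minimal sketch of how such an ascent instruction might be gated against an upper-limit altitude (the limit value and all names here are illustrative assumptions, not from the patent):

```python
MAX_ALTITUDE_M = 120.0  # hypothetical upper-limit altitude

def next_altitude(current_m, ascent_instruction_received, climb_rate_mps, dt_s):
    """The UAV ascends while an ascent instruction is being received,
    but ascent is restricted once the upper-limit altitude is reached."""
    if not ascent_instruction_received:
        return current_m
    return min(current_m + climb_rate_mps * dt_s, MAX_ALTITUDE_M)
```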
  • FIG. 2 shows an example of the functional blocks of the UAV 10.
  • the UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 36, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, a gimbal 50, the imaging devices 60, and the imaging device 100.
  • the communication interface 36 communicates with other devices such as the remote operation device 300.
  • the communication interface 36 may receive instruction information including various instructions to the UAV control section 30 from the remote operation device 300.
  • the memory 32 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100.
  • the memory 32 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the memory 32 may be provided inside the UAV body 20, or may be detachably provided on the UAV body 20.
  • the UAV control unit 30 controls the flight and shooting of the UAV 10 in accordance with a program stored in the memory 32.
  • the UAV control unit 30 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • the UAV control unit 30 controls the flight and shooting of the UAV 10 in accordance with instructions received from the remote operation device 300 via the communication interface 36.
  • the propulsion unit 40 propels the UAV 10.
  • the propulsion unit 40 includes a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • the propulsion unit 40 rotates a plurality of rotors through a plurality of drive motors in accordance with a command from the UAV control unit 30 to fly the UAV 10.
  • the GPS receiver 41 receives a plurality of signals indicating the time transmitted from a plurality of GPS satellites.
  • the GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, that is, the position (latitude and longitude) of the UAV 10 based on the received multiple signals.
  • the IMU 42 detects the posture of the UAV 10.
  • the IMU 42 detects, as the posture of the UAV 10, accelerations in the three axial directions of front-rear, left-right, and up-down, and angular velocities about the three axes of pitch, roll, and yaw.
  • the magnetic compass 43 detects the orientation of the nose of the UAV 10.
  • the barometric altimeter 44 detects the flying altitude of the UAV 10.
  • the barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure into an altitude to detect the altitude.
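  • the text does not specify the pressure-to-altitude conversion; a common choice is the international barometric formula, assumed in this sketch:

```python
def pressure_to_altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    """Convert detected air pressure to altitude using the international
    barometric formula with ISA constants (an assumption; the patent only
    says the detected pressure is converted into an altitude)."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

print(round(pressure_to_altitude_m(1000.0), 1))  # ≈ 110.9 m
```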
  • the temperature sensor 45 detects the temperature around the UAV 10.
  • the humidity sensor 46 detects the humidity around the UAV 10.
  • the imaging device 100 includes an imaging section 102 and a lens section 200.
  • the lens unit 200 is an example of a lens device.
  • the imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130.
  • the image sensor 120 may be composed of a CCD or a CMOS.
  • the image sensor 120 captures an optical image formed through the plurality of lenses 210 and outputs the captured image data to the imaging control section 110.
  • the imaging control unit 110 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like.
  • the imaging control unit 110 may control the imaging apparatus 100 based on an operation instruction of the imaging apparatus 100 from the UAV control unit 30.
  • the memory 130 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • the memory 130 may be provided inside a casing of the imaging apparatus 100.
  • the memory 130 may be detachably provided on a casing of the imaging device 100.
  • the lens unit 200 includes a plurality of lenses 210, a plurality of lens driving units 212, and a lens control unit 220.
  • the plurality of lenses 210 may be used as a zoom lens, a variable focal length lens, and a focusing lens. At least a part or all of the plurality of lenses 210 are movably arranged along the optical axis.
  • the lens unit 200 may be an interchangeable lens provided to be removable from the imaging unit 102.
  • the lens driving unit 212 moves at least a part or all of the plurality of lenses 210 along an optical axis via a mechanism member such as a cam ring.
  • the lens driving section 212 may include an actuator.
  • the actuator may include a stepper motor.
  • the lens control section 220 drives the lens driving section 212 in accordance with a lens control instruction from the imaging section 102 to move one or more lenses 210 along the optical axis direction via a mechanism member.
  • the lens control instruction is, for example, a zoom control instruction and a focus control instruction.
  • the lens unit 200 further includes a memory 222 and a position sensor 214.
  • the lens control unit 220 controls the movement of the lens 210 in the optical axis direction via the lens driving unit 212 in accordance with a lens operation instruction from the imaging unit 102.
  • a part or all of the lens 210 moves along the optical axis.
  • the lens control section 220 performs at least one of a zoom operation and a focus operation by moving at least one of the lenses 210 along the optical axis.
  • the position sensor 214 detects the position of the lens 210.
  • the position sensor 214 can detect a current zoom position or a focus position.
  • the lens driving section 212 may include a shake correction mechanism.
  • the lens control section 220 may perform the shake correction by moving the lens 210 in a direction along the optical axis or a direction perpendicular to the optical axis via a shake correction mechanism.
  • the lens driving section 212 may drive a shake correction mechanism by a stepping motor to perform shake correction.
  • the shake correction mechanism may be driven by a stepping motor to move the image sensor 120 in a direction along the optical axis or in a direction perpendicular to the optical axis to perform shake correction.
  • the memory 222 stores control values of the plurality of lenses 210 that are moved through the lens driving unit 212.
  • the memory 222 may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • in the UAV 10 configured as described above, the distance between the imaging device 100 and a target is predicted while the target, a moving object, is being tracked.
  • the imaging device 100 controls the position of the focus lens based on the predicted distance so as to focus on the target.
  • for example, the UAV 10 tracks a vehicle traveling on a road 600, as shown in FIG. 3, as the target 500 while causing the imaging device 100 to image the target 500.
  • the imaging control unit 110 includes a derivation unit 112 and a focus control unit 114.
  • the derivation unit 112 derives the distance from the imaging device 100 to the target 500 at a second time after a first time, based on the positional relationship between the imaging device 100 and the target 500 photographed by the imaging device 100 at the first time, the speed and moving direction of the imaging device 100 at the first time, and the speed and moving direction of the target 500 at the first time.
  • that is, based on the positional relationship at the current time, the speed and moving direction of the imaging device 100 at the current time, and the speed and moving direction of the target 500 at the current time, the derivation unit 112 predictively derives the distance from the imaging device 100 to the target 500 at a time a predetermined interval (for example, 5 seconds) after the current time.
  • the focus control unit 114 controls the position of the focus lens of the imaging device 100 based on the distance at the second time.
  • the focus control unit 114 controls the position of the focus lens at the second time so that the focus falls at the distance derived by the derivation unit 112.
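  • a minimal sketch of this predict-then-focus loop (the calibration table and helper names are illustrative assumptions; the patent does not specify how the lens position is looked up):

```python
import numpy as np

# Hypothetical calibration: focusing distance (m) -> focus lens position (steps).
CAL_DISTANCES_M = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
CAL_POSITIONS = np.array([980.0, 640.0, 330.0, 190.0, 110.0, 60.0])

def lens_position_for(distance_m):
    """Interpolate the lens position that focuses at distance_m."""
    return float(np.interp(distance_m, CAL_DISTANCES_M, CAL_POSITIONS))

def focus_control_step(derive_distance_ahead, move_focus_lens, lead_s=5.0):
    """One loop iteration: derive the distance expected lead_s seconds from
    now (the second time) and drive the focus lens toward the matching
    position so focus lands at that distance when the time arrives."""
    predicted_m = derive_distance_ahead(lead_s)
    move_focus_lens(lens_position_for(predicted_m))
```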
  • the deriving unit 112 determines, as the positional relationship, the distance between the imaging device 100 and the target 500 at the first time and the direction from the imaging device 100 to the target 500 at the first time.
  • the deriving unit 112 may determine the distance between the imaging device 100 and the target 500 at the first time based on the result of contrast autofocus (contrast AF) or phase difference AF performed by the focus control unit 114 using a plurality of images captured by the imaging device 100.
  • the deriving unit 112 may determine the distance from the imaging device 100 to the target 500 based on the result of distance measurement by a ranging sensor included in the imaging device 100.
  • the derivation unit 112 may determine the direction from the imaging device 100 to the target 500 at the first time based on the position information of the UAV 10 determined from the plurality of signals received by the GPS receiver 41, the orientation of the gimbal 50 with respect to the UAV 10 determined from the drive instructions for the gimbal 50, and the image captured by the imaging device 100.
  • the derivation unit 112 may also determine, as the positional relationship, the position information of the UAV 10 from the GPS receiver 41 and altitude information from the barometric altimeter 44, together with the position information and altitude information of the target 500 from a GPS receiver included in the target 500.
  • the deriving unit 112 may determine the position of the imaging device 100 on a preset three-dimensional coordinate system and the position of the target object 500 on a preset three-dimensional coordinate system based on the positional relationship.
  • the deriving unit 112 may determine the position of the imaging device 100 and the position of the target 500 on the coordinate system at the second time, based on the position of the imaging device 100 on the coordinate system at the first time, the position of the target 500 on the coordinate system, the speed and moving direction of the imaging device 100 at the first time, and the speed and moving direction of the target 500 at the first time.
  • the deriving unit 112 may derive the distance at the second time based on the position of the imaging device 100 at the second time and the position of the target.
  • the deriving unit 112 may set a coordinate system based on the moving direction of the target 500 at the first time.
  • the deriving unit 112 may set the first axis of the coordinate system along the moving direction of the target 500.
  • the deriving unit 112 may set the position of the target 500 at the first time as the origin of the coordinate system.
  • for example, as shown in FIG. 4, the deriving unit 112 may set the X axis of the coordinate system along the direction of the movement vector 510 of the target 500.
  • the derivation unit 112 may determine the movement vector 510 of the target 500 based on the optical flow determined from a plurality of images captured by the imaging device 100.
  • the derivation unit 112 may set the motion vector 520 of the UAV 10 in the coordinate system.
  • the derivation unit 112 may determine the movement vector 520 of the UAV 10 based on the operation instruction of the UAV 10 transmitted by the remote operation device 300.
  • the deriving unit 112 may determine the movement vector 510 of the target object 500 based on the optical flow determined from the plurality of images captured by the imaging device 100 and the operation instruction of the UAV 10 transmitted by the remote operation device 300.
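  • a minimal sketch of setting up such a coordinate system (the use of the world vertical as an auxiliary axis is an assumption; the patent does not state it):

```python
import numpy as np

def target_aligned_frame(target_pos, movement_vector):
    """Build the coordinate system described above: origin at the target's
    position at the first time, X axis along the target's movement
    vector 510. Returns (origin, R) with the rows of R being the X
    (moving direction), Y (lateral), and Z (vertical) unit axes."""
    x = np.asarray(movement_vector, float)
    x /= np.linalg.norm(x)
    up = np.array([0.0, 0.0, 1.0])   # world vertical; assumes the target moves horizontally
    y = np.cross(up, x)
    y /= np.linalg.norm(y)
    z = np.cross(x, y)
    return np.asarray(target_pos, float), np.vstack([x, y, z])

def to_frame(world_point, origin, R):
    """Express a world point on the target-aligned coordinate system."""
    return R @ (np.asarray(world_point, float) - origin)
```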
  • the derivation unit 112 may determine Distance(0), which represents the distance between the imaging device 100 and the target 500 at the first time, from the coordinate point (x_o(0), y_o(0), z_o(0)) of the target 500 on the coordinate system and the coordinate point (x_d(0), y_d(0), z_d(0)) of the UAV 10 on the coordinate system, according to:
    Distance(0) = √[(x_d(0) − x_o(0))² + (y_d(0) − y_o(0))² + (z_d(0) − z_o(0))²]
  • for example, as shown in FIG. 5, the derivation unit 112 may determine the coordinate point (x_o(1), y_o(1), z_o(1)) of the target 500 on the coordinate system and the coordinate point (x_d(1), y_d(1), z_d(1)) of the UAV 10 on the coordinate system at the second time, from the coordinate point (x_o(0), y_o(0), z_o(0)) of the target 500 on the coordinate system at the first time, the coordinate point (x_d(0), y_d(0), z_d(0)) of the UAV 10 on the coordinate system, the movement vector 510 of the target 500 at the first time, and the movement vector 520 of the UAV 10.
  • the derivation unit 112 may then determine Distance(1), which represents the distance between the imaging device 100 and the target 500 at the second time, according to:
    Distance(1) = √[(x_d(1) − x_o(1))² + (y_d(1) − y_o(1))² + (z_d(1) − z_o(1))²]
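  • a worked instance of these formulas with illustrative numbers (not from the patent):

```latex
% Target at the origin, UAV at (-30, 0, 40) m at the first time:
\mathrm{Distance}(0) = \sqrt{(-30-0)^2 + (0-0)^2 + (40-0)^2} = \sqrt{2500} = 50\ \mathrm{m}
% Over the interval, movement vector 510 = (100, 0, 0) m and
% movement vector 520 = (130, 0, 0) m, so at the second time:
(x_o(1), y_o(1), z_o(1)) = (100, 0, 0), \qquad (x_d(1), y_d(1), z_d(1)) = (100, 0, 40)
\mathrm{Distance}(1) = \sqrt{(100-100)^2 + 0^2 + 40^2} = 40\ \mathrm{m}
```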
  • the derivation unit 112 may periodically determine the moving direction of the target 500 and periodically update the coordinate system based on the moving direction of the target 500.
  • the deriving unit 112 updates the coordinate points of the target 500 and the UAV 10 on the coordinate system while updating the coordinate system.
  • when it is determined from an image captured by the imaging device 100 that the target 500 is a vehicle, the derivation unit 112 may assume that the target 500 does not move in the Z-axis direction of the coordinate system, which runs along the vertical direction of the vehicle, and determine the coordinate point (x_o(1), y_o(1), z_o(1)) of the target 500 on the coordinate system and the coordinate point (x_d(1), y_d(1), z_d(1)) of the UAV 10 on the coordinate system at the second time.
  • the vertical direction of the vehicle may be a direction perpendicular to the moving direction of the vehicle.
  • when it is determined from an image captured by the imaging device 100 that the target 500 is a vehicle and the vehicle is traveling on a straight road, the derivation unit 112 may assume that the target 500 moves neither in the Z-axis direction of the coordinate system along the vertical direction of the vehicle nor in the Y-axis direction of the coordinate system along the lateral (left-right) direction of the vehicle, and determine the position of the imaging device 100 and the position of the target 500 on the coordinate system at the second time.
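  • a minimal sketch of applying these assumptions to the target's movement vector before predicting its position (axis convention as in FIG. 4; the flags would come from the pattern recognition described below):

```python
import numpy as np

def constrain_movement_vector(vec, is_vehicle, on_straight_road):
    """Zero the components the derivation unit 112 assumes to be absent:
    Z (vertical) for a vehicle, and additionally Y (lateral) when the
    vehicle travels on a straight road."""
    v = np.asarray(vec, float).copy()
    if is_vehicle:
        v[2] = 0.0
        if on_straight_road:
            v[1] = 0.0
    return v
```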
  • the deriving unit 112 can determine that the target 500 is a vehicle by pattern recognition using an image captured by the imaging device 100.
  • the derivation unit 112 may determine that the target object 500 is a vehicle based on the type of the target object set in advance by the user.
  • the derivation section 112 can determine whether the vehicle is traveling on a straight road by pattern recognition using an image captured by the imaging device 100.
  • the deriving unit 112 can determine whether the vehicle is traveling on a straight road based on the GPS information and map information of the vehicle obtained through the communication interface 36.
  • the deriving section 112 can determine the moving direction of the target 500 by using pattern recognition of an image captured by the imaging device 100.
  • the deriving section 112 may determine the moving direction of the target object 500 based on the result of the main subject detection performed by the focus control section 114.
  • the deriving unit 112 may determine the moving direction of the target object 500 based on the panning information of the imaging device 100.
  • the deriving unit 112 may determine the panning information of the imaging device 100 based on the driving instruction of the universal joint 50 and determine the moving direction of the target 500.
  • the derivation unit 112 may determine the moving direction of the target object 500 based on the distance to the target object 500 measured by the ranging sensor or the stereo camera, and the moving direction of the UAV 10.
  • the deriving section 112 may determine the moving direction of the target 500 based on the distance to the target 500 determined by the contrast AF or phase difference AF of the focus control section 114, and the moving direction of the UAV 10.
  • as described above, according to the UAV 10 of this embodiment, while the UAV 10 is in flight, the distance to the target 500, which is a moving object, is predicted and the position of the focus lens is controlled according to the predicted distance.
  • as a result, the imaging device 100 mounted on the UAV 10 can maintain the in-focus state on the target 500.
  • FIG. 6 shows an example of the relationship between the position of the focus lens and the focusing distance.
  • the focusing distance indicates the distance to a subject for which a predetermined in-focus state is obtained with respect to the position of the focus lens.
  • for example, the focusing distance indicates the distance to a subject whose contrast value is greater than or equal to a predetermined threshold with respect to the position of the focus lens.
  • as shown in FIG. 6, when the distance from the imaging device 100 to the subject is short, the ratio of the change in focus lens position to the change in distance increases. That is, when the distance between the imaging device 100 and the target 500 is short, the ratio of the change in focus lens position to the change in distance increases. Therefore, when the distance between the imaging device 100 and the target 500 is short, the focus lens may not be moved in time, and the imaging device 100 may be unable to track the target 500 while maintaining a proper in-focus state.
  • therefore, the UAV control unit 30 may control the movement of the UAV 10 so that the distance from the imaging device 100 to the target 500 falls within the focus stabilization range, the focus stabilization range being a preset distance range in which the amount of change in focus lens position per unit of focusing distance is less than or equal to a predetermined threshold.
  • the focus stabilization range can be determined in advance through experiments or simulations.
  • the focus stabilization range depends on the optical characteristics of the focus lens. That is, the focus stabilization range depends on the type of the lens unit 200. If the lens unit 200 mounted on the imaging unit 102 is an interchangeable lens, the focus stabilization range varies with the type of interchangeable lens. Therefore, if the lens unit 200 mounted on the imaging unit 102 is an interchangeable lens, the focus lens can be driven before the UAV 10 starts flying or before it starts tracking a target, the relationship between focus lens position and focusing distance can be determined, and the focus stabilization range for the mounted interchangeable lens can be set.
  • FIG. 7 is a diagram for describing a method for determining a focus stabilization range.
  • in FIG. 7, f represents the focal length.
  • X1 represents the distance from the focal plane F to the object.
  • a represents the distance from the front principal plane to the object.
  • X2 represents the defocus amount.
  • b represents the distance from the rear principal plane to the image formed on the image sensor 120.
  • Hd represents the distance between the front principal plane and the rear principal plane.
  • D represents the distance from the object to the imaging surface of the image sensor 120, that is, the subject distance.
  • according to Newton's lens formula, X1·X2 = f². When X1 = D − 2f − Hd, X2 = f²/X1 = f²/(D − 2f − Hd). The defocus amount is determined from this formula.
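  • the defocus relation above can be evaluated directly; a small sketch with illustrative values:

```python
def defocus_amount(d, f, hd):
    """X2 from Newton's lens formula X1 * X2 = f**2 with
    X1 = D - 2*f - Hd (all lengths in the same unit)."""
    x1 = d - 2.0 * f - hd
    return f * f / x1

# Subject 10 m away, 50 mm focal length, 5 mm between principal planes:
print(round(defocus_amount(d=10_000.0, f=50.0, hd=5.0), 3))  # ≈ 0.253 mm
```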
  • the imaging control section 110 may include a determination section 116 to determine a focus stabilization range.
  • the determination section 116 changes the position of the focus lens to obtain a plurality of distances to subjects in focus and the corresponding focus lens positions, derives the relationship between focus lens position and focusing distance from the obtained distances and positions, and determines the focus stabilization range as a predetermined distance range according to that relationship.
  • the determination section 116 may determine the focus lens position corresponding to each of a plurality of focusing distances and, from the results, determine a curve showing the relationship between focus lens position and focusing distance. For example, as shown in FIG. 8, the determination section 116 determines the focus lens positions at focusing distances of 5 m and 20 m, respectively. From these two points, the determination section 116 may determine a curve 700 showing the relationship between focus lens position and focusing distance as shown in FIG. 9, and use the curve 700 to determine the focus stabilization range in which the amount of change in focus lens position per unit of focusing distance is less than or equal to the predetermined threshold.
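  • a minimal sketch of this two-point procedure; the hyperbolic curve form P(D) = a/D + b is an assumption (the patent only says a curve is determined from the two points), chosen because lens extension varies roughly with the reciprocal of subject distance:

```python
import math

def fit_two_point_curve(d1, p1, d2, p2):
    """Fit P(D) = a/D + b through the two measured
    (focusing distance, lens position) points of FIG. 8."""
    a = (p1 - p2) / (1.0 / d1 - 1.0 / d2)
    b = p1 - a / d1
    return a, b

def stabilization_near_edge(a, threshold):
    """|dP/dD| = a/D**2 <= threshold holds for D >= sqrt(a/threshold),
    giving the near edge of the focus stabilization range."""
    return math.sqrt(a / threshold)

# Lens at 330 steps when focused at 5 m, 110 steps at 20 m (illustrative):
a, b = fit_two_point_curve(5.0, 330.0, 20.0, 110.0)
print(round(stabilization_near_edge(a, threshold=10.0), 1))  # ≈ 12.1 m
```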
  • FIG. 10 is a flowchart showing an example of a procedure for determining a focus stabilization range.
  • the determination section 116 determines whether the lens section 200 mounted on the imaging section 102 is an interchangeable lens in which a focus stabilization range has been registered (S100).
  • if it is not, the determination section 116 obtains the distance to the subject with the ranging sensor while the UAV 10 is in flight and determines the focus lens position corresponding to that distance, thereby performing calibration (S102).
  • the determination unit 116 determines the focus stabilization range based on the relationship between the position of the focus lens and the focus distance determined by the calibration (S104).
  • the determination section 116 notifies the UAV control section 30 of the registered or determined focus stabilization range.
  • the UAV control unit 30 controls the flight of the UAV 10 so that the distance to the subject falls within the focus stabilization range (S106).
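  • the S100-S106 flow, restated as a sketch; every callable here is a hypothetical stand-in for the corresponding unit in the patent:

```python
def ensure_focus_stabilization(lens_id, registry, calibrate, derive_range, control_flight):
    """S100: check whether a focus stabilization range is registered for the
    mounted interchangeable lens; S102: if not, calibrate lens position vs
    distance in flight; S104: determine the range; S106: fly so the subject
    distance stays within it."""
    rng = registry.get(lens_id)                      # S100
    if rng is None:
        curve = calibrate()                          # S102
        rng = derive_range(curve)                    # S104
        registry[lens_id] = rng
    control_flight(keep_distance_within=rng)         # S106
```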
  • in this way, the in-focus state of the target can be maintained even if the imaging device 100 moves.
  • the flight of the UAV 10 is controlled so that the distance from the imaging device 100 to the target 500 falls within the focus stabilization range based on the relationship between focus lens position and focusing distance. This prevents the situation in which the focus lens cannot be moved in time and the imaging device 100 therefore cannot track the target 500 while maintaining a proper in-focus state.
  • FIG. 11 illustrates an example of a computer 1200 that may fully or partially embody aspects of the present invention.
  • a program installed on the computer 1200 can cause the computer 1200 to function as operations associated with a device according to an embodiment of the present invention or as one or more "parts" of that device, or can cause the computer 1200 to perform those operations or the one or more "parts".
  • This program enables the computer 1200 to execute a process or a stage of the process according to an embodiment of the present invention.
  • Such a program may be executed by the CPU 1212 to cause the computer 1200 to perform specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
  • the computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210.
  • the computer 1200 also includes a communication interface 1222, an input / output unit, and they are connected to the host controller 1210 through an input / output controller 1220.
  • the computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
  • the communication interface 1222 communicates with other electronic devices through a network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program and the like executed by the computer 1200 at the time of operation, and / or a program that depends on the hardware of the computer 1200.
  • the program is provided through a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network.
  • the program is installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and is executed by the CPU 1212.
  • the information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above.
  • an apparatus or method may be constituted by realizing operations on, or processing of, information in accordance with the use of the computer 1200.
  • the CPU 1212 may execute a communication program loaded in the RAM 1214, and based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing.
  • the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory, sends the read transmission data to the network, and writes reception data received from the network into a reception buffer or the like provided in the recording medium.
  • the CPU 1212 can cause the RAM 1214 to read all or required parts of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data on the RAM 1214. The CPU 1212 can then write the processed data back to the external recording medium.
  • Various types of information can be stored in a recording medium and subjected to information processing.
  • the CPU 1212 can perform, on data read from the RAM 1214, various types of processing specified by the instruction sequences of the programs and described throughout this disclosure, including various operations, information processing, conditional judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214.
  • the CPU 1212 can retrieve information in files, databases, and the like in the recording medium. For example, when a plurality of entries each having an attribute value of a first attribute associated with an attribute value of a second attribute are stored in the recording medium, the CPU 1212 may retrieve, from the plurality of entries, an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
  • the programs or software modules described above may be stored on the computer 1200 or a computer-readable storage medium near the computer 1200.
  • a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, so that the program can be provided to the computer 1200 via the network.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Automatic Focus Adjustment (AREA)
  • Focusing (AREA)

Abstract

Even if a target and an imaging device move, the in-focus state of the target can be maintained. A control device may include: a deriving unit that derives the distance from the imaging device to a target photographed by the imaging device at a second time after a first time, based on the positional relationship between the imaging device and the target at the first time, the speed and moving direction of the imaging device at the first time, and the speed and moving direction of the target at the first time; and a first control unit that controls the position of a focus lens of the imaging device according to the distance at the second time.

Description

Control device, imaging device, moving body, control method, and program
[Technical Field]
The present invention relates to a control device, an imaging device, a moving body, a control method, and a program.
[Background Art]
Patent Document 1 discloses a method of controlling a lens focus driving unit based on the moving path and moving speed of a subject, the subject position expected at a certain time, and the distance to the subject, thereby performing focus driving of an imaging lens. Patent Document 1: Japanese Patent Application Laid-Open No. 10-142486
[Summary of the Invention]
[Technical Problem to Be Solved by the Invention]
The predictive ranging device described in Patent Document 1 does not take into account movement of the imaging device itself, only that of the subject.
[Technical Means for Solving the Problem]
A control device according to one aspect of the present invention may include a deriving unit that derives the distance from an imaging device to a target photographed by the imaging device at a second time after a first time, based on the positional relationship between the imaging device and the target at the first time, the speed and moving direction of the imaging device at the first time, and the speed and moving direction of the target at the first time. The control device may include a first control unit that controls the position of a focus lens of the imaging device according to the distance at the second time.
The deriving unit may determine, as the positional relationship, the distance between the imaging device and the target and the direction from the imaging device to the target.
The deriving unit may determine the position of the imaging device and the position of the target on a preset three-dimensional coordinate system at the first time based on the positional relationship. The deriving unit may determine the position of the imaging device and the position of the target on the coordinate system at the second time based on the position of the imaging device on the coordinate system at the first time, the position of the target on the coordinate system, the speed and moving direction of the imaging device at the first time, and the speed and moving direction of the target at the first time. The deriving unit may derive the distance at the second time based on the position of the imaging device and the position of the target at the second time.
The deriving unit may set the coordinate system based on the moving direction of the target at the first time.
The deriving unit may set the first axis of the coordinate system along the moving direction of the target.
The deriving unit may set the position of the target at the first time as the origin of the coordinate system.
When it is determined from an image captured by the imaging device that the target is a vehicle, the deriving unit may assume that the target does not move in the direction of the second axis of the coordinate system, which runs along the vertical direction of the vehicle, and determine the position of the imaging device and the position of the target on the coordinate system at the second time.
When it is determined from an image captured by the imaging device that the target is a vehicle and the vehicle is traveling on a straight road, the deriving unit may assume that the target moves neither in the direction of the second axis of the coordinate system along the vertical direction of the vehicle nor in the direction of the third axis of the coordinate system along the lateral direction of the vehicle, and determine the position of the imaging device and the position of the target on the coordinate system at the second time.
An imaging device according to one aspect of the present invention may include the above-described control device. The imaging device may include a focus lens. The imaging device may include an image sensor.
A moving body according to one aspect of the present invention may be a moving body that carries the imaging device and moves.
The control device may include a second control unit that controls the movement of the moving body so that the distance from the imaging device to the target is within a predetermined distance range, the predetermined distance range being one in which the amount of change in focus lens position per unit of focusing distance is less than or equal to a predetermined threshold.
The control device may include a determination unit that changes the position of the focus lens to obtain a plurality of distances to subjects in focus and the corresponding focus lens positions, derives the relationship between focus lens position and focusing distance based on the obtained distances and positions, and determines the predetermined distance range according to that relationship.
A control method according to one aspect of the present invention may include a step of deriving the distance from an imaging device to a target photographed by the imaging device at a second time after a first time, based on the positional relationship between the imaging device and the target at the first time, the speed and moving direction of the imaging device at the first time, and the speed and moving direction of the target at the first time. The control method may include a step of controlling the position of a focus lens of the imaging device according to the distance at the second time.
A program according to one aspect of the present invention may be a program for causing a computer to function as the above-described control device.
According to one aspect of the present invention, the in-focus state of a target can be maintained more reliably even if the target and the imaging device move.
The above summary of the invention does not enumerate all of the necessary features of the present invention. Sub-combinations of these feature groups may also constitute inventions.
[Brief Description of the Drawings]
FIG. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle and a remote operation device.
FIG. 2 is a diagram showing an example of the functional blocks of the unmanned aerial vehicle.
FIG. 3 is a diagram showing the unmanned aerial vehicle tracking a target.
FIG. 4 is a diagram showing an example of a coordinate system representing the positional relationship between the unmanned aerial vehicle and the target.
FIG. 5 is a diagram showing an example of a coordinate system representing the positional relationship between the unmanned aerial vehicle and the target.
FIG. 6 is a diagram showing an example of the relationship between focusing distance and focus lens position.
FIG. 7 is a diagram for describing a method of determining the focus stabilization range.
FIG. 8 is a diagram for describing a method of determining the focus stabilization range.
FIG. 9 is a diagram for describing a method of determining the focus stabilization range.
FIG. 10 is a flowchart showing an example of a procedure for determining the focus stabilization range.
FIG. 11 is a diagram showing an example of a hardware configuration.
[Description of Reference Numerals]
10 UAV
20 UAV body
30 UAV control unit
32 memory
36 communication interface
40 propulsion unit
41 GPS receiver
42 inertial measurement unit
43 magnetic compass
44 barometric altimeter
45 temperature sensor
46 humidity sensor
50 gimbal
60 imaging device
100 imaging device
102 imaging unit
110 imaging control unit
112 deriving unit
114 focus control unit
116 determination unit
120 image sensor
130 memory
200 lens unit
210 lens
212 lens driving unit
214 position sensor
220 lens control unit
222 memory
300 remote operation device
500 target
1200 computer
1210 host controller
1212 CPU
1214 RAM
1220 input/output controller
1222 communication interface
1230 ROM
[Detailed Description]
Hereinafter, the present invention will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Moreover, not all combinations of the features described in the embodiments are necessarily essential to the solution of the invention. It will be apparent to those of ordinary skill in the art that various changes or improvements can be made to the following embodiments. It is apparent from the description of the claims that embodiments with such changes or improvements are included within the technical scope of the present invention.
The claims, the specification, the drawings, and the abstract contain matter subject to copyright protection. The copyright holder will not object to reproduction of these documents as they appear in the files or records of the Patent Office. In all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "part" of a device having the role of performing an operation. Specific stages and "parts" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits, which can include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGA), and programmable logic arrays (PLA).
A computer-readable medium may include any tangible device capable of storing instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored thereon constitutes a product that includes instructions which can be executed to create means for performing the operations specified by the flowcharts or block diagrams. Examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of computer-readable media may include floppy disk (registered trademark), hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile disc (DVD), Blu-ray (registered trademark) disc, memory stick, integrated circuit card, and the like.
Computer-readable instructions may include either source code or object code described in any combination of one or more programming languages. Source or object code includes traditional procedural programming languages. Traditional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk, JAVA (registered trademark), and C++, and the "C" programming language or similar programming languages. The computer-readable instructions may be provided, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or other programmable data processing device. The processor or programmable circuit may execute the computer-readable instructions to create means for performing the operations specified by the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300. The UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100. The gimbal 50 and the imaging device 100 are an example of an imaging system. The UAV 10 is an example of a moving body. A moving body is a concept that includes a flying body moving in the air, a vehicle moving on the ground, a ship moving on the water, and the like. A flying body moving in the air is a concept that includes not only a UAV but also other aircraft, airships, helicopters, and the like moving in the air.
The UAV body 20 includes a plurality of rotors. The plurality of rotors are an example of a propulsion unit. The UAV body 20 flies the UAV 10 by controlling the rotation of the plurality of rotors. The UAV body 20 uses, for example, four rotors to fly the UAV 10. The number of rotors is not limited to four. The UAV 10 may also be a fixed-wing aircraft without rotors.
The imaging device 100 is an imaging camera that images an object included in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100. The gimbal 50 is an example of a support mechanism. For example, the gimbal 50 supports the imaging device 100 so that it can rotate about the pitch axis using an actuator. The gimbal 50 also supports the imaging device 100 so that it can rotate about the roll axis and the yaw axis, respectively, using actuators. The gimbal 50 can change the posture of the imaging device 100 by rotating it about at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of imaging devices 60 are sensing cameras that image the surroundings of the UAV 10 in order to control its flight. Two imaging devices 60 may be provided on the nose, i.e., the front, of the UAV 10, and two other imaging devices 60 may be provided on its bottom surface. The two imaging devices 60 on the front side may form a pair and function as a so-called stereo camera. The two imaging devices 60 on the bottom side may also form a pair and function as a stereo camera. The imaging devices 60 can detect the presence of an object included in their imaging range and measure the distance to the object. An imaging device 60 is an example of a measurement device for measuring an object present in the imaging direction of the imaging device 100. The measurement device may also be another sensor, such as an infrared sensor or an ultrasonic sensor, that measures an object present in the imaging direction of the imaging device 100. Three-dimensional spatial data around the UAV 10 can be generated based on the images captured by the plurality of imaging devices 60. The number of imaging devices 60 included in the UAV 10 is not limited to four; it suffices for the UAV 10 to include at least one imaging device 60. The UAV 10 may also include at least one imaging device 60 on each of the nose, tail, sides, bottom, and top of the UAV 10. The angle of view settable in the imaging devices 60 may be larger than that settable in the imaging device 100. The imaging devices 60 may have single-focus lenses or fisheye lenses.
The remote operation device 300 communicates with the UAV 10 to operate it remotely. The remote operation device 300 may communicate with the UAV 10 wirelessly. The remote operation device 300 transmits to the UAV 10 instruction information indicating various instructions related to the movement of the UAV 10, such as ascent, descent, acceleration, deceleration, forward, backward, and rotation. The instruction information includes, for example, instruction information for raising the altitude of the UAV 10. The instruction information may indicate the altitude at which the UAV 10 should be located. The UAV 10 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 300. The instruction information may include an ascent instruction for causing the UAV 10 to ascend. The UAV 10 ascends while it is receiving the ascent instruction. When the altitude of the UAV 10 has reached its upper limit, ascent of the UAV 10 may be restricted even if an ascent instruction is received.
FIG. 2 shows an example of the functional blocks of the UAV 10. The UAV 10 includes a UAV control unit 30, a memory 32, a communication interface 36, a propulsion unit 40, a GPS receiver 41, an inertial measurement unit 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, a gimbal 50, the imaging devices 60, and the imaging device 100.
The communication interface 36 communicates with other devices such as the remote operation device 300. The communication interface 36 may receive, from the remote operation device 300, instruction information including various instructions to the UAV control unit 30. The memory 32 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100. The memory 32 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory. The memory 32 may be provided inside the UAV body 20, or may be detachably provided on the UAV body 20.
The UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with programs stored in the memory 32. The UAV control unit 30 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The UAV control unit 30 controls the flight and imaging of the UAV 10 in accordance with instructions received from the remote operation device 300 via the communication interface 36. The propulsion unit 40 propels the UAV 10. The propulsion unit 40 includes a plurality of rotors and a plurality of drive motors that rotate the rotors. The propulsion unit 40 rotates the rotors via the drive motors in accordance with instructions from the UAV control unit 30 to fly the UAV 10.
The GPS receiver 41 receives a plurality of signals indicating the times transmitted from a plurality of GPS satellites. The GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, i.e., the position (latitude and longitude) of the UAV 10, based on the received signals. The IMU 42 detects the posture of the UAV 10. The IMU 42 detects, as the posture of the UAV 10, accelerations in the three axial directions of front-rear, left-right, and up-down, and angular velocities about the three axes of pitch, roll, and yaw. The magnetic compass 43 detects the heading of the nose of the UAV 10. The barometric altimeter 44 detects the flight altitude of the UAV 10. The barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure into an altitude to detect the altitude. The temperature sensor 45 detects the temperature around the UAV 10. The humidity sensor 46 detects the humidity around the UAV 10.
The imaging device 100 includes an imaging unit 102 and a lens unit 200. The lens unit 200 is an example of a lens device. The imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130. The image sensor 120 may be composed of a CCD or CMOS. The image sensor 120 captures an optical image formed through the plurality of lenses 210 and outputs the captured image data to the imaging control unit 110. The imaging control unit 110 may be composed of a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The imaging control unit 110 may control the imaging device 100 in accordance with operation instructions for the imaging device 100 from the UAV control unit 30. The memory 130 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory. The memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like. The memory 130 may be provided inside the casing of the imaging device 100, or may be detachably provided on the casing of the imaging device 100.
The lens unit 200 includes a plurality of lenses 210, a plurality of lens driving units 212, and a lens control unit 220. The plurality of lenses 210 may function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 210 are arranged so as to be movable along the optical axis. The lens unit 200 may be an interchangeable lens provided so as to be attachable to and detachable from the imaging unit 102. The lens driving units 212 move at least some or all of the plurality of lenses 210 along the optical axis via mechanism members such as cam rings. A lens driving unit 212 may include an actuator, and the actuator may include a stepping motor. The lens control unit 220 drives the lens driving units 212 in accordance with lens control instructions from the imaging unit 102 to move one or more of the lenses 210 along the optical axis via the mechanism members. The lens control instructions are, for example, zoom control instructions and focus control instructions.
The lens unit 200 further includes a memory 222 and a position sensor 214. The lens control unit 220 controls the movement of the lenses 210 in the optical axis direction via the lens driving units 212 in accordance with lens operation instructions from the imaging unit 102. Some or all of the lenses 210 move along the optical axis. The lens control unit 220 performs at least one of a zoom operation and a focus operation by moving at least one of the lenses 210 along the optical axis. The position sensor 214 detects the positions of the lenses 210. The position sensor 214 can detect the current zoom position or focus position.
A lens driving unit 212 may include a shake correction mechanism. The lens control unit 220 may perform shake correction by moving a lens 210 in a direction along the optical axis or in a direction perpendicular to the optical axis via the shake correction mechanism. The lens driving unit 212 may drive the shake correction mechanism with a stepping motor to perform shake correction. Alternatively, the shake correction mechanism may be driven by a stepping motor to move the image sensor 120 in a direction along the optical axis or in a direction perpendicular to the optical axis to perform shake correction.
The memory 222 stores control values of the plurality of lenses 210 moved via the lens driving units 212. The memory 222 may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
在如上所述构成的UAV 10中,在跟踪作为活动体的目标物的同时,预测目标物与摄像装置100之间的距离。摄像装置100基于所预测的距离来控制聚焦镜头的位置进而对焦在目标物上。
例如,UAV 10将在如图3所示的道路600上行驶的车辆作为目标物500进行跟踪,同时使摄像装置100拍摄目标物500。
摄像控制部110具有导出部112和对焦控制部114。导出部112基于摄像装置100与由摄像装置100拍摄的目标物500在第1时刻的位置关系、摄像装置100在第1时刻的速度和移动方向、以及目标物500在第1时刻的速度和移动方向,来导出在第1时刻之后的第2时刻从摄像装置100到目标物500的距离。导出部112基于摄像装置100与由摄像装置100拍摄的目标物500在当前时刻的位置关系、摄像装置100在当前时刻的速度和移动方向、以及目标物500在当前时刻的速度和移动方向,预测性地导出在当前时刻的预定时间(例如5秒后)之后的时刻从摄像装置100到目标物500的距离。对焦控制部114根据第2时刻的距离来控制摄像装置100的聚焦镜头的位置。 对焦控制部114在第2时刻控制聚焦镜头的位置,以使焦点对焦于由导出部112导出的距离。
导出部112将在第1时刻摄像装置100与目标物500之间的距离、以及在第1时刻从摄像装置100到目标物500的方向确定为位置关系。导出部112可以根据由对焦控制部114执行的、基于由摄像装置100拍摄的多个图像的对比度自动聚焦(对比度AF)或相位差AF的结果,来确定在第1时刻摄像装置100与目标物500之间的距离。导出部112根据由摄像装置100所包含的测距传感器所测量的距离的结果来确定从摄像装置100到目标物500的距离。导出部112可以根据基于由GPS接收器41接收到的多个信号而确定的UAV 10的位置信息、基于万向节50的驱动指令而确定的万向节50相对于UAV 10的方向、以及由摄像装置100所拍摄的图像,来确定在第1时刻从摄像装置100到目标物500的方向。
Derivation unit 112 may determine, as the positional relationship, the position information of UAV 10 from GPS receiver 41 and the altitude information from barometric altimeter 44, together with the position information and altitude information of target 500 from a GPS receiver included in target 500.
Derivation unit 112 may determine, based on the positional relationship, the position of imaging device 100 and the position of target 500 on a preset three-dimensional coordinate system at the first time. Derivation unit 112 may determine the position of imaging device 100 and the position of target 500 on the coordinate system at the second time, based on the position of imaging device 100 on the coordinate system at the first time, the position of target 500 on the coordinate system, the speed and movement direction of imaging device 100 at the first time, and the speed and movement direction of target 500 at the first time. Derivation unit 112 may then derive the distance at the second time based on the position of imaging device 100 and the position of the target at the second time.
Derivation unit 112 may set the coordinate system based on the movement direction of target 500 at the first time. Derivation unit 112 may set the first axis of the coordinate system along the movement direction of target 500, and may set the position of target 500 at the first time as the origin of the coordinate system.
For example, as shown in Fig. 4, derivation unit 112 may set the X axis of the coordinate system along the direction of movement vector 510 of target 500. Derivation unit 112 may determine movement vector 510 of target 500 based on the optical flow determined from a plurality of images captured by imaging device 100. Derivation unit 112 may set movement vector 520 of UAV 10 in the coordinate system, and may determine movement vector 520 of UAV 10 based on the operation commands for UAV 10 transmitted from remote operation device 300. Derivation unit 112 may also determine movement vector 510 of target 500 based on both the optical flow determined from the plurality of captured images and the operation commands for UAV 10 transmitted from remote operation device 300.
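As one illustration of the optical-flow step, the sketch below (an assumption, not the patent's code) uses OpenCV's sparse Lucas-Kanade tracker to estimate the target's mean pixel displacement between two grayscale frames. Here prev_gray, curr_gray, and the target's bounding box roi are hypothetical inputs; converting the resulting pixel-space vector into movement vector 510 in the world frame would additionally use the measured distance to the target and the camera intrinsics.

import cv2
import numpy as np

def target_motion_vector(prev_gray, curr_gray, roi):
    # Track corner features inside the target's bounding box only.
    x, y, w, h = roi
    mask = np.zeros_like(prev_gray)
    mask[y:y + h, x:x + w] = 255
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                  qualityLevel=0.01, minDistance=5, mask=mask)
    if pts is None:
        return np.zeros(2)
    # Lucas-Kanade sparse optical flow from the previous to the current frame.
    nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    good = status.ravel() == 1
    if not np.any(good):
        return np.zeros(2)
    flow = (nxt[good] - pts[good]).reshape(-1, 2)
    return flow.mean(axis=0)  # mean pixel displacement per frame interval

Combining such an image-based vector with the operation commands of UAV 10 would then separate the target's own motion from the apparent motion caused by the movement of UAV 10, as described above.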
Derivation unit 112 may determine Distance(0), which represents the distance between imaging device 100 and target 500 at the first time, from the coordinate point (x_o(0), y_o(0), z_o(0)) of target 500 on the coordinate system and the coordinate point (x_d(0), y_d(0), z_d(0)) of UAV 10 on the coordinate system, according to the following equation.
[Equation 1]
Distance(0) = \sqrt{(x_o(0) - x_d(0))^2 + (y_o(0) - y_d(0))^2 + (z_o(0) - z_d(0))^2}
For example, as shown in Fig. 5, derivation unit 112 may determine the coordinate point (x_o(1), y_o(1), z_o(1)) of target 500 and the coordinate point (x_d(1), y_d(1), z_d(1)) of UAV 10 on the coordinate system at the second time, from the coordinate point (x_o(0), y_o(0), z_o(0)) of target 500 on the coordinate system at the first time, the coordinate point (x_d(0), y_d(0), z_d(0)) of UAV 10 on the coordinate system, movement vector 510 of target 500 at the first time, and movement vector 520 of UAV 10. Derivation unit 112 may then determine Distance(1), which represents the distance between imaging device 100 and target 500 at the second time, from the coordinate point (x_o(1), y_o(1), z_o(1)) of target 500 and the coordinate point (x_d(1), y_d(1), z_d(1)) of UAV 10 on the coordinate system at the second time, according to the following equation.
[Equation 2]
Distance(1) = \sqrt{(x_o(1) - x_d(1))^2 + (y_o(1) - y_d(1))^2 + (z_o(1) - z_d(1))^2}
Derivation unit 112 may periodically determine the movement direction of target 500 and periodically update the coordinate system based on the movement direction of target 500. When updating the coordinate system, derivation unit 112 also updates the coordinate points of target 500 and UAV 10 on the coordinate system.
When it is determined from images captured by imaging device 100 that target 500 is a vehicle, derivation unit 112 may determine the position of imaging device 100 and the position of target 500 on the coordinate system at the second time, i.e., the coordinate point (x_o(1), y_o(1), z_o(1)) of target 500 and the coordinate point (x_d(1), y_d(1), z_d(1)) of UAV 10, under the assumption that target 500 does not move in the Z-axis direction of the coordinate system, which runs along the vertical direction of the vehicle. The vertical direction of the vehicle may be a direction perpendicular to the movement direction of the vehicle.
When it is determined from images captured by imaging device 100 that target 500 is a vehicle and that the vehicle is traveling on a straight road, derivation unit 112 may determine the position of imaging device 100 and the position of target 500 on the coordinate system at the second time under the assumptions that target 500 does not move in the Z-axis direction of the coordinate system along the vertical direction of the vehicle and that target 500 does not move in the Y-axis direction of the coordinate system along the lateral (left-right) direction of the vehicle.
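A minimal sketch of applying these vehicle assumptions before the position update (hypothetical names; axis convention as in Fig. 4, with X along the travel direction, Y lateral, and Z vertical):

import numpy as np

def constrain_vehicle_velocity(velocity, on_straight_road):
    # Zero out motion components that the vehicle assumption rules out.
    v = np.asarray(velocity, dtype=float).copy()
    v[2] = 0.0              # a vehicle is assumed not to move vertically (Z)
    if on_straight_road:
        v[1] = 0.0          # on a straight road, no lateral (Y) motion either
    return v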
Derivation unit 112 may determine that target 500 is a vehicle by pattern recognition using images captured by imaging device 100, or based on a target type preset by the user. Derivation unit 112 may determine whether the vehicle is traveling on a straight road by pattern recognition using images captured by imaging device 100, or based on the GPS information of the vehicle and map information obtained through communication interface 36.
Derivation unit 112 may determine the movement direction of target 500 by pattern recognition using images captured by imaging device 100, or based on the result of main-subject detection performed by focus control unit 114. Derivation unit 112 may determine the movement direction of target 500 based on the pan information of imaging device 100; it may determine the pan information of imaging device 100 from the drive commands of gimbal 50 and thereby determine the movement direction of target 500. Derivation unit 112 may also determine the movement direction of target 500 based on the distance to target 500 measured by a ranging sensor or a stereo camera, or determined through contrast AF or phase-difference AF by focus control unit 114, together with the movement direction of UAV 10.
As described above, according to UAV 10 of this embodiment, the distance to target 500, which is a moving object, is predicted during the flight of UAV 10, and the position of the focus lens is controlled according to the predicted distance. Imaging device 100 mounted on UAV 10 can thereby maintain the focused state on target 500.
Fig. 6 shows one example of the relationship between the focus lens position and the focus distance. The focus distance represents the distance to a subject for which a predetermined in-focus state is obtained at a given focus lens position, for example the distance to a subject whose contrast value is greater than or equal to a predetermined threshold. As shown in Fig. 6, when the distance from imaging device 100 to the subject is short, the ratio of the change in focus lens position to the change in distance increases. That is, when the distance between imaging device 100 and target 500 is short, the focus lens may not be moved in time, and imaging device 100 may be unable to track target 500 while maintaining a proper in-focus state.
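A thin-lens illustration (an assumption for intuition, not the patent's exact optics) makes this behavior explicit. With subject distance a, image distance b, and focal length f:

\frac{1}{a} + \frac{1}{b} = \frac{1}{f}
\quad\Rightarrow\quad
b = \frac{a f}{a - f},
\qquad
\frac{db}{da} = -\frac{f^2}{(a - f)^2}

The image-side travel needed per unit change in subject distance therefore grows rapidly as a shrinks toward f, which is why a short camera-to-target distance demands faster focus lens movement.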
Therefore, UAV control unit 30 may control the movement of UAV 10 so that the distance from imaging device 100 to target 500 falls within a focus stable range, that is, a preset distance range in which the amount of change in focus lens position per unit of focus distance is less than or equal to a predetermined threshold. The focus stable range may be determined in advance by experiment or simulation.
The focus stable range depends on the optical characteristics of the focus lens, that is, on the type of lens unit 200. If lens unit 200 mounted on imaging unit 102 is an interchangeable lens, the focus stable range varies with the type of interchangeable lens. Therefore, if lens unit 200 mounted on imaging unit 102 is an interchangeable lens, the focus lens can be driven before UAV 10 starts flying or before tracking of a target starts, the relationship between the focus lens position and the focus distance can be determined, and the focus stable range for the mounted interchangeable lens can be set.
Fig. 7 is a diagram for explaining a method of determining the focus stable range. In Fig. 7, f denotes the focal length; X1 denotes the distance from focal plane F to the object; a denotes the distance from the front principal plane to the object; X2 denotes the defocus amount; b denotes the distance from the rear principal plane to the image formed on image sensor 120; Hd denotes the distance between the front principal plane and the rear principal plane; and D denotes the distance from the object to the imaging surface of image sensor 120, that is, the subject distance.
According to Newton's lens equation, X1·X2 = f^2. With X1 = D - 2·f - Hd, this gives X2 = f^2/X1 = f^2/(D - 2·f - Hd). The defocus amount is determined from this equation.
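The step from the geometry of Fig. 7 to X1 = D - 2·f - Hd can be made explicit as follows (a reading of the figure that assumes the defocus amount X2 is negligible compared with D):

D = \underbrace{(X_1 + f)}_{a} + H_d + \underbrace{(f + X_2)}_{b}
\quad\Rightarrow\quad
X_1 = D - 2f - H_d - X_2 \approx D - 2f - H_d,
\qquad
X_2 = \frac{f^2}{X_1} = \frac{f^2}{D - 2f - H_d}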
Imaging control unit 110 may have determination unit 116 for determining the focus stable range. Determination unit 116 changes the position of the focus lens to obtain a plurality of distances to subjects in the in-focus state and the corresponding plurality of focus lens positions, derives the relationship between the focus lens position and the focus distance based on the obtained distances and focus lens positions, and determines from this relationship the focus stable range as the predetermined distance range.
Determination unit 116 may determine the focus lens position corresponding to each of a plurality of focus distances and, from the results, determine a curve showing the relationship between the focus lens position and the focus distance. For example, as shown in Fig. 8, determination unit 116 determines the focus lens positions at focus distances of 5 m and 20 m, respectively. From these two points, determination unit 116 may determine curve 700 showing the relationship between the focus lens position and the focus distance as shown in Fig. 9, and from curve 700 determine the focus stable range in which the amount of change in focus lens position per unit of focus distance is less than or equal to the predetermined threshold.
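The following sketch shows one way determination unit 116's computation could look under an assumed lens model P(D) = c0 + c1/D (focus lens position P versus focus distance D). The patent only requires some curve through the calibrated points, so the model, the units, and all names here are illustrative assumptions:

import numpy as np

def focus_stable_range(calib, threshold):
    # calib: (focus_distance_m, lens_position) pairs, e.g. measured at
    # 5 m and 20 m as in Fig. 8. Fit P(D) = c0 + c1/D through the points.
    D = np.array([d for d, _ in calib], dtype=float)
    P = np.array([p for _, p in calib], dtype=float)
    A = np.stack([np.ones_like(D), 1.0 / D], axis=1)
    (c0, c1), *_ = np.linalg.lstsq(A, P, rcond=None)
    # |dP/dD| = |c1| / D^2 <= threshold  <=>  D >= sqrt(|c1| / threshold),
    # so the focus stable range is all distances beyond this minimum.
    return float(np.sqrt(abs(c1) / threshold))

# Example using the two calibration distances of Fig. 8 (arbitrary
# lens-position units; the threshold is likewise illustrative).
d_min = focus_stable_range([(5.0, 120.0), (20.0, 30.0)], threshold=1.0)

Under this model the derivative |dP/dD| = |c1|/D^2 decreases with distance, so the range is characterized by a single minimum distance.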
Fig. 10 is a flowchart showing one example of a procedure for determining the focus stable range. When tracking a moving object, determination unit 116 determines whether lens unit 200 mounted on imaging unit 102 is an interchangeable lens whose focus stable range has already been registered (S100). When an unregistered interchangeable lens is mounted on imaging unit 102, determination unit 116 performs calibration during the flight of UAV 10 by obtaining the distance to the subject with the ranging sensor and determining the focus lens position corresponding to that distance (S102). Determination unit 116 determines the focus stable range based on the relationship between the focus lens position and the focus distance determined through the calibration (S104). Determination unit 116 notifies UAV control unit 30 of the registered or determined focus stable range. UAV control unit 30 controls the flight of UAV 10 so that the distance to the subject falls within the focus stable range (S106).
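The flow of Fig. 10 could be sketched as follows (all helper names are hypothetical; focus_stable_range reuses the sketch above):

def ensure_focus_stable_flight(lens_id, registry, camera, uav_ctrl, threshold):
    stable_min = registry.get(lens_id)                 # S100: registered lens?
    if stable_min is None:
        pairs = camera.calibrate_with_rangefinder()    # S102: in-flight calibration
        stable_min = focus_stable_range(pairs, threshold)  # S104: derive the range
        registry[lens_id] = stable_min
    # S106: fly so the subject distance stays inside the focus stable range.
    uav_ctrl.keep_subject_distance_at_least(stable_min)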
According to this embodiment, the focused state on the target can be maintained even when imaging device 100 moves in addition to target 500. Furthermore, the flight of UAV 10 is controlled so that the distance from imaging device 100 to target 500 falls within the focus stable range based on the relationship between the focus lens position and the focus distance. This prevents the situation in which imaging device 100 cannot track target 500 while maintaining a proper in-focus state because the focus lens cannot be moved in time.
Fig. 11 shows one example of a computer 1200 in which a plurality of aspects of the present invention may be embodied in whole or in part. A program installed on computer 1200 can cause computer 1200 to function as an operation associated with a device according to an embodiment of the present invention or as one or more "units" of that device, or can cause computer 1200 to execute that operation or those one or more "units". The program can cause computer 1200 to execute a process according to an embodiment of the present invention or the stages of that process. Such a program may be executed by CPU 1212 to cause computer 1200 to execute the specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
Computer 1200 of this embodiment includes CPU 1212 and RAM 1214, which are connected to each other by host controller 1210. Computer 1200 also includes communication interface 1222 and an input/output unit, which are connected to host controller 1210 via input/output controller 1220. Computer 1200 also includes ROM 1230. CPU 1212 operates in accordance with programs stored in ROM 1230 and RAM 1214, thereby controlling each unit.
Communication interface 1222 communicates with other electronic devices through a network. A hard disk drive may store the programs and data used by CPU 1212 in computer 1200. ROM 1230 stores a boot program or the like executed by computer 1200 at startup, and/or programs that depend on the hardware of computer 1200. The programs are provided through a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or through a network. The programs are installed in RAM 1214 or ROM 1230, which are also examples of computer-readable recording media, and are executed by CPU 1212. The information processing described in these programs is read by computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. A device or method may be constituted by realizing operations or processing of information in accordance with the use of computer 1200.
For example, when communication is performed between computer 1200 and an external device, CPU 1212 may execute a communication program loaded in RAM 1214 and, based on the processing described in the communication program, instruct communication interface 1222 to perform communication processing. Under the control of CPU 1212, communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as RAM 1214 or a USB memory and transmits the read transmission data to the network, or writes reception data received from the network into a reception buffer or the like provided in the recording medium.
In addition, CPU 1212 may cause all or a necessary portion of a file or database stored in an external recording medium such as a USB memory to be read into RAM 1214, and may perform various types of processing on the data in RAM 1214. CPU 1212 may then write the processed data back to the external recording medium.
Various types of information, such as various types of programs, data, tables, and databases, may be stored in a recording medium and subjected to information processing. For data read from RAM 1214, CPU 1212 may perform the various types of processing described throughout this disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, condition judgment, conditional branching, unconditional branching, and information retrieval/replacement, and may write the results back to RAM 1214. CPU 1212 may also retrieve information in files, databases, and the like in the recording medium. For example, when a plurality of entries each having an attribute value of a first attribute associated with an attribute value of a second attribute are stored in the recording medium, CPU 1212 may retrieve from the plurality of entries an entry matching a condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
The programs or software modules described above may be stored on computer 1200 or on a computer-readable storage medium near computer 1200. A recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can also be used as the computer-readable storage medium, so that the programs can be provided to computer 1200 via the network.
It should be noted that the order of execution of the operations, procedures, steps, stages, and other processing in the devices, systems, programs, and methods shown in the claims, specification, and drawings may be realized in any order, as long as "before", "prior to", or the like is not explicitly indicated and as long as the output of a preceding process is not used in a subsequent process. Even if the operation flows in the claims, specification, and drawings are described using "first", "next", and the like for convenience, this does not mean that they must be carried out in this order.
The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It is apparent to persons skilled in the art that various alterations or improvements can be made to the above embodiments. It is also apparent from the claims that modes to which such alterations or improvements are made can be included in the technical scope of the present invention.

Claims (14)

  1. A control device, characterized by comprising: a derivation unit that, based on a positional relationship between an imaging device and a target photographed by the imaging device at a first time, a speed and movement direction of the imaging device at the first time, and a speed and movement direction of the target at the first time, derives a distance from the imaging device to the target at a second time after the first time; and
    a first control unit that controls a position of a focus lens of the imaging device according to the distance at the second time.
  2. The control device according to claim 1, wherein the derivation unit determines, as the positional relationship, a distance between the imaging device and the target and a direction from the imaging device to the target.
  3. The control device according to claim 1, wherein
    the derivation unit determines, based on the positional relationship, a position of the imaging device on a preset three-dimensional coordinate system and a position of the target on the preset three-dimensional coordinate system at the first time;
    determines a position of the imaging device on the coordinate system and a position of the target on the coordinate system at the second time, based on the position of the imaging device on the coordinate system at the first time, the position of the target on the coordinate system, the speed and movement direction of the imaging device at the first time, and the speed and movement direction of the target at the first time; and
    derives the distance at the second time based on the position of the imaging device and the position of the target at the second time.
  4. The control device according to claim 3, wherein the derivation unit sets the coordinate system based on the movement direction of the target at the first time.
  5. The control device according to claim 4, wherein the derivation unit can set a first axis of the coordinate system along the movement direction of the target.
  6. The control device according to claim 4, wherein the derivation unit can set the position of the target at the first time as an origin of the coordinate system.
  7. The control device according to claim 5, wherein, when it is determined from an image captured by the imaging device that the target is a vehicle, the derivation unit can determine the position of the imaging device on the coordinate system and the position of the target on the coordinate system at the second time under an assumption that the target does not move in a second-axis direction of the coordinate system along a vertical direction of the vehicle.
  8. The control device according to claim 5, wherein, when it is determined from an image captured by the imaging device that the target is a vehicle and that the vehicle is traveling on a straight road, the derivation unit can determine the position of the imaging device on the coordinate system and the position of the target on the coordinate system at the second time under assumptions that the target does not move in a second-axis direction of the coordinate system along a vertical direction of the vehicle and that the target does not move in a third-axis direction of the coordinate system along a lateral direction of the vehicle.
  9. An imaging device, characterized by comprising: the control device according to any one of claims 1 to 8;
    the focus lens; and
    an image sensor.
  10. A mobile object, characterized in that it is equipped with the imaging device according to claim 9 and moves.
  11. The mobile object according to claim 10, wherein the control device comprises:
    a second control unit that controls movement of the mobile object so that the distance from the imaging device to the target falls within a predetermined distance range in which an amount of change in the position of the focus lens per unit of focus distance is less than or equal to a predetermined threshold.
  12. The mobile object according to claim 11, wherein the control device further comprises:
    a determination unit that changes the position of the focus lens to obtain a plurality of distances to subjects in an in-focus state and a plurality of positions of the focus lens, derives a relationship between the position of the focus lens and the focus distance based on the obtained plurality of distances and plurality of positions of the focus lens, and determines the predetermined distance range according to the relationship.
  13. A control method, characterized by comprising: a stage of deriving, based on a positional relationship between an imaging device and a target photographed by the imaging device at a first time, a speed and movement direction of the imaging device at the first time, and a speed and movement direction of the target at the first time, a distance from the imaging device to the target at a second time after the first time; and
    a stage of controlling a position of a focus lens of the imaging device according to the distance at the second time.
  14. A program for causing a computer to function as the control device according to any one of claims 1 to 8.
PCT/CN2019/108224 2018-09-27 2019-09-26 Control device, imaging apparatus, mobile object, control method and program WO2020063770A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980005027.0A CN111213369B (zh) 2018-09-27 2019-09-26 Control device and method, imaging apparatus, mobile object, and computer-readable storage medium
US17/198,233 US20210218879A1 (en) 2018-09-27 2021-03-10 Control device, imaging apparatus, mobile object, control method and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-181833 2018-09-27
JP2018181833A JP6696094B2 (ja) 2018-09-27 2018-09-27 Mobile object, control method, and program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/198,233 Continuation US20210218879A1 (en) 2018-09-27 2021-03-10 Control device, imaging apparatus, mobile object, control method and program

Publications (1)

Publication Number Publication Date
WO2020063770A1 WO2020063770A1 (zh)

Family

ID=69953362

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/108224 WO2020063770A1 (zh) 2018-09-27 2019-09-26 Control device, imaging apparatus, mobile object, control method and program

Country Status (4)

Country Link
US (1) US20210218879A1 (zh)
JP (1) JP6696094B2 (zh)
CN (1) CN111213369B (zh)
WO (1) WO2020063770A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080036902A1 (en) * 2006-08-11 2008-02-14 Canon Kabushiki Kaisha Image-pickup apparatus
CN101494735A * 2008-01-25 2009-07-29 Sony Corporation Imaging device, imaging device control method, and computer program
CN103795909A * 2012-10-29 2014-05-14 Hitachi, Ltd. Imaging optimization device, imaging device, and imaging optimization method
CN104683690A * 2013-11-29 2015-06-03 Axis AB System for following an object marked by a tag device with a camera
CN105376477A * 2014-08-08 2016-03-02 Casio Computer Co., Ltd. Detection device and detection method
CN105827961A * 2016-03-22 2016-08-03 Nubia Technology Co., Ltd. Mobile terminal and focusing method
WO2018008834A1 * 2016-07-06 2018-01-11 KSS ImageNext Co., Ltd. Vehicle camera control apparatus and method

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030213869A1 (en) * 2002-05-16 2003-11-20 Scott Mark Winfield Translational propulsion system for a hybrid vehicle
CN102034355A * 2010-12-28 2011-04-27 Ding Tian Vehicle detection and tracking method based on feature point matching
US9026134B2 * 2011-01-03 2015-05-05 Qualcomm Incorporated Target positioning within a mobile structure
CN105452926B * 2013-09-05 2018-06-22 Fujifilm Corporation Imaging device and focus control method
CN104469123B * 2013-09-17 2018-06-01 Lenovo (Beijing) Co., Ltd. Light supplementing method and image capture device
WO2016074169A1 * 2014-11-12 2016-05-19 SZ DJI Technology Co., Ltd. Method and device for detecting a target object, and robot
JP6596745B2 * 2015-10-20 2019-10-30 SZ DJI Technology Co., Ltd. System for imaging a target object
CN106357973A * 2016-08-26 2017-01-25 Shenzhen Gionee Communication Equipment Co., Ltd. Focusing method and terminal
JP6899500B2 * 2016-10-17 2021-07-07 Eams Robotics Co., Ltd. Mobile object capture device, mobile object capture method, and program
CN107507245A * 2017-08-18 2017-12-22 南京阿尔特交通科技有限公司 Dynamic collection method and system for vehicle-following trajectories

Also Published As

Publication number Publication date
US20210218879A1 (en) 2021-07-15
CN111213369A (zh) 2020-05-29
CN111213369B (zh) 2021-08-24
JP6696094B2 (ja) 2020-05-20
JP2020052255A (ja) 2020-04-02

Similar Documents

Publication Publication Date Title
CN108235815B (zh) Imaging control device, imaging device, imaging system, mobile object, imaging control method, and medium
WO2019120082A1 (zh) Control device, system, control method, and program
WO2019238044A1 (zh) Determination device, mobile object, determination method, and program
WO2020011230A1 (zh) Control device, mobile object, control method, and program
US20210014427A1 (en) Control device, imaging device, mobile object, control method and program
WO2019174343A1 (zh) Moving object detection device, control device, mobile object, moving object detection method, and program
CN110337609B (zh) Control device, lens device, imaging device, flying object, and control method
WO2019242616A1 (zh) Determination device, imaging system, mobile object, synthesis system, determination method, and program
JP6543875B2 (ja) Control device, imaging device, flying object, control method, and program
JP6515423B2 (ja) Control device, mobile object, control method, and program
US11066182B2 (en) Control apparatus, camera apparatus, flying object, control method and program
WO2020020042A1 (zh) Control device, mobile object, control method, and program
JP6641574B1 (ja) Determination device, mobile object, determination method, and program
CN110785997B (zh) Control device, imaging device, mobile object, and control method
WO2020063770A1 (zh) Control device, imaging apparatus, mobile object, control method, and program
WO2020011198A1 (zh) Control device, mobile object, control method, and program
WO2019242611A1 (zh) Control device, mobile object, control method, and program
JP6569157B1 (ja) Control device, imaging device, mobile object, control method, and program
WO2020125414A1 (zh) Control device, imaging device, imaging system, mobile object, control method, and program
WO2019085794A1 (zh) Control device, imaging device, flying object, control method, and program
JP2020052220A (ja) Control device, imaging device, mobile object, control method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19865715

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19865715

Country of ref document: EP

Kind code of ref document: A1