WO2021031840A1 - Device, imaging device, mobile body, method, and program - Google Patents

Device, imaging device, mobile body, method, and program

Info

Publication number
WO2021031840A1
WO2021031840A1 (PCT/CN2020/106737)
Authority
WO
WIPO (PCT)
Prior art keywords
imaging device
subject
distance
area
control
Prior art date
Application number
PCT/CN2020/106737
Other languages
English (en)
French (fr)
Inventor
永山佳范
本庄谦一
关范江
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN202080003363.4A priority Critical patent/CN112313943A/zh
Publication of WO2021031840A1 publication Critical patent/WO2021031840A1/zh
Priority to US17/524,637 priority patent/US20220070362A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/56Accessories
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene

Definitions

  • the invention relates to a control device, an imaging system, a control method, and a program.
  • Patent Document 1 describes measuring the distance to a target subject based on the reflected light of a light pulse.
  • in an imaging device that performs focus control based on a distance measured by a distance measuring sensor, focus control sometimes cannot be performed properly when another subject passes in front of the subject to be focused.
  • the control device may be a control device for an imaging system.
  • the imaging system includes a distance measuring sensor that measures the distance to a subject associated with each of a plurality of areas, and an imaging device that performs focus control based on the distance measured by the distance measuring sensor.
  • the control device may include a circuit configured to: after causing the imaging device to perform focus control on a first subject based on a first distance, measured by the distance measuring sensor, to the first subject associated with a first area of the plurality of areas, determine whether a second subject that is a moving body is present in the first area based on a plurality of images captured by the imaging device.
  • the circuit may be configured to: when it is determined that the second subject is not present in the first area, cause the imaging device to perform the focus control based on a second distance to the first subject associated with the first area, further measured by the distance measuring sensor.
  • the circuit may be configured to: when it is determined that the second subject is present in the first area, not cause the imaging device to perform focus control based on the second distance.
  • the circuit may be configured to derive an optical flow associated with the first area based on a plurality of images captured by the imaging device, and to determine whether the second subject is present in the first area based on the optical flow.
  • the circuit may be configured to determine whether the second subject is present in the first area based on at least one of luminance information, color information, edge information, and contrast information of each of the plurality of images captured by the imaging device.
  • the circuit may be configured to determine whether the second subject is present in the first area when the angle of view of the distance measuring sensor is smaller than the angle of view of the imaging device.
  • the imaging system may include a support mechanism that rotatably supports the imaging device.
  • when the angle of view of the distance measuring sensor is larger than the angle of view of the imaging device, the circuit may determine, based on a control command to the support mechanism, whether the support mechanism rotates the imaging device in a first direction to which the imaging direction of the imaging device is to be changed.
  • when it is determined that the support mechanism rotates the imaging device in the first direction, the circuit may determine, from among the plurality of areas measured by the distance measuring sensor, a third area on which the imaging device should focus after the support mechanism rotates the imaging device in the first direction by a first rotation amount in accordance with the control command.
  • the circuit may be configured to cause the imaging device to perform focus control on a third subject associated with the third area, based on a third distance to the third subject measured by the distance measuring sensor, while the imaging device is rotated in the first direction by the first rotation amount.
  • the third area may be an area outside the angle of view of the imaging device at a point in time before the imaging device is rotated in the first direction in accordance with the control command.
  • the imaging system may include the above-described control device, the distance measuring sensor, and the imaging device.
  • the control method may be a method of controlling an imaging system that includes a distance measuring sensor that measures the distance to a subject associated with each of a plurality of areas, and an imaging device that performs focus control based on the distance measured by the distance measuring sensor.
  • the control method may include: after causing the imaging device to perform focus control on a first subject based on a first distance, measured by the distance measuring sensor, to the first subject associated with a first area of the plurality of areas, determining whether a second subject that is a moving body is present in the first area based on a plurality of images captured by the imaging device.
  • the control method may include: when it is determined that the second subject is not present in the first area, causing the imaging device to perform the focus control based on a second distance to the first subject associated with the first area, further measured by the distance measuring sensor.
  • the program according to one aspect of the present invention may be a program for causing a computer to function as the above-described control device.
  • according to one aspect of the present invention, in an imaging device that performs focus control based on a distance measured by a distance measuring sensor, it is possible to prevent a situation in which focus control cannot be performed appropriately when another subject passes in front of the subject to be focused.
  • Fig. 1 is an example of an external perspective view of an imaging system.
  • Fig. 2 shows an example of an external perspective view of another form of the imaging system.
  • Fig. 3 is a schematic diagram of the functional blocks of the imaging system.
  • Fig. 4 is a schematic diagram of the color distribution and optical flow when a non-main subject crosses in front of the main subject.
  • Fig. 5 is a diagram showing an example of the relationship between the angle of view of the imaging device and the angle of view of the TOF sensor.
  • Fig. 6 is a diagram showing an example of the relationship between the angle of view of the imaging device and the angle of view of the TOF sensor.
  • Fig. 7A is a diagram showing an example of a procedure for performing focus control using the imaging system.
  • Fig. 7B is a diagram showing an example of a procedure for performing focus control using the imaging system.
  • Fig. 8 is a diagram showing an example of the appearance of an unmanned aerial vehicle and a remote operation device.
  • Fig. 9 is a diagram showing an example of a hardware configuration.
  • the blocks can represent (1) stages of a process in which operations are performed or (2) "units" of a device responsible for performing operations. Particular stages and "units" may be implemented by programmable circuits and/or processors.
  • Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • the programmable circuit may include a reconfigurable hardware circuit.
  • Reconfigurable hardware circuits may include logic AND, logic OR, logic XOR, logic NAND, logic NOR, and other logic operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGA), and programmable logic arrays (PLA).
  • the computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device.
  • as a result, a computer-readable medium having instructions stored thereon constitutes a product that includes instructions which can be executed to create means for performing the operations specified in the flowcharts or block diagrams.
  • examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • more specific examples of computer-readable media may include floppy (registered trademark) disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile disc (DVD), Blu-ray (registered trademark) discs, memory sticks, integrated circuit cards, and the like.
  • the computer-readable instructions may include either source code or object code written in any combination of one or more programming languages.
  • the source code or object code may be written in a conventional procedural programming language.
  • Conventional procedural programming languages may be assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, and the "C" programming language or similar programming languages.
  • the computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • the processor or programmable circuit can execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
  • FIG. 1 is an example of an external perspective view of an imaging system 10 according to this embodiment.
  • the imaging system 10 includes an imaging device 100, a support mechanism 200, and a grip 300.
  • the imaging device 100 includes a TOF sensor 160.
  • the support mechanism 200 uses actuators to rotatably support the imaging device 100 around the roll axis, the pitch axis, and the yaw axis.
  • the support mechanism 200 can change or maintain the posture of the imaging device 100 by rotating the imaging device 100 around at least one of the roll axis, the pitch axis, and the yaw axis.
  • the support mechanism 200 includes a roll axis drive mechanism 201, a pitch axis drive mechanism 202, and a yaw axis drive mechanism 203.
  • the support mechanism 200 also includes a base 204 to which the yaw axis drive mechanism 203 is fixed.
  • the grip 300 is fixed to the base 204.
  • the grip 300 includes an operation interface 301 and a display unit 302.
  • the imaging device 100 is fixed to the pitch axis driving mechanism 202.
  • the operation interface 301 receives commands for operating the imaging device 100 and the supporting mechanism 200 from the user.
  • the operation interface 301 may include a shutter/recording button for instructing to capture or record by the imaging device 100.
  • the operation interface 301 may include a power/function button for instructing the imaging system 10 to be powered on or off and for switching the imaging device 100 between a still image shooting mode and a moving image shooting mode.
  • the display unit 302 can display an image captured by the imaging device 100.
  • the display unit 302 can display a menu screen for operating the imaging device 100 and the support mechanism 200.
  • the display unit 302 may be a touch panel display that can receive commands for operating the imaging device 100 and the support mechanism 200.
  • Fig. 2 shows an example of an external perspective view of another form of the imaging system 10.
  • the imaging system 10 may be used with a mobile terminal having a display, such as a smartphone 400, fixed to the side of the grip 300.
  • the user holds the grip 300 and shoots still images or moving images with the imaging device 100.
  • the display of the smartphone 400 or the like displays the still images or moving images captured by the imaging device 100.
  • FIG. 3 shows a schematic diagram of functional blocks of the camera system 10.
  • the imaging device 100 includes an imaging control unit 110, an image sensor 120, a memory 130, a lens control unit 150, a lens driving unit 152, a plurality of lenses 154, and a TOF sensor 160.
  • the image sensor 120 may be composed of CCD or CMOS.
  • the image sensor 120 is an example of an image sensor used for shooting.
  • the image sensor 120 outputs image data of optical images formed by the plurality of lenses 154 to the imaging control unit 110.
  • the imaging control unit 110 may be composed of a microprocessor such as a CPU or an MPU, a micro-control device such as an MCU, or the like.
  • in accordance with an operation command given to the imaging device 100 via the grip 300, the imaging control unit 110 performs demosaicing processing on the image signal output from the image sensor 120 to generate image data.
  • the imaging control unit 110 stores image data in the memory 130.
  • the imaging control unit 110 controls the TOF sensor 160.
  • the imaging control unit 110 is an example of a circuit.
  • the TOF sensor 160 is a time-of-flight type sensor that measures the distance to an object.
  • the imaging device 100 adjusts the position of the focus lens according to the distance measured by the TOF sensor 160, thereby performing focus control.
  • the memory 130 may be a computer-readable storage medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • the memory 130 may be provided inside the housing of the imaging device 100.
  • the grip 300 may include other memory for storing image data captured by the imaging device 100.
  • the grip 300 may have a slot that allows the memory to be removed from the housing of the grip 300.
  • the multiple lenses 154 can function as a zoom lens, a variable focal length lens, and a focusing lens. At least a part or all of the plurality of lenses 154 are configured to be movable along the optical axis.
  • the lens control unit 150 drives the lens driving unit 152 in accordance with a lens control command from the imaging control unit 110 to move one or more lenses 154 in the optical axis direction.
  • the lens control commands are, for example, zoom control commands and focus control commands.
  • the lens driving unit 152 may include a voice coil motor (VCM) that moves at least some or all of the plurality of lenses 154 in the optical axis direction.
  • the lens driving unit 152 may include an electric motor such as a DC motor, a coreless motor, or an ultrasonic motor.
  • the lens driving unit 152 can transmit the power from the motor to at least a part or all of the plurality of lenses 154 via mechanism components such as cam rings and guide shafts, and move at least a part or all of the plurality of lenses 154 along the optical axis.
  • the imaging device 100 further includes a posture control unit 210, an angular velocity sensor 212, and an acceleration sensor 214.
  • the angular velocity sensor 212 detects the angular velocity of the imaging device 100.
  • the angular velocity sensor 212 detects the respective angular velocities of the imaging device 100 around the roll axis, the pitch axis, and the yaw axis.
  • the posture control unit 210 obtains angular velocity information related to the angular velocity of the imaging device 100 from the angular velocity sensor 212.
  • the angular velocity information may indicate the respective angular velocities of the imaging device 100 around the roll axis, the pitch axis, and the yaw axis.
  • the posture control unit 210 can acquire acceleration information related to the acceleration of the imaging device 100 from the acceleration sensor 214.
  • the acceleration information may also indicate the acceleration of the imaging device 100 in each direction of the roll axis, the pitch axis, and the yaw axis.
  • the angular velocity sensor 212 and the acceleration sensor 214 may be provided in a housing that accommodates the image sensor 120, the lenses 154, and the like. In this embodiment, a form in which the imaging device 100 and the support mechanism 200 are configured integrally is described. However, the support mechanism 200 may include a base to which the imaging device 100 is detachably fixed. In that case, the angular velocity sensor 212 and the acceleration sensor 214 may be provided outside the housing of the imaging device 100, for example, on the base.
  • the posture control unit 210 controls the support mechanism 200 according to the angular velocity information and acceleration information to maintain or change the posture of the imaging device 100.
  • the posture control unit 210 controls the support mechanism 200 according to the operation mode of the support mechanism 200 for controlling the posture of the imaging device 100 to maintain or change the posture of the imaging device 100.
  • the operation modes include a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 of the support mechanism 200 is operated so that a change in the posture of the imaging device 100 follows a change in the posture of the base 204 of the support mechanism 200.
  • the operation modes include a mode in which the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 of the support mechanism 200 are each operated so that a change in the posture of the imaging device 100 follows a change in the posture of the base 204 of the support mechanism 200.
  • the operation modes include a mode in which the pitch axis drive mechanism 202 and the yaw axis drive mechanism 203 of the support mechanism 200 are each operated so that a change in the posture of the imaging device 100 follows a change in the posture of the base 204 of the support mechanism 200.
  • the operation modes include a mode in which only the yaw axis drive mechanism 203 is operated so that a change in the posture of the imaging device 100 follows a change in the posture of the base 204 of the support mechanism 200.
  • the operation modes may include an FPV (First Person View) mode and a fixed mode.
  • in the FPV mode, the support mechanism 200 is operated so that a change in the posture of the imaging device 100 follows a change in the posture of the base 204 of the support mechanism 200.
  • in the fixed mode, the support mechanism 200 is operated to maintain the posture of the imaging device 100.
  • the FPV mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated so that a change in the posture of the imaging device 100 follows a change in the posture of the base 204 of the support mechanism 200.
  • the fixed mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated in order to maintain the current posture of the imaging device 100.
  • the TOF sensor 160 includes a light emitting unit 162, a light receiving unit 164, a light emitting control unit 166, a light receiving control unit 167, and a memory 168.
  • the TOF sensor 160 is an example of a distance measuring sensor.
  • the light emitting part 162 includes at least one light emitting element 163.
  • the light emitting element 163 is a device, such as an LED or a laser, that repeatedly emits pulsed light modulated at high speed.
  • the light emitting element 163 may emit infrared pulsed light.
  • the light emission control unit 166 controls the light emission of the light emitting element 163.
  • the light emission control unit 166 can control the pulse width of the pulsed light emitted from the light emitting element 163.
  • the light receiving unit 164 includes a plurality of light receiving elements 165 that measure the distance to the subject associated with each of the plurality of regions.
  • the light receiving unit 164 is an example of an image sensor used for distance measurement.
  • the plurality of light receiving elements 165 are respectively associated with a plurality of regions.
  • the light receiving element 165 repeatedly receives reflected light of pulsed light from the object.
  • the light receiving element 165 receives light including reflected light of pulsed light from the object, and outputs a signal related to the amount of received light.
  • the light receiving control unit 167 controls the light receiving element 165 to receive light.
  • the light receiving control unit 167 measures the distance to the subject associated with each of the plurality of areas based on the signal output from the light receiving element 165.
  • the light receiving control unit 167 measures the distance to the subject associated with each of the plurality of regions based on the amount of reflected light repeatedly received by the light receiving element 165 during the preset light receiving period.
  • the light receiving control unit 167 can measure the distance to the subject by determining the phase difference between the pulsed light and the reflected light according to the amount of reflected light repeatedly received by the light receiving element 165 in a preset light receiving period.
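As an illustration of the phase-difference calculation described above, the following minimal Python sketch converts a measured phase shift into a distance. The function name, modulation frequency, and phase value are illustrative assumptions, not details given in the patent.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def itof_distance(phase_rad: float, mod_freq_hz: float) -> float:
    """Distance from the phase difference between emitted and reflected light.

    For a continuous-wave time-of-flight measurement, d = c * phase / (4 * pi * f),
    which is unambiguous up to the wrap-around range c / (2 * f).
    """
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

# Example: 10 MHz modulation with a pi/2 phase lag corresponds to ~3.75 m.
print(itof_distance(math.pi / 2, 10e6))
```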
  • the light receiving unit 164 can also measure the distance to the subject by reading the frequency change of the reflected wave; this is called the FMCW (Frequency Modulated Continuous Wave) method.
  • the memory 168 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, and EEPROM.
  • the memory 168 stores a program required for the light emitting control unit 166 to control the light emitting unit 162, a program required for the light receiving control unit 167 to control the light receiving unit 164, and the like.
  • the auto focus (AF) method executed by the imaging device 100 will be described.
  • the imaging device 100 can move the focus lens according to the distance (subject distance) from the imaging device 100 to the subject measured by the TOF sensor 160, thereby controlling the positional relationship between the focus lens and the imaging surface of the image sensor 120.
  • the imaging device 100 may determine, according to the distance (subject distance) from the imaging device 100 to the subject measured by the TOF sensor 160, the target position of the focus lens for focusing on the subject, and move the focus lens to the target position, thereby performing focus control.
  • the imaging control unit 110 determines the target position of the focus lens focusing on the main subject based on distance information indicating the distance of the first region (ROI) including the main subject in the plurality of regions measured by the TOF sensor 160.
  • the imaging control unit 110 moves the focus lens to the target position.
  • the imaging control unit 110 performs focus control on the main subject.
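How a measured subject distance maps to a focus lens target position is not spelled out in the patent; a minimal sketch, assuming a simple thin-lens model, could look like this:

```python
def focus_target_position(subject_distance_m: float, focal_length_m: float) -> float:
    """Image distance v that focuses a subject at distance u, via 1/f = 1/u + 1/v.

    Assumes an idealized thin lens; a real lens would typically use a
    calibrated lookup table from subject distance to focus motor position.
    """
    u, f = subject_distance_m, focal_length_m
    if u <= f:
        raise ValueError("subject inside the focal length cannot be focused")
    return u * f / (u - f)

# Example: a 50 mm lens and a subject at 2 m put the image plane ~51.3 mm
# behind the lens, so the focus lens is driven toward that target position.
print(focus_target_position(2.0, 0.050) * 1000)
```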
  • in the imaging system 10 described above, there are cases where a moving body passes between the main subject and the imaging device 100.
  • in such cases, the distance measured by the TOF sensor 160 for the first area may be the distance to the moving body rather than the distance to the main subject.
  • if the imaging control unit 110 then performs focus control based on the distance information from the TOF sensor 160, the main subject may not be brought into focus.
  • therefore, in this embodiment, when a subject other than the main subject, i.e., a non-main subject, is present in the first area associated with the main subject among the plurality of areas measured by the TOF sensor 160, the imaging control unit 110 does not perform focus control based on the distance information about the first area measured by the TOF sensor 160. This prevents the imaging device 100 from focusing on the non-main subject instead of the main subject as a result of focus control based on the distance information measured by the TOF sensor 160.
  • after causing the imaging device 100 to perform focus control on the first subject based on the first distance, measured by the TOF sensor 160, to the first subject, i.e., the main subject associated with the first area of the plurality of areas, the imaging control unit 110 determines whether a second subject, i.e., a non-main subject that is a moving body, is present in the first area based on a plurality of images captured by the imaging device 100.
  • the first area may be divided into a plurality of blocks. When the second subject, i.e., a non-main subject, is present in at least one of the blocks, the imaging control unit 110 may determine that the second subject, i.e., a non-main subject that is a moving body, is present in the first area.
  • when the second subject is present in at least a preset number of the blocks, the imaging control unit 110 may likewise determine that the second subject is present in the first area.
  • the imaging control unit 110 may derive the optical flow associated with the first area based on a plurality of images captured by the imaging device 100, and determine whether the second subject exists in the first area based on the optical flow.
  • the imaging control unit 110 may divide each of the plurality of images into a plurality of blocks, and derive the movement vector for each block, thereby deriving the optical flow.
  • the imaging control unit 110 can derive the optical flow by deriving a movement vector for each pixel constituting each image.
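A minimal sketch of this block-wise optical-flow check, using OpenCV's dense Farneback flow, is shown below; the block count and motion threshold are illustrative assumptions, not values from the patent.

```python
import cv2
import numpy as np

def crossing_object_in_roi(prev_gray, curr_gray, roi, blocks=4, mag_thresh=2.0):
    """Detect a moving (second) subject inside the first area (ROI).

    prev_gray, curr_gray: consecutive frames as 2-D uint8 arrays.
    roi: (x, y, w, h) of the first area in image coordinates.
    """
    x, y, w, h = roi
    # Dense optical flow over the whole frame; result has shape (H, W, 2).
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow[y:y + h, x:x + w], axis=2)  # per-pixel motion
    bh, bw = h // blocks, w // blocks
    moving_blocks = sum(
        mag[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw].mean() > mag_thresh
        for r in range(blocks) for c in range(blocks))
    return moving_blocks >= 1  # "at least one block"; a preset count also works
```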
  • the imaging control unit 110 may determine whether the second subject is present in the first area based on at least one of luminance information, color information, edge information, and contrast information of each of the plurality of images captured by the imaging device 100. For example, the imaging control unit 110 may divide each of the images into a plurality of blocks, compare the luminance information, color information, edge information, or contrast information of corresponding blocks between images, and determine from their changes over time whether the second subject is present in the first area.
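The appearance-based variant could, for example, compare hue histograms of the ROI between consecutive frames; a drop in similarity suggests that another subject has entered the area. The histogram size and similarity threshold below are assumptions for illustration.

```python
import cv2

def roi_color_changed(prev_bgr, curr_bgr, roi, sim_thresh=0.8):
    """Flag a change in the color distribution of the first area (ROI)."""
    x, y, w, h = roi

    def hue_hist(img):
        hsv = cv2.cvtColor(img[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
        hist = cv2.calcHist([hsv], [0], None, [32], [0, 180])  # hue channel
        return cv2.normalize(hist, hist).flatten()

    similarity = cv2.compareHist(hue_hist(prev_bgr), hue_hist(curr_bgr),
                                 cv2.HISTCMP_CORREL)
    return similarity < sim_thresh  # low correlation -> distribution changed
```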
  • when it is determined that the second subject is not present in the first area, the imaging control unit 110 may cause the imaging device 100 to perform focus control based on the second distance to the first subject associated with the first area, further measured by the TOF sensor 160.
  • when it is determined that the second subject is present in the first area, the imaging control unit 110 may prevent the imaging device 100 from performing focus control based on the second distance.
  • when the angle of view of the TOF sensor 160 is smaller than the angle of view of the imaging device 100, the imaging control unit 110 may determine whether the second subject is present in the first area. When the angle of view of the imaging device 100 is larger than the angle of view of the TOF sensor 160, the imaging control unit 110 may determine, from the images captured by the imaging device 100, whether a non-main subject is present outside the angle of view of the TOF sensor 160.
  • Fig. 4 is a diagram showing the color distribution and optical flow in a case where a non-main subject crosses in front of the main subject.
  • at time t(0), the imaging control unit 110 performs focus control based on the distance measurement information of the TOF sensor 160 so as to focus on the main subject 410 within the region of interest (ROI) of the TOF sensor 160.
  • then, the non-main subject 412 enters the imaging area 401 of the imaging device 100 and moves horizontally from left to right within the imaging area 401.
  • at time t(1), the imaging control unit 110 detects the presence of the non-main subject 412 based on the optical flow.
  • at time t(2), the imaging control unit 110 determines based on the optical flow that the non-main subject 412 is moving horizontally from left to right.
  • at time t(3), the imaging control unit 110 detects based on the optical flow that the non-main subject is passing in front of the main subject. That is, the imaging control unit 110 detects based on the optical flow that the non-main subject 412 is present within the ROI of the TOF sensor 160.
  • at this point, the imaging control unit 110 does not perform focus control based on the distance information measured for the ROI of the TOF sensor 160.
  • the imaging control unit 110 may also detect the presence of the non-main subject 412, and its presence within the ROI of the TOF sensor 160, based on changes in the color distribution.
  • thereafter, at time t(4), the imaging control unit 110 detects based on the optical flow that the non-main subject 412 is no longer present within the ROI of the TOF sensor 160. The imaging control unit 110 then resumes focus control based on the distance information measured for the ROI of the TOF sensor 160.
  • however, there are also cases where the angle of view of the imaging device 100 is smaller than the angle of view of the TOF sensor 160.
  • in addition, there are cases where shooting is performed while the support mechanism 200 drives the imaging device 100 to rotate in the direction to which the imaging direction of the imaging device 100 is to be changed.
  • in such cases, by having the TOF sensor 160 measure the distance to a subject outside the angle of view of the imaging device 100 in advance, it becomes possible to focus on a subject outside the angle of view of the imaging device 100 in a short time.
  • for example, as shown in Figs. 5 and 6, the angle of view 422 of the TOF sensor 160 may be larger than the angle of view 420 of the imaging device 100. In this case, the support mechanism 200 drives the imaging device 100 to rotate in a first direction (pan or tilt direction) 450 to which the imaging direction of the imaging device 100 is to be changed, and the imaging device 100 focuses on and photographs a subject 430 that is outside the angle of view of the imaging device 100.
  • at this time, before the subject 430 enters the angle of view of the imaging device 100, the distance to the subject 430 is measured in advance by the TOF sensor 160.
  • then, when the subject 430 enters the angle of view of the imaging device 100, the imaging device 100 focuses on the subject 430 based on the distance information measured in advance by the TOF sensor 160.
  • more specifically, when the angle of view of the TOF sensor 160 is larger than the angle of view of the imaging device 100, the imaging control unit 110 may determine, based on a control command to the support mechanism 200, whether the support mechanism 200 rotates the imaging device 100 in the first direction to which the imaging direction of the imaging device is to be changed.
  • the imaging control unit 110 may determine, based on the control command to the support mechanism 200, whether the support mechanism 200 is to be rotated in the pan direction or the tilt direction.
  • when it is determined that the support mechanism 200 rotates the imaging device 100 in the first direction, the imaging control unit 110 determines, from among the plurality of areas measured by the TOF sensor 160, a third area on which the imaging device 100 should focus after the support mechanism 200 rotates the imaging device 100 in the first direction by the first rotation amount in accordance with the control command.
  • the third area may be an area outside the angle of view 420 of the imaging device 100 at a point in time before the imaging device 100 is rotated in the first direction in accordance with the control command.
  • the imaging control unit 110 may cause the imaging device 100 to perform focus control on the third subject associated with the third area, based on the third distance to the third subject measured by the TOF sensor 160, while the imaging device 100 is rotated in the first direction by the first rotation amount.
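The patent does not specify how the first direction and first rotation amount are mapped onto one of the TOF sensor's measurement areas; one plausible sketch, assuming equal-width zones across the sensor's horizontal field of view, is:

```python
def third_area_index(rotation_deg: float, tof_fov_deg: float = 70.0,
                     n_zones: int = 8) -> int:
    """Pick the TOF zone that the camera ROI will cover after panning.

    Assumes n_zones equal-width zones spanning the TOF horizontal field of
    view, with the camera initially aimed at the center; all values here
    are illustrative assumptions.
    """
    half = tof_fov_deg / 2.0
    angle = max(-half, min(half, rotation_deg))  # clamp to the TOF FOV
    return round((angle + half) / tof_fov_deg * (n_zones - 1))

# Panning +20 degrees with an 8-zone, 70-degree sensor selects zone 6,
# whose distance can be used to pre-focus while the gimbal is still turning.
print(third_area_index(20.0))
```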
  • FIGS. 7A and 7B are diagrams showing an example of a process of performing focus control using the imaging system 10.
  • the imaging control unit 110 acquires information on the angle of view of the TOF sensor 160 and the angle of view of the imaging device 100 (S100).
  • the imaging control unit 110 may obtain information about the angle of view of the TOF sensor 160 and the angle of view of the imaging device 100 stored in the memory 130 or the memory 168.
  • the imaging control unit 110 can acquire the angle-of-view information of the imaging device 100 from the zoom lens setting information via the lens control unit 150.
  • the imaging control unit 110 determines whether the angle of view of the TOF sensor 160 is not smaller than the angle of view of the imaging device 100 (S102). When the angle of view of the TOF sensor 160 is not smaller than the angle of view of the imaging device 100, the imaging control unit 110 acquires the control information of the support mechanism 200 via the posture control unit 210. The imaging control unit 110 determines whether the control information instructs the support mechanism 200 to rotate in the pan direction or the tilt direction (S104).
  • when the control information does not instruct the support mechanism 200 to rotate in the pan or tilt direction, the imaging control unit 110 performs focus control to focus on a preset object based on the distance information in the ROI obtained by the TOF sensor 160 (S106). When the control information instructs the support mechanism 200 to rotate in the pan or tilt direction, the imaging control unit 110 determines whether a preset object can be detected, within the angle of view of the TOF sensor 160, at the rotation destination of the imaging device 100 (S108). The imaging control unit 110 determines, based on the control information, the first direction and the first rotation amount by which the imaging direction of the imaging device 100 is rotated.
  • based on the first direction and the first rotation amount, the imaging control unit 110 determines the area of the TOF sensor 160 in which a preset area, such as the ROI of the imaging device 100, will be located when the imaging device 100 has been rotated in the first direction by the first rotation amount. When the distance information in the determined area of the TOF sensor 160 indicates a preset distance range, the imaging control unit 110 determines that the preset object can be detected at the rotation destination of the imaging device 100.
  • in this case, the imaging control unit 110 performs focus control based on the distance information in the determined area of the TOF sensor 160 while the imaging device 100 is rotated in the first direction by the first rotation amount (S110).
  • the imaging control unit 110 detects an object within the angle of view of the imaging device 100 (S112).
  • the imaging control unit 110 may detect an object satisfying a preset condition from the ROI of the imaging device 100 preset in the angle of view of the imaging device 100.
  • the imaging control unit 110 may detect an object satisfying preset conditions such as a face within the angle of view of the imaging device 100.
  • the imaging control unit 110 performs focus control on the detected object based on the distance information from the TOF sensor 160 or the distance information determined from the image of the imaging device 100 (S114).
  • the imaging control unit 110 sets the area where the detected object exists among the plurality of areas measured by the TOF sensor 160 as the ROI of the TOF sensor 160 (S116).
  • the imaging control unit 110 acquires the distance information of the set ROI of the TOF sensor 160 (S118). The imaging control unit 110 then determines whether or not there is a crossing object that traverses the ROI of the TOF sensor 160 (S120).
  • the imaging control unit 110 sets the ROI of the imaging device 100 in order to determine whether or not there is a crossing object.
  • the imaging control unit 110 divides the ROI into a plurality of regions, and acquires an optical flow for each region (S202). Furthermore, the imaging control unit 110 determines whether or not there is a traversing object that traverses the ROI of the TOF sensor 160 based on the optical flow of each region (S204).
  • when there is a crossing object that traverses the ROI of the TOF sensor 160, the imaging control unit 110 does not perform focus control based on the distance information acquired in step S118, but reacquires the distance information of the ROI of the TOF sensor 160.
  • when there is no crossing object that traverses the ROI of the TOF sensor 160, the imaging control unit 110 performs focus control based on the distance information acquired in step S118; that is, it controls the focus lens so as to focus on the object detected in step S112 (S122).
  • as described above, the imaging control unit 110 detects a change in the imaging direction of the imaging device 100 based on the control information of the support mechanism 200. The imaging control unit 110 then acquires in advance, from the TOF sensor 160, the distance information of the object to be photographed after the imaging direction of the imaging device 100 is changed, and causes the imaging device 100 to perform focus control based on that distance information before the change of the imaging direction of the imaging device 100 is completed. As a result, focus control can be performed on the subject quickly.
  • the imaging control unit 110 also detects, based on a plurality of images from the imaging device 100, a crossing object that traverses the ROI of the TOF sensor 160. When a crossing object is detected, the imaging control unit 110 does not perform focus control based on the distance information of the ROI measured by the TOF sensor 160 at that time. As a result, the imaging device 100 can be prevented from erroneously focusing on the crossing object based on the distance information from the TOF sensor 160.
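Pulling steps S100 through S122 together, a condensed control-flow sketch might look as follows. Every object and method name here is a hypothetical stand-in, since the patent describes behavior rather than a concrete API; crossing_object_in_roi can be any crossing test, such as the block-wise optical-flow check sketched earlier.

```python
def focus_control_flow(camera, tof, gimbal, frames, crossing_object_in_roi):
    """Hypothetical end-to-end sketch of the S100-S122 procedure."""
    if tof.angle_of_view() >= camera.angle_of_view():            # S102
        if gimbal.pan_or_tilt_commanded():                       # S104
            direction, amount = gimbal.commanded_rotation()
            zone = tof.zone_after_rotation(direction, amount)    # S108
            if tof.distance_in_preset_range(zone):
                camera.focus_at(tof.distance(zone))              # S110: pre-focus
    target = camera.detect_object()                              # S112
    camera.focus_on(target)                                      # S114
    roi = tof.set_roi(target)                                    # S116
    for prev_frame, curr_frame in zip(frames, frames[1:]):
        distance = tof.distance(roi)                             # S118
        if not crossing_object_in_roi(prev_frame, curr_frame, roi):  # S120
            camera.focus_at(distance)                            # S122
```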
  • the aforementioned imaging device 100 may be mounted on a mobile body.
  • the imaging device 100 may also be mounted on an unmanned aerial vehicle (UAV) as shown in Fig. 8.
  • the UAV 1000 may include a UAV main body 20, a gimbal 50, a plurality of imaging devices 60, and the imaging device 100.
  • the gimbal 50 and the imaging device 100 are an example of an imaging system.
  • the UAV 1000 is an example of a mobile body propelled by a propulsion section.
  • the concept of a mobile body includes, in addition to UAVs, flying bodies such as aircraft moving in the air, vehicles moving on the ground, ships moving on water, and the like.
  • the UAV main body 20 includes a plurality of rotors. The plurality of rotors are an example of a propulsion section.
  • the UAV main body 20 makes the UAV 1000 fly by controlling the rotation of the plurality of rotors.
  • the UAV main body 20 makes the UAV 1000 fly using, for example, four rotors.
  • the number of rotors is not limited to four.
  • the UAV 1000 may also be a fixed-wing aircraft without rotors.
  • the imaging device 100 is an imaging camera that captures a subject included in a desired imaging range.
  • the gimbal 50 rotatably supports the imaging device 100.
  • the gimbal 50 is an example of a support mechanism.
  • the gimbal 50 uses an actuator to rotatably support the imaging device 100 around the pitch axis.
  • the gimbal 50 uses actuators to further rotatably support the imaging device 100 around the roll axis and the yaw axis, respectively.
  • the gimbal 50 can change the posture of the imaging device 100 by rotating the imaging device 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensor cameras that capture images of the surroundings of the UAV 1000 in order to control the flight of the UAV 1000.
  • two imaging devices 60 may be installed on the nose, i.e., the front side, of the UAV 1000.
  • the other two imaging devices 60 may be installed on the bottom surface of the UAV 1000.
  • the two imaging devices 60 on the front side may be paired to function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side may also be paired to function as a stereo camera.
  • three-dimensional spatial data around the UAV 1000 can be generated based on the images captured by the plurality of imaging devices 60.
  • the number of imaging devices 60 included in the UAV 1000 is not limited to four. It is sufficient that the UAV 1000 includes at least one imaging device 60.
  • the UAV 1000 may also include at least one imaging device 60 on each of the nose, tail, sides, bottom surface, and top surface of the UAV 1000.
  • the angle of view that can be set on the imaging device 60 may be larger than the angle of view that can be set on the imaging device 100.
  • the imaging device 60 may have a fixed focal length lens or a fisheye lens.
  • the remote operation device 600 communicates with the UAV 1000 to operate the UAV 1000 remotely.
  • the remote operation device 600 can communicate wirelessly with the UAV 1000.
  • the remote operation device 600 transmits to the UAV 1000 instruction information indicating various commands related to the movement of the UAV 1000, such as ascending, descending, accelerating, decelerating, advancing, retreating, and rotating.
  • the instruction information includes, for example, instruction information for raising the altitude of the UAV 1000.
  • the instruction information can indicate the altitude at which the UAV 1000 should be located.
  • the UAV 1000 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 600.
  • the instruction information may include an ascent command instructing the UAV 1000 to ascend.
  • the UAV 1000 ascends while it is receiving the ascent command. When the altitude of the UAV 1000 has reached its upper limit, the UAV 1000 may be restricted from ascending even if the ascent command is accepted.
  • FIG. 9 shows an example of a computer 1200 that can fully or partially embody various aspects of the present invention.
  • the program installed on the computer 1200 can make the computer 1200 function as an operation associated with the device according to the embodiment of the present invention or one or more "parts" of the device. Alternatively, the program can cause the computer 1200 to perform the operation or the one or more "parts".
  • This program enables the computer 1200 to execute the process or stages of the process involved in the embodiment of the present invention.
  • Such a program may be executed by the CPU 1212, so that the computer 1200 executes specified operations associated with some or all blocks in the flowcharts and block diagrams described in this specification.
  • the computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210.
  • the computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through an input/output controller 1220.
  • the computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates in accordance with programs stored in the ROM 1230 and RAM 1214 to control each unit.
  • the communication interface 1222 communicates with other electronic devices through a network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 during operation, and/or a program that depends on the hardware of the computer 1200.
  • the program is provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network.
  • the program is installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and is executed by the CPU 1212.
  • the information processing described in these programs is read by the computer 1200 and causes cooperation between the programs and the various types of hardware resources described above.
  • an apparatus or method may be constituted by realizing operations or processing of information in accordance with the use of the computer 1200.
  • the CPU 1212 can execute a communication program loaded in the RAM 1214, and based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing.
  • the communication interface 1222, under the control of the CPU 1212, reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read data to the network, or writes data received from the network into a reception buffer provided in the recording medium.
  • the CPU 1212 can cause all or a necessary portion of a file or database stored in an external recording medium, such as a USB memory, to be read into the RAM 1214, and perform various types of processing on the data in the RAM 1214. The CPU 1212 can then write the processed data back to the external recording medium.
  • the CPU 1212 can perform, on data read from the RAM 1214, various types of processing specified by the instruction sequences of the programs described throughout this disclosure, including various types of operations, information processing, conditional judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214.
  • the CPU 1212 can search for information in files, databases, and the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve, from among the plurality of entries, an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the preset condition.
  • the programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200.
  • a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium so that the program can be provided to the computer 1200 via the network.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)
  • Accessories Of Cameras (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

A control device configured to: after causing an imaging device (100) to perform focus control on a first subject based on a first distance, measured by a distance measuring sensor (160), to the first subject associated with a first area (ROI) of a plurality of areas, determine, based on a plurality of images captured by the imaging device (100), whether a second subject that is a moving body is present in the first area (ROI); and, when it is determined that the second subject is not present in the first area, cause the imaging device (100) to perform focus control based on a second distance to the first subject associated with the first area (ROI), further measured by the distance measuring sensor (160).

Description

Device, imaging device, mobile body, method, and program [Technical Field]
The present invention relates to a control device, an imaging system, a control method, and a program.
[Background Art]
Patent Document 1 describes measuring the distance to a target subject based on the reflected light of a light pulse.
[Prior Art Documents]
[Patent Documents]
[Patent Document 1] Japanese Patent Application Publication No. 2006-79074
[Summary of the Invention]
[Technical Problem to Be Solved by the Invention]
In an imaging device that performs focus control based on a distance measured by a distance measuring sensor, focus control sometimes cannot be performed properly when another subject passes in front of the subject to be focused.
[Technical Means for Solving the Problem]
A control device according to one aspect of the present invention may be a control device for an imaging system, the imaging system including a distance measuring sensor that measures the distance to a subject associated with each of a plurality of areas, and an imaging device that performs focus control based on the distance measured by the distance measuring sensor. The control device may include a circuit configured to: after causing the imaging device to perform focus control on a first subject based on a first distance, measured by the distance measuring sensor, to the first subject associated with a first area of the plurality of areas, determine, based on a plurality of images captured by the imaging device, whether a second subject that is a moving body is present in the first area. The circuit may be configured to: when it is determined that the second subject is not present in the first area, cause the imaging device to perform the focus control based on a second distance to the first subject associated with the first area, further measured by the distance measuring sensor.
The circuit may be configured to: when it is determined that the second subject is present in the first area, not cause the imaging device to perform focus control based on the second distance.
The circuit may be configured to: derive an optical flow associated with the first area based on the plurality of images captured by the imaging device, and determine whether the second subject is present in the first area based on the optical flow.
The circuit may be configured to: determine whether the second subject is present in the first area based on at least one of luminance information, color information, edge information, and contrast information of each of the plurality of images captured by the imaging device.
The circuit may be configured to: determine whether the second subject is present in the first area when the angle of view of the distance measuring sensor is smaller than the angle of view of the imaging device.
The imaging system may include a support mechanism that rotatably supports the imaging device. When the angle of view of the distance measuring sensor is larger than the angle of view of the imaging device, the circuit may determine, based on a control command to the support mechanism, whether the support mechanism rotates the imaging device in a first direction to which the imaging direction of the imaging device is to be changed. When it is determined that the support mechanism rotates the imaging device in the first direction, the circuit may determine, from among the plurality of areas measured by the distance measuring sensor, a third area on which the imaging device should focus after the support mechanism rotates the imaging device in the first direction by a first rotation amount in accordance with the control command. The circuit may be configured to: cause the imaging device to perform focus control on a third subject associated with the third area, based on a third distance to the third subject measured by the distance measuring sensor, while the imaging device is rotated in the first direction by the first rotation amount.
The third area may be an area outside the angle of view of the imaging device at a point in time before the imaging device is rotated in the first direction in accordance with the control command.
An imaging system according to one aspect of the present invention may include the above-described control device, the distance measuring sensor, and the imaging device.
A control method according to one aspect of the present invention may be a method of controlling an imaging system that includes a distance measuring sensor that measures the distance to a subject associated with each of a plurality of areas, and an imaging device that performs focus control based on the distance measured by the distance measuring sensor. The control method may include: after causing the imaging device to perform focus control on a first subject based on a first distance, measured by the distance measuring sensor, to the first subject associated with a first area of the plurality of areas, determining, based on a plurality of images captured by the imaging device, whether a second subject that is a moving body is present in the first area. The control method may include: when it is determined that the second subject is not present in the first area, causing the imaging device to perform the focus control based on a second distance to the first subject associated with the first area, further measured by the distance measuring sensor.
A program according to one aspect of the present invention may be a program for causing a computer to function as the above-described control device.
According to one aspect of the present invention, in an imaging device that performs focus control based on a distance measured by a distance measuring sensor, it is possible to prevent a situation in which focus control cannot be performed appropriately when another subject passes in front of the subject to be focused.
The above summary of the present invention does not enumerate all of the necessary features of the present invention. Sub-combinations of these feature groups may also constitute inventions.
[Brief Description of the Drawings]
Fig. 1 is an example of an external perspective view of an imaging system.
Fig. 2 shows an example of an external perspective view of another form of the imaging system.
Fig. 3 is a schematic diagram of the functional blocks of the imaging system.
Fig. 4 is a schematic diagram of the color distribution and optical flow when a non-main subject crosses in front of the main subject.
Fig. 5 is a diagram showing an example of the relationship between the angle of view of the imaging device and the angle of view of the TOF sensor.
Fig. 6 is a diagram showing an example of the relationship between the angle of view of the imaging device and the angle of view of the TOF sensor.
Fig. 7A is a diagram showing an example of a procedure for performing focus control using the imaging system.
Fig. 7B is a diagram showing an example of a procedure for performing focus control using the imaging system.
Fig. 8 is a diagram showing an example of the appearance of an unmanned aerial vehicle and a remote operation device.
Fig. 9 is a diagram showing an example of a hardware configuration.
[Detailed Description]
The present invention will be described below through embodiments, but the following embodiments do not limit the invention according to the claims. Moreover, not all combinations of features described in the embodiments are necessarily essential to the solution of the invention. It will be apparent to those of ordinary skill in the art that various changes and improvements can be made to the following embodiments, and it is apparent from the description of the claims that forms incorporating such changes or improvements can also be included within the technical scope of the present invention.
The claims, specification, drawings, and abstract contain matters subject to copyright protection. The copyright holder raises no objection to the reproduction of these documents by anyone, as long as it is done as indicated in the files or records of the Patent Office. In all other cases, however, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "unit" of a device responsible for performing the operation. Particular stages and "units" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include logic AND, logic OR, logic XOR, logic NAND, logic NOR, and other logic operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGA), and programmable logic arrays (PLA).
A computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored thereon constitutes a product that includes instructions which can be executed to create means for performing the operations specified in the flowcharts or block diagrams. Examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of computer-readable media may include floppy (registered trademark) disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile disc (DVD), Blu-ray (registered trademark) discs, memory sticks, integrated circuit cards, and the like.
Computer-readable instructions may include either source code or object code written in any combination of one or more programming languages. The source code or object code may be written in a conventional procedural programming language, including assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, and state-setting data, in an object-oriented programming language such as Smalltalk (registered trademark), JAVA (registered trademark), or C++, or in the "C" programming language or a similar programming language. Computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet. The processor or programmable circuit may execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
Fig. 1 is an example of an external perspective view of the imaging system 10 according to the present embodiment. The imaging system 10 includes an imaging device 100, a support mechanism 200, and a grip 300. The imaging device 100 includes a TOF sensor 160. The support mechanism 200 uses actuators to rotatably support the imaging device 100 around the roll axis, the pitch axis, and the yaw axis, respectively. The support mechanism 200 can change or maintain the posture of the imaging device 100 by rotating the imaging device 100 around at least one of the roll axis, the pitch axis, and the yaw axis. The support mechanism 200 includes a roll axis drive mechanism 201, a pitch axis drive mechanism 202, and a yaw axis drive mechanism 203. The support mechanism 200 also includes a base 204 to which the yaw axis drive mechanism 203 is fixed. The grip 300 is fixed to the base 204. The grip 300 includes an operation interface 301 and a display unit 302. The imaging device 100 is fixed to the pitch axis drive mechanism 202.
The operation interface 301 receives commands for operating the imaging device 100 and the support mechanism 200 from the user. The operation interface 301 may include a shutter/record button for instructing the imaging device 100 to shoot or record. The operation interface 301 may include a power/function button for instructing the imaging system 10 to be powered on or off and for switching the imaging device 100 between a still image shooting mode and a moving image shooting mode.
The display unit 302 can display images captured by the imaging device 100. The display unit 302 can display a menu screen for operating the imaging device 100 and the support mechanism 200. The display unit 302 may be a touch panel display that can receive commands for operating the imaging device 100 and the support mechanism 200.
Fig. 2 shows an example of an external perspective view of another form of the imaging system 10. As shown in Fig. 2, the imaging system 10 may be used with a mobile terminal having a display, such as a smartphone 400, fixed to the side of the grip 300. The user holds the grip 300 and shoots still images or moving images with the imaging device 100. The display of the smartphone 400 or the like displays the still images or moving images captured by the imaging device 100.
Fig. 3 is a schematic diagram of the functional blocks of the imaging system 10. The imaging device 100 includes an imaging control unit 110, an image sensor 120, a memory 130, a lens control unit 150, a lens driving unit 152, a plurality of lenses 154, and a TOF sensor 160.
The image sensor 120 may be constituted by a CCD or a CMOS. The image sensor 120 is an example of an image sensor used for shooting. The image sensor 120 outputs image data of optical images formed through the plurality of lenses 154 to the imaging control unit 110. The imaging control unit 110 may be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
In accordance with an operation command given to the imaging device 100 via the grip 300, the imaging control unit 110 performs demosaicing processing on the image signal output from the image sensor 120 to generate image data. The imaging control unit 110 stores the image data in the memory 130. The imaging control unit 110 controls the TOF sensor 160. The imaging control unit 110 is an example of a circuit. The TOF sensor 160 is a time-of-flight sensor that measures the distance to an object. The imaging device 100 performs focus control by adjusting the position of the focus lens based on the distance measured by the TOF sensor 160.
The memory 130 may be a computer-readable storage medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory. The memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like. The memory 130 may be provided inside the housing of the imaging device 100. The grip 300 may include another memory for storing image data captured by the imaging device 100. The grip 300 may have a slot that allows the memory to be removed from the housing of the grip 300.
The plurality of lenses 154 can function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 154 are configured to be movable along the optical axis. The lens control unit 150 drives the lens driving unit 152 in accordance with lens control commands from the imaging control unit 110 to move one or more lenses 154 in the optical axis direction. The lens control commands are, for example, zoom control commands and focus control commands. The lens driving unit 152 may include a voice coil motor (VCM) that moves at least some or all of the plurality of lenses 154 in the optical axis direction. The lens driving unit 152 may include an electric motor such as a DC motor, a coreless motor, or an ultrasonic motor. The lens driving unit 152 can transmit power from the electric motor to at least some or all of the plurality of lenses 154 via mechanism components such as a cam ring and a guide shaft, to move them along the optical axis.
The imaging device 100 further includes a posture control unit 210, an angular velocity sensor 212, and an acceleration sensor 214. The angular velocity sensor 212 detects the angular velocity of the imaging device 100. The angular velocity sensor 212 detects the respective angular velocities of the imaging device 100 around the roll axis, the pitch axis, and the yaw axis. The posture control unit 210 acquires angular velocity information related to the angular velocity of the imaging device 100 from the angular velocity sensor 212. The angular velocity information may indicate the respective angular velocities of the imaging device 100 around the roll axis, the pitch axis, and the yaw axis. The posture control unit 210 may acquire acceleration information related to the acceleration of the imaging device 100 from the acceleration sensor 214. The acceleration information may indicate the acceleration of the imaging device 100 in each of the roll axis, pitch axis, and yaw axis directions.
The angular velocity sensor 212 and the acceleration sensor 214 may be provided in a housing that accommodates the image sensor 120, the lenses 154, and the like. In the present embodiment, a form in which the imaging device 100 and the support mechanism 200 are configured integrally is described. However, the support mechanism 200 may include a base to which the imaging device 100 is detachably fixed. In that case, the angular velocity sensor 212 and the acceleration sensor 214 may be provided outside the housing of the imaging device 100, for example, on the base.
The posture control unit 210 controls the support mechanism 200 based on the angular velocity information and the acceleration information to maintain or change the posture of the imaging device 100. The posture control unit 210 controls the support mechanism 200 in accordance with an operation mode of the support mechanism 200 for controlling the posture of the imaging device 100, to maintain or change the posture of the imaging device 100.
The operation modes include a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 of the support mechanism 200 is operated so that a change in the posture of the imaging device 100 follows a change in the posture of the base 204 of the support mechanism 200. The operation modes include a mode in which the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 of the support mechanism 200 are each operated so that a change in the posture of the imaging device 100 follows a change in the posture of the base 204 of the support mechanism 200. The operation modes include a mode in which the pitch axis drive mechanism 202 and the yaw axis drive mechanism 203 of the support mechanism 200 are each operated so that a change in the posture of the imaging device 100 follows a change in the posture of the base 204 of the support mechanism 200. The operation modes include a mode in which only the yaw axis drive mechanism 203 is operated so that a change in the posture of the imaging device 100 follows a change in the posture of the base 204 of the support mechanism 200.
The operation modes may include an FPV (First Person View) mode, in which the support mechanism 200 is operated so that a change in the posture of the imaging device 100 follows a change in the posture of the base 204 of the support mechanism 200, and a fixed mode, in which the support mechanism 200 is operated to maintain the posture of the imaging device 100.
The FPV mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated so that a change in the posture of the imaging device 100 follows a change in the posture of the base 204 of the support mechanism 200. The fixed mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated in order to maintain the current posture of the imaging device 100.
The TOF sensor 160 includes a light emitting unit 162, a light receiving unit 164, a light emission control unit 166, a light reception control unit 167, and a memory 168. The TOF sensor 160 is an example of a distance measuring sensor.
The light emitting unit 162 includes at least one light emitting element 163. The light emitting element 163 is a device, such as an LED or a laser, that repeatedly emits pulsed light modulated at high speed. The light emitting element 163 may emit infrared pulsed light. The light emission control unit 166 controls the light emission of the light emitting element 163. The light emission control unit 166 can control the pulse width of the pulsed light emitted from the light emitting element 163.
The light receiving unit 164 includes a plurality of light receiving elements 165 that measure the distance to the subject associated with each of a plurality of areas. The light receiving unit 164 is an example of an image sensor used for distance measurement. The plurality of light receiving elements 165 are each associated with one of the plurality of areas. The light receiving element 165 repeatedly receives the reflected light of the pulsed light from the object. The light receiving element 165 receives light including the reflected light of the pulsed light from the object and outputs a signal corresponding to the amount of received light. The light reception control unit 167 controls the light reception of the light receiving elements 165. The light reception control unit 167 measures the distance to the subject associated with each of the plurality of areas based on the signals output from the light receiving elements 165. The light reception control unit 167 measures the distance to the subject associated with each of the plurality of areas based on the amount of reflected light repeatedly received by the light receiving element 165 within a preset light reception period. The light reception control unit 167 can measure the distance to the subject by determining the phase difference between the pulsed light and its reflected light from the amount of reflected light repeatedly received by the light receiving element 165 within the preset light reception period. The light receiving unit 164 may also measure the distance to the subject by reading the frequency change of the reflected wave; this is called the FMCW (Frequency Modulated Continuous Wave) method.
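As a rough illustration of the FMCW principle mentioned above, the sketch below converts a measured beat frequency into a range; the chirp bandwidth, sweep time, and beat frequency are illustrative assumptions, not values from the patent.

```python
C = 299_792_458.0  # speed of light in m/s

def fmcw_distance(beat_hz: float, bandwidth_hz: float, sweep_s: float) -> float:
    """Range from the beat frequency of a linear FMCW chirp.

    A chirp of bandwidth B swept over time T gives a target at distance d
    a beat frequency f_b = 2 * d * B / (c * T), so d = c * f_b * T / (2 * B).
    """
    return C * beat_hz * sweep_s / (2.0 * bandwidth_hz)

# Example: a 1 GHz sweep over 1 ms with a 100 kHz beat corresponds to ~15 m.
print(fmcw_distance(100e3, 1e9, 1e-3))
```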
The memory 168 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, and EEPROM. The memory 168 stores the programs that the light-emission control unit 166 needs to control the light-emitting unit 162, the programs that the light-reception control unit 167 needs to control the light-receiving unit 164, and so on.
The autofocus (AF) method executed by the imaging device 100 will now be described. The imaging device 100 can move the focus lens according to the distance from the imaging device 100 to the subject (the subject distance) measured by the TOF sensor 160, thereby controlling the positional relationship between the focus lens and the imaging surface of the image sensor 120. More specifically, the imaging device 100 can determine, from the measured subject distance, the target position of the focus lens at which the subject is in focus, and can perform focus control by moving the focus lens to that target position.
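Under a thin-lens assumption, the mapping from subject distance to a focus target can be sketched as follows; a real lens drive typically uses calibrated lookup tables rather than this closed form, so the code is illustrative only:

    def focus_target_position(subject_distance_m: float, focal_length_m: float) -> float:
        # Image distance v from the thin-lens equation 1/f = 1/u + 1/v;
        # the focus group is driven so that the image plane lands on the sensor.
        u, f = subject_distance_m, focal_length_m
        if u <= f:
            raise ValueError("subject closer than the focal length; cannot focus")
        return u * f / (u - f)

For example, with a 50 mm focal length and a subject at 2 m, the required image distance comes out to about 51.3 mm.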
The imaging control unit 110 determines the target position of the focus lens for focusing on a main subject according to distance information, measured by the TOF sensor 160, that indicates the distance for a first area (ROI), the area among the plurality of areas that contains the main subject. The imaging control unit 110 then moves the focus lens to the target position, thereby performing focus control on the main subject.
In the imaging system 10 described above, a moving object sometimes passes between the main subject and the imaging device 100. In that case, the distance the TOF sensor 160 measures for the first area may be the distance to the moving object rather than to the main subject, and focus control based on the distance information from the TOF sensor 160 may then fail to bring the main subject into focus.
Therefore, in the present embodiment, when a subject other than the main subject, i.e. a non-main subject, is present in the first area associated with the main subject among the plurality of areas measured by the TOF sensor 160, the imaging control unit 110 does not perform focus control based on the distance information measured by the TOF sensor 160 for the first area. This prevents the imaging device 100 from focusing on the non-main subject instead of the main subject as a result of focus control based on the distance information measured by the TOF sensor 160.
After causing the imaging device 100 to perform focus control on a first subject, that is, the main subject associated with the first area among the plurality of areas, based on a first distance to that subject measured by the TOF sensor 160, the imaging control unit 110 determines, from a plurality of images captured by the imaging device 100, whether a second subject, a non-main subject that is a moving object, is present in the first area. The first area may be divided into a plurality of blocks. The imaging control unit 110 may determine that the second subject is present in the first area when it is present in at least one of the blocks, or when it is present in at least a preset number of the blocks.
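A minimal sketch of the block-based judgment described above; block_has_moving_subject stands in for any per-block detector (optical flow, histogram change, and so on) and is our assumption, not a component named in the patent:

    def second_subject_in_roi(blocks, block_has_moving_subject, min_blocks=1):
        # The second subject is deemed present in the first area (ROI) when at
        # least `min_blocks` of its blocks contain a moving non-main subject.
        hits = sum(1 for block in blocks if block_has_moving_subject(block))
        return hits >= min_blocks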
The imaging control unit 110 may derive an optical flow associated with the first area from the plurality of images captured by the imaging device 100 and determine from that optical flow whether the second subject is present in the first area. The imaging control unit 110 may derive the optical flow by dividing each image into a plurality of blocks and deriving a motion vector for each block, or by deriving a motion vector for each pixel of each image.
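One plausible realization of the per-block motion vectors, sketched with OpenCV's dense Farneback flow; the grid size and flow parameters are assumptions, not values from the patent:

    import cv2
    import numpy as np

    def block_flow(prev_gray: np.ndarray, cur_gray: np.ndarray, grid=(4, 4)):
        # Dense optical flow between two consecutive grayscale frames.
        flow = cv2.calcOpticalFlowFarneback(prev_gray, cur_gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        h, w = prev_gray.shape
        bh, bw = h // grid[0], w // grid[1]
        vectors = {}
        for i in range(grid[0]):
            for j in range(grid[1]):
                patch = flow[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
                vectors[(i, j)] = patch.reshape(-1, 2).mean(axis=0)  # mean (dx, dy)
        return vectors

A block whose mean motion vector is large and roughly horizontal while its neighbors are still is a natural candidate for a crossing object.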
The imaging control unit 110 may determine whether the second subject is present in the first area based on at least one of the luminance information, color information, edge information, and contrast information of each of the plurality of images captured by the imaging device 100. For example, the imaging control unit 110 may divide the images into a plurality of blocks, compare the luminance, color, edge, or contrast information of each block across the images, and determine whether the second subject is present in the first area from how that information changes over time.
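As an illustrative alternative to optical flow, per-block color histograms can be compared across frames; the bin counts and threshold below are assumptions:

    import cv2

    def block_changed(prev_bgr, cur_bgr, thresh=0.5) -> bool:
        # 8x8x8-bin BGR histograms for the same block in two frames.
        h1 = cv2.calcHist([prev_bgr], [0, 1, 2], None, [8, 8, 8], [0, 256] * 3)
        h2 = cv2.calcHist([cur_bgr], [0, 1, 2], None, [8, 8, 8], [0, 256] * 3)
        cv2.normalize(h1, h1)
        cv2.normalize(h2, h2)
        # Correlation near 1.0 means the block is unchanged; a sharp drop
        # suggests that a new subject has entered the block.
        return cv2.compareHist(h1, h2, cv2.HISTCMP_CORREL) < thresh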
When it determines that the second subject is not present in the first area, the imaging control unit 110 may cause the imaging device 100 to perform focus control based on a second distance to the first subject associated with the first area, measured anew by the TOF sensor 160. When it determines that the second subject is present in the first area, the imaging control unit 110 may cause the imaging device 100 not to perform focus control based on that second distance.
The imaging control unit 110 may perform this determination of whether the second subject is present in the first area when the angle of view of the TOF sensor 160 is smaller than the angle of view of the imaging device 100. When the angle of view of the imaging device 100 is larger than that of the TOF sensor 160, the imaging control unit 110 can determine, from the images captured by the imaging device 100, whether a non-main subject is present outside the angle of view of the TOF sensor 160.
FIG. 4 illustrates the color distribution and the optical flow when a non-main subject crosses in front of the main subject. At time t(0), the imaging control unit 110 performs focus control based on the distance information from the TOF sensor 160 so as to focus on the main subject 410 within the region of interest (ROI) of the TOF sensor 160. A non-main subject 412 then enters the imaging area 401 of the imaging device 100 and moves horizontally through it from left to right.
At time t(1), the imaging control unit 110 detects the presence of the non-main subject 412 from the optical flow.
Then, at time t(2), the imaging control unit 110 determines from the optical flow that the non-main subject 412 is moving horizontally from left to right. Next, at time t(3), the imaging control unit 110 detects from the optical flow that the non-main subject is passing in front of the main subject, that is, that the non-main subject 412 is present within the ROI of the TOF sensor 160. At this point, the imaging control unit 110 does not perform focus control based on the distance information measured for the ROI of the TOF sensor 160. The imaging control unit 110 may also detect the presence of the non-main subject 412, and its presence within the ROI of the TOF sensor 160, from changes in the color distribution.
Thereafter, at time t(4), the imaging control unit 110 detects from the optical flow that the non-main subject 412 is no longer within the ROI of the TOF sensor 160, and it resumes focus control based on the distance information measured for the ROI of the TOF sensor 160.
There are, however, also cases where the angle of view of the imaging device 100 is smaller than that of the TOF sensor 160, and cases where shooting is performed while the support mechanism 200 rotates the imaging device 100 in the direction into which its imaging direction is to be changed. In such cases, having the TOF sensor 160 measure the distance to a subject outside the angle of view of the imaging device 100 in advance makes it possible to focus on that subject in a short time.
For example, as shown in FIGS. 5 and 6, the angle of view 422 of the TOF sensor 160 may be larger than the angle of view 420 of the imaging device 100. In this case, the support mechanism 200 rotates the imaging device 100 in a first direction (a pan or tilt direction) 450 into which the imaging direction of the imaging device 100 is to be changed, so that the imaging device 100 will focus on and shoot a subject 430 that is outside its angle of view. Before the subject 430 enters the angle of view of the imaging device 100, the TOF sensor 160 measures the distance to the subject 430 in advance. When the subject 430 then enters the angle of view of the imaging device 100, the imaging device 100 focuses on the subject 430 based on the distance information measured in advance by the TOF sensor 160.
More specifically, when the angle of view of the TOF sensor 160 is larger than that of the imaging device 100, the imaging control unit 110 can determine, from a control command for the support mechanism 200, whether the support mechanism 200 is rotating the imaging device 100 in the first direction into which the imaging direction of the imaging device 100 is to be changed, that is, whether the support mechanism 200 is being rotated in the pan or tilt direction.
When it determines that the support mechanism 200 is rotating the imaging device 100 in the first direction, the imaging control unit 110 identifies, among the plurality of areas measured by the TOF sensor 160, a third area on which the imaging device 100 should focus once the support mechanism 200 has rotated the imaging device 100 in the first direction by a first rotation amount in accordance with the control command. The third area may be an area that lies outside the angle of view 420 of the imaging device 100 at the point in time before the imaging device 100 rotates in the first direction in accordance with the control command.
The imaging control unit 110 can cause the imaging device 100 to perform focus control on a third subject associated with the third area, based on a third distance to that subject measured by the TOF sensor 160, while the imaging device 100 is rotating in the first direction by the first rotation amount.
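How the third area might be selected can be sketched under the simplifying assumption of a single row of equally spaced TOF measurement zones; the zone geometry and every name here are ours, not the patent's:

    def third_area_index(pan_deg: float, zone_fov_deg: float,
                         center_index: int, num_zones: int) -> int:
        # Map the commanded pan angle to the measurement zone that the
        # camera ROI will point at once the rotation completes.
        idx = center_index + round(pan_deg / zone_fov_deg)
        if not 0 <= idx < num_zones:
            raise ValueError("target lies outside the TOF sensor's field of view")
        return idx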
FIGS. 7A and 7B show an example of the focus control procedure performed by the imaging system 10. The imaging control unit 110 acquires information on the angles of view of the TOF sensor 160 and the imaging device 100 (S100). It may read this information from the memory 130 or the memory 168, and it may obtain the angle of view of the imaging device 100 through the lens control unit 150 from the setting information of the zoom lens.
The imaging control unit 110 determines whether the angle of view of the TOF sensor 160 is equal to or larger than that of the imaging device 100 (S102). If it is, the imaging control unit 110 acquires the control information of the support mechanism 200 through the attitude control unit 210 and determines whether the control information instructs the support mechanism 200 to rotate in the pan or tilt direction (S104).
If the control information does not instruct the support mechanism 200 to rotate in the pan or tilt direction, the imaging control unit 110 performs focus control so as to focus on a preset object, based on the distance information within the ROI obtained by the TOF sensor 160 (S106). If the control information does instruct such a rotation, the imaging control unit 110 determines whether a preset object can be detected, within the angle of view of the TOF sensor 160, at the rotation destination of the imaging device 100 (S108). To do so, the imaging control unit 110 determines from the control information the first direction in which the imaging direction of the imaging device 100 will rotate and the first rotation amount, and identifies the area of the TOF sensor 160 in which a preset region, such as the ROI of the imaging device 100, will lie once the imaging device 100 has rotated in the first direction by the first rotation amount. When the distance information in the identified area of the TOF sensor 160 indicates a preset distance range, the imaging control unit 110 determines that the preset object can be detected at the rotation destination of the imaging device 100.
While the imaging device 100 rotates in the first direction by the first rotation amount, the imaging control unit 110 performs focus control based on the distance information in the identified area of the TOF sensor 160 (S110).
When the angle of view of the imaging device 100 is larger than that of the TOF sensor 160, the imaging control unit 110 detects an object within the angle of view of the imaging device 100 (S112). The imaging control unit 110 may detect an object satisfying a preset condition, such as a face, within a preset ROI of the imaging device 100 inside its angle of view.
The imaging control unit 110 performs focus control on the detected object based on the distance information from the TOF sensor 160 or on distance information determined from the images of the imaging device 100 (S114). The imaging control unit 110 sets the area in which the detected object is present, among the plurality of areas measured by the TOF sensor 160, as the ROI of the TOF sensor 160 (S116).
The imaging control unit 110 acquires the distance information of the ROI of the TOF sensor 160 that it has set (S118) and determines whether there is a crossing object traversing the ROI of the TOF sensor 160 (S120).
As shown in FIG. 7B, to determine whether a crossing object is present, the imaging control unit 110 sets the ROI of the imaging device 100, divides it into a plurality of regions, and obtains an optical flow for each region (S202). The imaging control unit 110 then determines, from the optical flow of each region, whether there is a crossing object traversing the ROI of the TOF sensor 160 (S204).
If there is a crossing object traversing the ROI of the TOF sensor 160, the imaging control unit 110 does not perform focus control based on the distance information acquired in step S118, but re-acquires the distance information of the ROI of the TOF sensor 160.
If there is no crossing object traversing the ROI of the TOF sensor 160, the imaging control unit 110 performs focus control based on the distance information acquired in step S118, that is, it controls the focus lens so as to focus on the object detected in step S112 (S122).
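Steps S118 to S122 together form a simple gate, sketched below; each callable is a hypothetical stand-in for the corresponding part of the imaging system 10:

    def focus_step(read_roi_distance, crossing_object_in_roi, drive_focus) -> bool:
        distance = read_roi_distance()   # S118: ROI distance from the TOF sensor
        if crossing_object_in_roi():     # S120: optical-flow check (S202/S204)
            return False                 # discard this distance; re-measure next cycle
        drive_focus(distance)            # S122: focus on the object from S112
        return True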
As described above, according to the present embodiment, the imaging control unit 110 detects a change in the imaging direction of the imaging device 100 from the control information of the support mechanism 200. The imaging control unit 110 then acquires in advance, from the TOF sensor 160, the distance information of the object to be shot after the imaging direction changes, and based on that distance information has the imaging device 100 perform focus control before the change of imaging direction is complete. This allows focus control to be performed on the object quickly.
In addition, the imaging control unit 110 detects, from a plurality of images of the imaging device 100, a crossing object traversing the ROI of the TOF sensor 160. When a crossing object is detected, the imaging control unit 110 does not perform focus control based on the ROI distance information detected by the TOF sensor 160 at that time. This prevents the imaging device 100 from erroneously focusing on the crossing object on the basis of the distance information from the TOF sensor 160.
The imaging device 100 described above may be mounted on a movable object. It may also be mounted on an unmanned aerial vehicle (UAV) as shown in FIG. 8. The UAV 1000 may include a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and the imaging device 100. The gimbal 50 and the imaging device 100 are an example of an imaging system, and the UAV 1000 is an example of a movable object propelled by a propulsion unit. The concept of a movable object includes, in addition to UAVs, flying bodies such as aircraft that move through the air, vehicles that move on the ground, ships that move on water, and the like.
The UAV body 20 includes a plurality of rotors, which are an example of a propulsion unit. The UAV body 20 makes the UAV 1000 fly by controlling the rotation of the rotors, for example using four rotors. The number of rotors is not limited to four, and the UAV 1000 may also be a fixed-wing aircraft without rotors.
The imaging device 100 is an imaging camera that captures a subject included in a desired imaging range. The gimbal 50 supports the imaging device 100 rotatably and is an example of a support mechanism. For example, the gimbal 50 uses an actuator to support the imaging device 100 rotatably about the pitch axis, and uses further actuators to support it rotatably about each of the roll axis and the yaw axis. The gimbal 50 can change the attitude of the imaging device 100 by rotating it about at least one of the yaw, pitch, and roll axes.
The plurality of imaging devices 60 are sensing cameras that capture the surroundings of the UAV 1000 in order to control its flight. Two imaging devices 60 may be provided on the nose, that is, the front, of the UAV 1000, and another two on its underside. The two imaging devices 60 on the front side may be paired to function as a so-called stereo camera, and so may the two on the underside. Three-dimensional spatial data of the surroundings of the UAV 1000 can be generated from the images captured by the plurality of imaging devices 60. The number of imaging devices 60 included in the UAV 1000 is not limited to four; it suffices for the UAV 1000 to include at least one. The UAV 1000 may also include at least one imaging device 60 on each of its nose, tail, sides, underside, and top. The angle of view settable on the imaging devices 60 may be larger than that settable on the imaging device 100, and the imaging devices 60 may have a single-focus lens or a fisheye lens.
A remote operation device 600 communicates with the UAV 1000 to operate it remotely, and may communicate with it wirelessly. The remote operation device 600 transmits to the UAV 1000 instruction information indicating various commands related to the movement of the UAV 1000, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating. The instruction information includes, for example, instruction information for raising the altitude of the UAV 1000; it may indicate the altitude at which the UAV 1000 should be located, and the UAV 1000 then moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 600. The instruction information may include an ascend command that causes the UAV 1000 to ascend, in which case the UAV 1000 ascends while it is receiving the command. When the altitude of the UAV 1000 has reached its upper limit, the UAV 1000 may restrict its ascent even while receiving an ascend command.
FIG. 9 shows an example of a computer 1200 in which aspects of the present invention may be embodied in whole or in part. A program installed on the computer 1200 can cause the computer 1200 to function as the operations associated with a device according to the embodiments of the present invention or as one or more "units" of that device, or to execute those operations or those "units", and can cause the computer 1200 to execute a process or the stages of a process according to the embodiments. Such a program may be executed by a CPU 1212 to cause the computer 1200 to execute the specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
The computer 1200 of the present embodiment includes the CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through an input/output controller 1220, as well as a ROM 1230. The CPU 1212 operates according to the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices through a network. A hard disk drive may store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program or the like executed by the computer 1200 at startup, and/or programs that depend on the hardware of the computer 1200. Programs are provided through a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or through a network. The programs are installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and are executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. A device or method may be constituted by realizing the operation or processing of information through the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 can execute a communication program loaded in the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing. Under the control of the CPU 1212, the communication interface 1222 reads the transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory, transmits the read data to the network, and writes data received from the network into a reception buffer or the like provided in the recording medium.
In addition, the CPU 1212 can cause the RAM 1214 to read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 can then write the processed data back to the external recording medium.
Various types of information, such as various types of programs, data, tables, and databases, can be stored in a recording medium and subjected to information processing. On the data read from the RAM 1214, the CPU 1212 can execute the various types of processing described throughout this disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, conditional judgments, conditional branching, unconditional branching, information retrieval/replacement, and the like, and write the results back to the RAM 1214. The CPU 1212 can also retrieve information in files, databases, and the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 can retrieve from these entries an entry matching a condition that specifies the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute that satisfies the preset condition.
The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. A recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can also be used as a computer-readable storage medium, so that the programs can be provided to the computer 1200 via the network.
The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It is apparent to those skilled in the art that various changes or improvements can be made to the above embodiments, and it is apparent from the description of the claims that embodiments incorporating such changes or improvements can be included within the technical scope of the present invention.
It should be noted that the order of execution of the operations, procedures, steps, stages, and other processing in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings may be realized in any order, as long as "before", "prior to", or the like is not explicitly indicated and the output of an earlier process is not used in a later process. Even where the operational flows in the claims, the specification, and the drawings are described using "first", "next", and the like for convenience, this does not mean that they must be carried out in that order.
[Reference Numerals]
10 imaging system
20 UAV body
50 gimbal
60 imaging device
100 imaging device
110 imaging control unit
120 image sensor
130 memory
150 lens control unit
152 lens drive unit
154 lens
160 TOF sensor
162 light-emitting unit
163 light-emitting element
164 light-receiving unit
165 light-receiving element
166 light-emission control unit
167 light-reception control unit
168 memory
200 support mechanism
201 roll-axis drive mechanism
202 pitch-axis drive mechanism
203 yaw-axis drive mechanism
204 base
210 attitude control unit
212 angular velocity sensor
214 acceleration sensor
300 grip
301 operation interface
302 display
400 smartphone
600 remote operation device
1200 computer
1210 host controller
1212 CPU
1214 RAM
1220 input/output controller
1222 communication interface
1230 ROM

Claims (10)

  1. A control device for controlling an imaging system, the imaging system including a distance-measuring sensor that measures the distance to the subject associated with each of a plurality of areas, and an imaging device that performs focus control based on the distances measured by the distance-measuring sensor, the control device comprising:
    a circuit configured to: after causing the imaging device to perform focus control on a first subject associated with a first area among the plurality of areas, based on a first distance to the first subject measured by the distance-measuring sensor, determine, from a plurality of images captured by the imaging device, whether a second subject that is a moving object is present in the first area; and
    when it is determined that the second subject is not present in the first area, cause the imaging device to perform the focus control based on a second distance to the first subject associated with the first area, further measured by the distance-measuring sensor.
  2. The control device according to claim 1, wherein the circuit is configured not to cause the imaging device to perform the focus control based on the second distance when it is determined that the second subject is present in the first area.
  3. The control device according to claim 1, wherein the circuit is configured to derive an optical flow associated with the first area from the plurality of images captured by the imaging device, and to determine from the optical flow whether the second subject is present in the first area.
  4. The control device according to claim 1, wherein the circuit is configured to determine whether the second subject is present in the first area based on at least one of luminance information, color information, edge information, and contrast information of each of the plurality of images captured by the imaging device.
  5. The control device according to claim 1, wherein the circuit is configured to determine whether the second subject is present in the first area when the angle of view of the distance-measuring sensor is smaller than the angle of view of the imaging device.
  6. The control device according to claim 1, wherein the imaging system further includes a support mechanism that supports the imaging device so that the imaging device can be rotated,
    and the circuit is configured to:
    when the angle of view of the distance-measuring sensor is larger than the angle of view of the imaging device, determine, from a control command for the support mechanism, whether the support mechanism is rotating the imaging device in a first direction into which the imaging direction of the imaging device is to be changed;
    when it is determined that the support mechanism is rotating the imaging device in the first direction, identify, among the plurality of areas measured by the distance-measuring sensor, a third area on which the imaging device should focus after the support mechanism has rotated the imaging device in the first direction by a first rotation amount in accordance with the control command; and
    cause the imaging device to perform focus control on a third subject associated with the third area, based on a third distance to the third subject measured by the distance-measuring sensor, while the imaging device rotates in the first direction by the first rotation amount.
  7. The control device according to claim 6, wherein the third area is an area outside the angle of view of the imaging device at the point in time before the imaging device rotates in the first direction in accordance with the control command.
  8. An imaging system, comprising: the control device according to any one of claims 1 to 7;
    the distance-measuring sensor; and
    the imaging device.
  9. A control method for controlling an imaging system, the imaging system including a distance-measuring sensor that measures the distance to the subject associated with each of a plurality of areas, and an imaging device that performs focus control based on the distances measured by the distance-measuring sensor, the control method comprising:
    after causing the imaging device to perform focus control on a first subject associated with a first area among the plurality of areas, based on a first distance to the first subject measured by the distance-measuring sensor, determining, from a plurality of images captured by the imaging device, whether a second subject that is a moving object is present in the first area; and
    when it is determined that the second subject is not present in the first area, causing the imaging device to perform the focus control based on a second distance to the first subject associated with the first area, further measured by the distance-measuring sensor.
  10. A program for causing a computer to function as the control device according to any one of claims 1 to 7.
PCT/CN2020/106737 2019-08-20 2020-08-04 Device, imaging device, movable object, method, and program WO2021031840A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080003363.4A 2019-08-20 2020-08-04 Device, imaging device, movable object, method, and program (CN112313943A)
US17/524,637 2019-08-20 2021-11-11 Control apparatuses, photographing apparatuses, movable objects, control methods, and programs (US20220070362A1)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019150644A 2019-08-20 2019-08-20 Control device, imaging system, control method, and program
JP2019-150644 2019-08-20

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/524,637 Continuation US20220070362A1 (en) 2019-08-20 2021-11-11 Control apparatuses, photographing apparatuses, movable objects, control methods, and programs

Publications (1)

Publication Number Publication Date
WO2021031840A1 (zh) 2021-02-25

Family

ID=74660159

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/106737 2019-08-20 2020-08-04 Device, imaging device, movable object, method, and program WO2021031840A1 (zh)

Country Status (2)

Country Link
JP (1) JP2021032964A (zh)
WO (1) WO2021031840A1 (zh)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1477858A * 2002-08-23 2004-02-25 赖金轮 Method for automatically identifying and tracking moving objects and acquiring clear images
CN102025914A * 2009-09-10 2011-04-20 Canon Inc. Image pickup apparatus and distance measuring method therefor
EP2587462A1 * 2011-10-31 2013-05-01 Axis AB Image monitoring system and method with false alarm rate reduction
JP2013179412A * 2012-02-28 2013-09-09 Nikon Corp Imaging device
CN105611230A * 2014-11-19 2016-05-25 Canon Inc. Image processing apparatus and image processing method
CN105793754A * 2013-12-04 2016-07-20 Asahi Kasei Microdevices Corp Camera module adjustment method, lens position control device, and control device and control method for a linear motion device
US20170353654A1 * 2016-06-01 2017-12-07 Canon Kabushiki Kaisha Image capturing apparatus and method of controlling the same
CN108139561A * 2015-09-30 2018-06-08 Fujifilm Corp Imaging device and imaging method
WO2018214465A1 * 2017-05-24 2018-11-29 SZ DJI Technology Co., Ltd. Control device, imaging device, imaging system, movable object, control method, and program
WO2019120082A1 * 2017-12-19 2019-06-27 SZ DJI Technology Co., Ltd. Control device, system, control method, and program

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003230043A * 2002-02-01 2003-08-15 Matsushita Electric Ind Co Ltd Video camera
JP4127297B2 * 2006-06-09 2008-07-30 Sony Corp Imaging device, imaging device control method, and computer program
JP4980982B2 * 2008-05-09 2012-07-18 Fujifilm Corp Imaging device, imaging method, focus control method, and program
JP5282461B2 * 2008-07-02 2013-09-04 Nikon Corp Imaging device

Also Published As

Publication number Publication date
JP2021032964A (ja) 2021-03-01

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20854652; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20854652; Country of ref document: EP; Kind code of ref document: A1)