WO2021143425A1 - Control device, imaging device, mobile body, control method, and program - Google Patents

Control device, imaging device, mobile body, control method, and program

Info

Publication number
WO2021143425A1
Authority
WO
WIPO (PCT)
Prior art keywords: image, focal length, lens, imaging, zoom
Prior art date
Application number
PCT/CN2020/136423
Other languages
English (en)
French (fr)
Chinese (zh)
Inventor
周长波
大畑笃
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN202080074285.7A (publication CN114600446A)
Publication of WO2021143425A1

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 15/00: Optical objectives with means for varying the magnification
    • G02B 15/14: Optical objectives with means for varying the magnification by axial movement of one or more lenses or groups of lenses relative to the image plane for continuously varying the equivalent focal length of the objective
    • G02B 15/16: Optical objectives with means for varying the magnification by axial movement of one or more lenses or groups of lenses relative to the image plane for continuously varying the equivalent focal length of the objective, with interdependent non-linearly related movements between one lens or lens group and another lens or lens group
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 15/00: Special procedures for taking photographs; Apparatus therefor
    • G03B 7/00: Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B 7/08: Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B 7/091: Digital circuits
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Definitions

  • the present invention relates to a control device, an imaging device, a mobile body, a control method, and a program.
  • Patent Document 1 discloses a video transmission system that converts high-resolution video content into a transmission resolution (the display resolution of the video reproduction terminal) for transmission, and that, in response to a scaling request, transfers the zoomed image at that transmission resolution.
  • Patent Document 1 JP 2012-75030 A.
  • the imaging lens can be easily designed while maintaining a high zoom magnification.
  • A control device according to an aspect of the present invention includes a circuit configured to perform zoom photography by changing the focal length of an imaging lens having a variable focal length and by capturing a partial image area, within an acquisition range, from an image formed by light passing through the imaging lens.
  • the image circle of the camera lens changes according to the focal length.
  • The circuit is configured to set the acquisition range within the image circle, which changes according to the focal length of the imaging lens.
  • the circuit may be configured such that when at least one of the zoom magnification and the number of recording pixels is designated, the focal length and the collection range of the imaging lens are changed to perform zoom photography according to at least one of the designated zoom magnification and the number of recording pixels.
  • the circuit can be configured as follows: when the user specifies the number of recording pixels, the collection range is determined according to the specified number of recording pixels, and the focal length of the camera lens is determined according to the determined collection range.
  • The imaging lens may have an image circle that becomes smaller as the focal length becomes longer.
  • The circuit may be configured to determine the focal length of the imaging lens so that the area of the imaging surface of the image sensor, which captures light passing through the imaging lens, corresponding to the acquisition range is included in the image circle.
  • the circuit can be configured as follows: when the user specifies the zoom magnification, the acquisition range is determined according to the designated zoom magnification, and the focal length of the camera lens is determined according to the determined acquisition range.
  • The imaging lens may have an image circle that becomes smaller as the focal length becomes longer.
  • The circuit may be configured to determine the acquisition range and the focal length of the imaging lens so that the part of the effective imaging area of the image sensor, which captures light passing through the imaging lens, corresponding to the acquisition range is included in the image circle.
  • the diameter of the image circle of the imaging lens can be shorter than the long side of the effective imaging area of the image sensor.
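The condition stated above can be checked geometrically: a centered acquisition range lies within the image circle exactly when its diagonal does not exceed the circle's diameter. A minimal sketch (the function name and millimeter units are illustrative, not taken from the specification):

```python
import math

def crop_fits_image_circle(crop_w_mm: float, crop_h_mm: float,
                           circle_diameter_mm: float) -> bool:
    """True if a crop centered on the optical axis lies entirely
    within an image circle of the given diameter."""
    # The farthest point of a centered rectangle from the axis is a
    # corner, at half the diagonal from the center.
    return math.hypot(crop_w_mm, crop_h_mm) <= circle_diameter_mm
```

For instance, a full sensor whose diagonal exceeds the circle diameter fails this check, while a sufficiently narrowed acquisition range passes it.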
  • the imaging device may include the above-mentioned imaging lens and the above-mentioned control device.
  • the moving body according to an aspect of the present invention may be a moving body that includes the above-mentioned imaging device and moves.
  • a control method includes a stage of performing zoom photography by changing the focal length of an imaging lens with a variable focal length, and collecting a partial image area from an image captured by light passing through the imaging lens.
  • the image circle of the camera lens changes according to the focal length.
  • the stage of performing zoom photography includes a stage in which the acquisition range is set within an image circle that changes according to the focal length of the imaging lens.
  • A program according to an aspect of the present invention causes a computer to perform the following step: performing zoom photography by changing the focal length of an imaging lens having a variable focal length and capturing a partial image area from an image formed by light passing through the imaging lens.
  • the image circle of the camera lens changes according to the focal length.
  • the step of performing zoom photography includes the step of setting the acquisition range within an image circle that changes according to the focal length of the imaging lens.
  • a higher zoom magnification can be obtained, and the imaging lens is also easy to design.
  • FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300.
  • FIG. 2 shows an example of the functional blocks of UAV10.
  • FIG. 3 is a diagram for explaining the zoom photography method in this embodiment.
  • FIG. 4 is a diagram for explaining a zoom photography method as a comparative example.
  • FIG. 5 is a flowchart showing the processing procedure executed by the imaging control unit 110.
  • FIG. 6 is a flowchart showing the processing procedure executed by the imaging control unit 110.
  • FIG. 7 shows an example of a computer 1200.
  • the blocks can represent (1) the stages of the process of performing operations or (2) the "parts" of the device that perform operations. Specific stages and “parts” can be implemented by programmable circuits and/or processors.
  • Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • Programmable circuits may include reconfigurable hardware circuits.
  • Reconfigurable hardware circuits may include logic operations such as logical AND, logical OR, logical XOR, logical NAND, and logical NOR, as well as flip-flops, registers, and memory elements such as field programmable gate arrays (FPGA) and programmable logic arrays (PLA).
  • the computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device.
  • the computer-readable medium on which instructions are stored includes a product that includes instructions that can be executed to create means for performing operations specified by the flowchart or block diagram.
  • electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like may be included.
  • The computer-readable medium may include a floppy (registered trademark) disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (RTM) disc, a memory stick, an integrated circuit card, and the like.
  • the computer-readable instructions may include any one of source code or object code described in any combination of one or more programming languages.
  • The source code or object code includes traditional procedural programming languages.
  • Traditional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, and the "C" programming language or similar programming languages.
  • The computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or other programmable data processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • the processor or programmable circuit can execute computer-readable instructions to create means for performing the operations specified in the flowchart or block diagram.
  • Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and so on.
  • FIG. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300.
  • the UAV 10 includes a UAV main body 20, a universal joint 50, a plurality of imaging devices 60, and the imaging device 100.
  • the gimbal 50 and the camera device 100 are an example of a camera system.
  • The UAV 10 is an example of a mobile body. A mobile body is a concept that includes flying objects moving in the air, vehicles moving on the ground, ships moving on the water, and the like.
  • the concept of flying objects moving in the air includes not only UAVs, but also other aircraft, airships, helicopters, etc. that move in the air.
  • the UAV main body 20 includes a plurality of rotors. Multiple rotors are an example of a propulsion section.
  • the UAV main body 20 makes the UAV 10 fly by controlling the rotation of a plurality of rotors.
  • the UAV main body 20 uses, for example, four rotors to make the UAV 10 fly.
  • the number of rotors is not limited to four.
  • UAV10 can also be a fixed-wing aircraft without rotors.
  • the imaging device 100 is an imaging camera that captures a subject included in a desired imaging range.
  • the universal joint 50 rotatably supports the imaging device 100.
  • the universal joint 50 is an example of a supporting mechanism.
  • the gimbal 50 uses an actuator to rotatably support the imaging device 100 around the pitch axis.
  • the universal joint 50 uses an actuator to further rotatably support the imaging device 100 around the roll axis and the yaw axis, respectively.
  • the gimbal 50 can change the posture of the camera device 100 by rotating the camera device 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 10 in order to control the flight of the UAV 10.
  • the two camera devices 60 may be installed on the nose of the UAV 10, that is, the front.
  • the other two camera devices 60 may be installed on the bottom surface of the UAV 10.
  • the two imaging devices 60 on the front side may be paired to function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side may also be paired to function as a stereo camera.
  • the three-dimensional spatial data around the UAV 10 can be generated based on the images taken by the plurality of camera devices 60.
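The patent does not detail the reconstruction, but a stereo pair such as the two front imaging devices 60 yields depth through the standard disparity relation Z = f * B / d; the focal length, baseline, and disparity values in the sketch below are hypothetical:

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth of a point seen by a rectified stereo pair: Z = f * B / d,
    with the focal length in pixels, the baseline in meters, and the
    disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

Applying this per matched pixel pair over both images produces the three-dimensional point data mentioned above.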
  • the number of imaging devices 60 included in the UAV 10 is not limited to four. It is sufficient that the UAV 10 includes at least one camera device 60.
  • the UAV 10 may also include at least one camera 60 on the nose, tail, side, bottom, and top surfaces of the UAV 10, respectively.
  • the viewing angle that can be set in the imaging device 60 may be larger than the viewing angle that can be set in the imaging device 100.
  • the imaging device 60 may have a single focus lens or a fisheye lens.
  • the remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10.
  • the remote operation device 300 can wirelessly communicate with the UAV 10.
  • the remote operation device 300 transmits to the UAV 10 instruction information indicating various instructions related to the movement of the UAV 10, such as ascending, descending, accelerating, decelerating, forwarding, retreating, and rotating.
  • the instruction information includes, for example, instruction information for raising the height of the UAV 10.
  • the indication information may indicate the height at which the UAV10 should be located.
  • the UAV 10 moves to be located at the height indicated by the instruction information received from the remote operation device 300.
  • The instruction information may include an ascending instruction to raise the UAV 10. The UAV 10 ascends while receiving the ascending instruction. When the UAV 10 has reached its upper limit height, its ascent can be restricted even if the ascending instruction is received.
  • The UAV 10 includes the UAV control unit 30, the memory 37, the communication interface 36, the propulsion unit 40, the GPS receiver 41, the inertial measurement device 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100.
  • the communication interface 36 communicates with other devices such as the remote operation device 300.
  • the communication interface 36 can receive instruction information including various instructions to the UAV control unit 30 from the remote operation device 300.
  • The memory 37 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging devices 60, and the imaging device 100.
  • the memory 37 may be a computer-readable recording medium, and may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, USB memory, and solid state drive (SSD).
  • the storage 37 may be provided inside the UAV main body 20. It can be configured to be detachable from the UAV main body 20.
  • the UAV control unit 30 controls the flying and shooting of the UAV 10 in accordance with a program stored in the memory 37.
  • the UAV control unit 30 may be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • the UAV control unit 30 controls the flight and shooting of the UAV 10 in accordance with instructions received from the remote operation device 300 via the communication interface 36.
  • the propulsion unit 40 propels the UAV10.
  • the propulsion part 40 includes a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • the propulsion unit 40 rotates a plurality of rotors via a plurality of drive motors in accordance with an instruction from the UAV control unit 30 to cause the UAV 10 to fly.
  • the GPS receiver 41 receives a plurality of signals indicating the time transmitted from a plurality of GPS satellites.
  • the GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, that is, the position (latitude and longitude) of the UAV 10 based on the received signals.
  • the IMU42 detects the posture of the UAV10.
  • the IMU 42 detects the acceleration of the UAV 10 in the three-axis directions of front and rear, left and right, and up and down, and the angular velocities of the pitch, roll, and yaw axes as the attitude of the UAV 10.
  • the magnetic compass 43 detects the orientation of the nose of the UAV10.
  • the barometric altimeter 44 detects the flying altitude of the UAV10.
  • the barometric altimeter 44 detects the air pressure around the UAV 10 and converts the detected air pressure to altitude to detect the altitude.
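The pressure-to-altitude conversion itself is not specified in the patent; a common choice is the international standard atmosphere approximation, sketched here with hypothetical names:

```python
def pressure_to_altitude_m(pressure_hpa: float,
                           sea_level_hpa: float = 1013.25) -> float:
    """Barometric altitude from pressure, using the international
    standard atmosphere approximation:
    h = 44330 * (1 - (p / p0) ** (1 / 5.255))."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```

At the reference sea-level pressure the formula returns zero altitude, and lower pressures map to positive altitudes.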
  • the temperature sensor 45 detects the temperature around the UAV 10.
  • the humidity sensor 46 detects the humidity around the UAV 10.
  • the imaging device 100 includes an imaging unit 102 and a lens unit 200.
  • the lens part 200 is an example of a lens device.
  • the imaging unit 102 includes an image sensor 120, an imaging control unit 110, a memory 130, and a distance measuring sensor.
  • the image sensor 120 may be composed of CCD or CMOS.
  • the image sensor 120 captures optical images formed through the plurality of lenses 210 and outputs the captured images to the imaging control unit 110.
  • the imaging control unit 110 generates a recording image through image processing based on the pixel information read from the image sensor 120 and stores it in the memory 130.
  • the imaging control unit 110 may be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • the imaging control unit 110 can control the imaging device 100 in accordance with an operation instruction of the imaging device 100 from the UAV control unit 30.
  • the imaging control unit 110 is an example of a circuit.
  • the memory 130 may be a computer-readable recording medium, and may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, USB memory, and solid state drive (SSD).
  • the memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • the memory 130 may be provided inside the housing of the imaging device 100.
  • the storage 130 may be configured to be detachable from the housing of the imaging device 100.
  • the distance measuring sensor measures the distance to the subject.
  • the ranging sensor may be an infrared sensor, an ultrasonic sensor, a stereo camera, a TOF (Time Of Flight) sensor, etc.
  • the lens part 200 includes a plurality of lenses 210, a plurality of lens driving parts 212, and a lens control part 220.
  • the multiple lenses 210 can function as a zoom lens, a variable focal length lens, and a focusing lens. At least part or all of the plurality of lenses 210 are configured to be able to move along the optical axis.
  • the lens unit 200 may be an interchangeable lens provided to be detachable from the imaging unit 102.
  • the lens driving unit 212 moves at least part or all of the plurality of lenses 210 along the optical axis via a mechanism member such as a cam ring.
  • the lens driving part 212 may include an actuator.
  • the actuator may include a stepper motor.
  • the lens control unit 220 drives the lens driving unit 212 in accordance with a lens control instruction from the imaging unit 102 to move one or more lenses 210 in the optical axis direction via a mechanism member.
  • the lens control commands are, for example, zoom control commands and focus control commands.
  • the lens part 200 further includes a memory 222 and a position sensor 214.
  • the lens control unit 220 controls the movement of the lens 210 in the optical axis direction via the lens drive unit 212 in accordance with a lens operation command from the imaging unit 102. Part or all of the lens 210 moves along the optical axis.
  • the lens control section 220 performs at least one of a zoom operation and a focus operation by moving at least one of the lenses 210 along the optical axis.
  • the position sensor 214 detects the position of the lens 210.
  • the position sensor 214 can detect the current zoom position or focus position.
  • the lens driving part 212 may include a shake correction mechanism.
  • the lens control section 220 may move the lens 210 in a direction along the optical axis or a direction perpendicular to the optical axis via a shake correction mechanism to perform shake correction.
  • the lens driving part 212 may drive a shake correction mechanism by a stepping motor to perform shake correction.
  • the shake correction mechanism may be driven by a stepping motor to move the image sensor 120 in a direction along the optical axis or a direction perpendicular to the optical axis to perform shake correction.
  • the memory 222 stores control values of the plurality of lenses 210 driven by the lens driving unit 212.
  • the memory 222 may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the imaging control unit 110 performs zoom imaging by changing the focal length of the variable focal length lens 210 and capturing a partial image area from an image captured by light passing through the lens 210.
  • the image circle of the lens 210 changes according to the focal length of the lens 210.
  • the imaging control unit 110 sets the image capture range within an image circle that changes according to the focal length of the lens 210.
  • the imaging control unit 110 may acquire a partial image area from the image by reading the pixel information of some pixels in the effective imaging area of the image sensor 120.
  • Alternatively, the imaging control unit 110 may obtain the partial image area by cropping it from an image obtained by reading all pixel information in the effective imaging area of the image sensor 120.
  • The image capture range can be set by at least one of: setting the range partially read out from the image sensor 120, and setting the cropping range applied to the image read out from the image sensor 120.
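The cropping variant of these two settings can be sketched as follows; a partial readout would instead configure the sensor to emit only these rows and columns. The function is illustrative, not from the specification:

```python
def crop_center(image, crop_w: int, crop_h: int):
    """Extract a centered crop from a row-major image (a list of rows),
    modeling the cropping of an image read out in full from the sensor."""
    h, w = len(image), len(image[0])
    top, left = (h - crop_h) // 2, (w - crop_w) // 2
    return [row[left:left + crop_w] for row in image[top:top + crop_h]]
```

Narrowing `crop_w` and `crop_h` while keeping the recording resolution fixed is what produces the digital-zoom magnification described below.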
  • the imaging control unit 110 may change the focal length and the capture range of the lens 210 according to at least one of the designated zoom magnification and the number of recording pixels, thereby performing zoom photography.
  • the imaging control unit 110 can determine the image capture range according to the specified number of recording pixels, and determine the focal length of the camera lens according to the determined capture range.
  • The image circle of the lens 210 becomes smaller as the focal length becomes longer. For example, at least at the telephoto end, the diameter of the image circle of the lens 210 is shorter than the long side of the effective imaging area of the image sensor 120.
  • The imaging control unit 110 determines the focal length of the lens 210 so that the area of the imaging surface of the image sensor 120 corresponding to the acquisition range is included in the image circle of the lens 210.
  • When the user specifies the zoom magnification, the imaging control unit 110 determines the image capture range according to the designated zoom magnification, and determines the focal length of the lens 210 according to the determined capture range. Specifically, the imaging control unit 110 determines the acquisition range and the focal length of the lens 210 so that the part of the effective imaging area of the image sensor 120 corresponding to the determined acquisition range is included in the image circle.
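One way to realize this determination is a lookup over the lens's image-circle data: among the focal lengths whose image circle still covers the diagonal of the determined acquisition range, pick the longest one (highest optical magnification). The table values below are hypothetical, standing in for real optical design data:

```python
# Hypothetical optical data: focal length (mm) -> image-circle diameter (mm).
# In this lens model the circle shrinks as the focal length grows.
IMAGE_CIRCLE_MM = {8.0: 8.0, 12.0: 6.5, 16.0: 5.0, 24.0: 4.0}

def focal_for_crop(crop_diag_mm: float) -> float:
    """Longest focal length whose image circle still covers a centered
    acquisition range with the given diagonal."""
    feasible = [f for f, d in IMAGE_CIRCLE_MM.items() if d >= crop_diag_mm]
    if not feasible:
        raise ValueError("acquisition range exceeds every image circle")
    return max(feasible)
```

A narrower acquisition range thus unlocks a longer focal length, which is exactly the coupling exploited in Fig. 3.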
  • FIG. 3 is a diagram for explaining the zoom photography method in this embodiment.
  • Fig. 3 schematically shows the relationship between the zoom position, the image circle and the image capture range.
  • the image circle 310 at the wide-angle end, the image circle 311 in the case of the zoom position 1, and the image circle 312 in the case of the zoom position 2 are shown.
  • At the zoom position 1, the imaging control unit 110 generates an image with a higher magnification than the image generated at the wide-angle end.
  • At the zoom position 2, the imaging control unit 110 generates an image with a higher magnification than the image generated at the zoom position 1.
  • the zoom position is determined by the zoom magnification specified by the user.
  • the optical image 330 is an optical image of the subject formed by the lens 210 at the wide-angle end.
  • the effective imaging area 122 is an area where the effective pixels of the image sensor 120 are arranged.
  • the imaging control unit 110 uses pixel information of at least part of the effective pixels of the image sensor 120 to generate an image for recording.
  • At the wide-angle end, the imaging control unit 110 sets the effective imaging area 122 as the image capture range, and therefore uses the pixel information of all pixels arranged in the effective imaging area 122 to generate the image for recording 350.
  • a subject image 352 is included in the recording image 350.
  • the imaging control unit 110 sets a range 340 narrower than the effective imaging area 122 as the image capture range without changing the focal length of the lens 210.
  • the imaging control unit 110 uses the pixel information of the pixels located in the range 340 to generate the image 360 for recording.
  • the focal length of the lens 210 at the zoom position 1 is the same as the focal length of the lens 210 at the wide-angle end. Therefore, the image circle 311 of the lens 210 and the image circle 310 at the wide-angle end are substantially the same.
  • the optical image 331 formed by the lens 210 at the zoom position 1 is substantially the same as the optical image 330 at the wide-angle end.
  • the imaging control unit 110 gradually narrows the image capture range from the effective imaging area 122 to the range 340 without changing the focal length of the lens 210. That is, the imaging control unit 110 performs digital zooming from the wide-angle end to the zoom position 1.
  • the imaging control unit 110 sets the range 340 as the image capture range.
  • the imaging control unit 110 performs zooming by increasing the focal length of the lens 210.
  • the imaging control unit 110 uses the pixel information of the pixels located in the range 340 to generate the image 370 for recording.
  • Although the image circle 312 at the zoom position 2 is smaller than the image circle 311 at the zoom position 1, the image circle 312 still covers the range 340.
  • the optical image 332 formed by the lens 210 at the zoom position 2 is larger than the optical image 331 at the zoom position 1.
  • The subject image 372 corresponding to the optical image 332 is enlarged relative to the subject image 362 by an amount corresponding to the increase in the focal length of the lens 210.
  • the imaging control unit 110 continuously increases the focal length of the lens 210 from the zoom position 1 to the zoom position 2 without changing the image capturing range. That is, the imaging control unit 110 performs optical zooming from the zoom position 1 to the zoom position 2.
  • Although the image circle of the lens 210 becomes smaller due to the optical zoom, the image circle 312 at the zoom position 2 covers the capture range 340 at the zoom position 2. Therefore, degradation of image quality can be suppressed.
  • In the above, an example was described in which the image capture range is gradually narrowed from the wide-angle end to the zoom position 1 without changing the focal length of the lens 210, and the focal length of the lens 210 is made longer from the zoom position 1 to the zoom position 2 without changing the image capture range.
  • However, the imaging control unit 110 may also narrow the image capture range while simultaneously increasing the focal length of the lens 210.
  • In this case, the image circle becomes smaller as well.
  • Since the imaging control unit 110 sets the image capture range within the image circle, a high-magnification image is generated while degradation of image quality is suppressed.
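The zoom schedule of Fig. 3, which applies digital zoom (narrowing the capture range) up to zoom position 1 and then optical zoom (lengthening the focal length) with the capture range fixed, can be sketched as a split of the requested total magnification. The digital limit of 1.5 is a hypothetical stand-in for zoom position 1:

```python
def zoom_plan(total_mag: float, digital_limit: float = 1.5):
    """Split a requested total magnification into (digital, optical)
    factors: narrow the capture range first, then lengthen the focal
    length while keeping the capture range fixed."""
    digital = min(total_mag, digital_limit)
    optical = total_mag / digital
    return digital, optical
```

Below the limit the plan is purely digital; beyond it, only the optical factor grows, so the capture range stays fixed at the range 340 while the image circle shrinks around it.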
  • FIG. 4 is a diagram for explaining a zoom photography method as a comparative example.
  • the optical zoom is performed from the wide-angle end to the zoom position 1, and the digital zoom is performed from the zoom position 1 to the zoom position 2.
  • the focal length of the camera lens becomes longer due to the optical zoom. Therefore, the camera lens needs to be designed so that the image circle 411 at the zoom position 1 covers the effective imaging area 422 of the image sensor.
  • digital zoom is performed by setting the image capture range to a range 420 narrower than the effective imaging area 422.
  • the size of the image circle 412 is substantially the same as that of the image circle 411.
  • the area of the range 420 is very narrow.
  • The image information outside the range 420 in the image sensor is not used for the recording image. Therefore, in the comparative example, the unused area of the image circle, which does not contribute to the recording image, becomes larger.
  • the use of the zoom photography method of this embodiment facilitates the compact design of the lens 210.
  • In the zoom photography method of the present embodiment, since the image capture range is set in accordance with the narrowing of the image circle, the image information of the area within the image circle can be used effectively.
  • Moreover, as image sensors have come to have very high pixel counts, the degradation of image quality caused by digital zoom is no longer a major problem. For example, even if an image equivalent to 6K pixels is captured from an image obtained by an image sensor with 8K pixels, sufficient image quality can be maintained in practice, except for special applications.
  • FIG. 5 is a flowchart showing the processing procedure executed by the imaging control unit 110. This flowchart shows the steps of the zoom control method when the user specifies the number of recording pixels.
  • the imaging control unit 110 acquires the number of recording pixels based on instruction information from the user. In S502, the imaging control unit 110 determines whether the number of recording pixels is to be changed. For example, when the user designates a number of recording pixels different from the currently set number, the imaging control unit 110 determines that the number of recording pixels is to be changed. If the number of recording pixels is not changed, the processing of this flowchart ends. If it is changed, the imaging control unit 110 sets the number of recording pixels to the number designated by the user (S504). Next, based on the designated number of recording pixels, the imaging control unit 110 determines the range of the partial region to be cropped from the image captured by the image sensor 120 (S506). For example, the imaging control unit 110 may determine the cropping range by referring to correspondence information showing the correspondence between the positions of the pixels to be cropped from the image and the number of recording pixels. This correspondence information may be stored in the memory 130 in advance.
  • the imaging control unit 110 causes the lens control unit 220 to perform a zoom operation of the lens 210.
  • the imaging control unit 110 causes the lens 210 to perform a zoom operation so that the image circle of the lens 210 covers the acquisition range determined in S506.
  • the imaging control unit 110 may refer to correspondence information showing the correspondence between the position of the lens 210 responsible for zooming and the number of recording pixels, and send control information indicating that lens position to the lens control unit 220, thereby causing the lens 210 to perform the zoom operation. This correspondence information between the lens position and the number of recording pixels may be stored in the memory 130 in advance.
  • the imaging control unit 110 crops a partial area determined in S506 from the image captured by the image sensor 120, generates a recording image with the number of recording pixels designated by the user, and records it in the memory 130.
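The flow of FIG. 5 described above (determine the crop range from the recording-pixel count, drive the zoom so the image circle covers it, then crop and record) can be sketched in Python. Everything here is an illustrative assumption: the function names, the correspondence-table values, and the tiny dummy sensor frame are not taken from the patent.

```python
# Assumed correspondence tables (the patent stores such tables in memory 130):
# recording-pixel count -> crop rectangle (x, y, w, h), and -> zoom-lens position.
CROP_BY_PIXELS = {(6, 4): (0, 0, 6, 4), (4, 2): (1, 1, 4, 2)}
LENS_POS_BY_PIXELS = {(6, 4): 0.0, (4, 2): 0.4}

def handle_recording_pixels(current, requested, move_zoom_lens, capture):
    """Returns (new setting, recording image) following the flowchart's steps."""
    if requested == current:                        # S502: nothing to change
        return current, None
    x, y, w, h = CROP_BY_PIXELS[requested]          # S504/S506: set pixels, crop range
    move_zoom_lens(LENS_POS_BY_PIXELS[requested])   # S508: keep image circle over crop
    frame = capture()
    recording = [row[x:x + w] for row in frame[y:y + h]]  # final step: crop and record
    return requested, recording

# Usage with a 6x4 dummy sensor frame whose elements are (column, row) tuples:
frame = [[(c, r) for c in range(6)] for r in range(4)]
setting, img = handle_recording_pixels((6, 4), (4, 2), lambda pos: None, lambda: frame)
print(setting, len(img), len(img[0]))  # (4, 2) 2 4
```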
  • FIG. 6 is a flowchart showing the processing procedure executed by the imaging control unit 110. This flowchart shows the steps of the zoom control method performed when the user specifies zoom.
  • the imaging control unit 110 acquires the zoom value based on the instruction information from the user.
  • the imaging control unit 110 determines whether to change the zoom value. For example, when the user designates a zoom value different from the zoom value currently set, the imaging control unit 110 determines to change the zoom value. If the zoom value is not changed, the process of this flowchart ends.
  • the imaging control unit 110 sets the number of recording pixels based on the zoom value (S604).
  • the imaging control unit 110 determines the range of cropping a partial area from the image captured by the image sensor 120 based on the number of recording pixels (S606).
  • the imaging control unit 110 may set the number of recording pixels with reference to correspondence information showing the correspondence between the zoom value and the number of recording pixels. The corresponding information may be stored in the memory 130 in advance.
  • the imaging control unit 110 causes the lens control unit 220 to perform a zoom operation of the lens 210.
  • the imaging control unit 110 causes the lens 210 to perform a zoom operation so that the image circle of the lens 210 covers the acquisition range determined in S606. Since it is the same as the processing of S508 in FIG. 5, the description of the processing of S608 is omitted here.
  • the imaging control unit 110 crops the partial area determined in S606 from the image captured by the image sensor 120, generates a recording image with the number of recording pixels set in S604, and records it in the memory 130.
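The difference between FIG. 6 and FIG. 5 is only the first step: a user-specified zoom value is first mapped to a recording-pixel count (S604, via correspondence information), after which the same crop and lens steps follow. A minimal sketch of that mapping; the table values and the nearest-match policy are assumptions for illustration, not from the patent.

```python
# Assumed S604 correspondence table: zoom value -> recording-pixel count.
PIXELS_BY_ZOOM = {1.0: (8, 6), 1.5: (6, 4), 2.0: (4, 3)}

def pixels_for_zoom(zoom: float):
    """Pick the recording-pixel count for the nearest supported zoom value."""
    nearest = min(PIXELS_BY_ZOOM, key=lambda z: abs(z - zoom))
    return PIXELS_BY_ZOOM[nearest]

# A requested zoom of 1.6 falls closest to the 1.5 table entry.
print(pixels_for_zoom(1.6))  # (6, 4)
```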
  • the lens 210 can be designed so that the size of the image circle at the wide-angle end covers the effective imaging area 122. This facilitates a compact design of the lens 210.
  • the imaging device 100 is an imaging device mounted on the UAV 10.
  • the imaging device 100 need not be an imaging device mounted on a movable body such as the UAV 10.
  • the imaging device 100 may be an imaging device supported by a handheld gimbal.
  • the imaging device 100 may be an imaging device supported by neither the UAV 10 nor a handheld gimbal.
  • the imaging device 100 may be an imaging device that can be held by the user.
  • the imaging device 100 may be a fixed imaging device such as a surveillance camera.
  • FIG. 7 shows an example of a computer 1200 that may fully or partially embody various aspects of the present invention.
  • the program installed on the computer 1200 can cause the computer 1200 to function as operations associated with the apparatus according to the embodiments of the present invention, or as one or more "units" of the apparatus. Alternatively, the program can cause the computer 1200 to perform the operations or the one or more "units".
  • This program enables the computer 1200 to execute the process or stages of the process involved in the embodiment of the present invention.
  • Such a program may be executed by the CPU 1212, so that the computer 1200 executes specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
  • the computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210.
  • the computer 1200 further includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through the input/output controller 1220.
  • the computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates in accordance with programs stored in the ROM 1230 and RAM 1214 to control each unit.
  • the communication interface 1222 communicates with other electronic devices through the network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 during operation, and/or a program dependent on the hardware of the computer 1200.
  • the program is provided through a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or through a network.
  • the program is installed in RAM 1214 or ROM 1230 which is also an example of a computer-readable recording medium, and is executed by CPU 1212.
  • the information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above.
  • an apparatus or method may be constituted by realizing operations on, or processing of, information in accordance with the use of the computer 1200.
  • the CPU 1212 can execute a communication program loaded in the RAM 1214, and based on the processing described in the communication program, instructs the communication interface 1222 to perform communication processing.
  • under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read data to the network, or writes data received from the network into a reception buffer provided in the recording medium.
  • the CPU 1212 can cause all or necessary portions of files or databases stored in an external recording medium such as a USB memory to be read into the RAM 1214, and perform various types of processing on the data in the RAM 1214. The CPU 1212 can then write the processed data back to the external recording medium.
  • the CPU 1212 can perform the various types of processing described throughout this disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, condition determination, conditional branching, unconditional branching, information retrieval/replacement, and the like, and write the results back to the RAM 1214.
  • the CPU 1212 can search for information in files, databases, and the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve from these entries an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
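The entry lookup described above is a simple search: among entries that pair a first attribute with a second attribute, find the entry whose first attribute satisfies a condition and return the associated second attribute. A minimal sketch; the entry contents (ISO/shutter-speed pairs) are invented purely for illustration.

```python
# Hypothetical entries as stored on the recording medium: each associates
# a first-attribute value with a second-attribute value.
entries = [
    {"first": "ISO100", "second": "1/60s"},
    {"first": "ISO400", "second": "1/250s"},
    {"first": "ISO800", "second": "1/500s"},
]

def lookup_second(entries, condition):
    """Return the second-attribute value of the first entry whose
    first-attribute value satisfies the given condition."""
    for entry in entries:
        if condition(entry["first"]):
            return entry["second"]
    return None  # no entry matched the condition

print(lookup_second(entries, lambda v: v == "ISO400"))  # 1/250s
```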
  • the programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200.
  • a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium so that the program can be provided to the computer 1200 via the network.

PCT/CN2020/136423 2020-01-15 2020-12-15 控制装置、摄像装置、移动体、控制方法以及程序 WO2021143425A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202080074285.7A CN114600446A (zh) 2020-01-15 2020-12-15 控制装置、摄像装置、移动体、控制方法以及程序

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020004527A JP6896963B1 (ja) 2020-01-15 2020-01-15 制御装置、撮像装置、移動体、制御方法、及びプログラム
JP2020-004527 2020-01-15

Publications (1)

Publication Number Publication Date
WO2021143425A1 true WO2021143425A1 (zh) 2021-07-22

Family

ID=76540451

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/136423 WO2021143425A1 (zh) 2020-01-15 2020-12-15 控制装置、摄像装置、移动体、控制方法以及程序

Country Status (3)

Country Link
JP (1) JP6896963B1 (ja)
CN (1) CN114600446A (ja)
WO (1) WO2021143425A1 (ja)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000184259A (ja) * 1998-12-15 2000-06-30 Sharp Corp 電子カメラ装置
JP2008301172A (ja) * 2007-05-31 2008-12-11 Panasonic Corp コンバージョンレンズモード付きカメラ
JP2009147676A (ja) * 2007-12-14 2009-07-02 Canon Inc 撮像装置
CN102461155A (zh) * 2009-06-23 2012-05-16 捷讯研究有限公司 在数字变焦摄影期间调整图像锐度
CN104919367A (zh) * 2013-02-01 2015-09-16 奥林巴斯株式会社 更换镜头、照相机系统、摄像装置、照相机系统的控制方法及摄像装置的控制方法
CN106027895A (zh) * 2010-09-16 2016-10-12 奥林巴斯株式会社 摄影设备及其控制方法

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6987532B1 (en) * 1999-09-20 2006-01-17 Canon Kabushiki Kaisha Image sensing apparatus, control, and method of designing an optical system therefor
JP6188407B2 (ja) * 2013-05-02 2017-08-30 オリンパス株式会社 交換レンズ
JP2015118131A (ja) * 2013-12-16 2015-06-25 キヤノン株式会社 撮像装置

Also Published As

Publication number Publication date
JP6896963B1 (ja) 2021-06-30
JP2021111937A (ja) 2021-08-02
CN114600446A (zh) 2022-06-07


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20914392

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20914392

Country of ref document: EP

Kind code of ref document: A1