CN111226263A - Control device, imaging device, mobile body, control method, and program - Google Patents

Control device, imaging device, mobile body, control method, and program

Info

Publication number
CN111226263A
CN111226263A (Application CN201980005098.0A)
Authority
CN
China
Prior art keywords
image
image pickup
range
imaging
captured
Prior art date
Legal status
Pending
Application number
CN201980005098.0A
Other languages
Chinese (zh)
Inventor
本庄谦一
邵明
Current Assignee
SZ DJI Technology Co Ltd
Shenzhen Dajiang Innovations Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN111226263A

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B15/00 Optical objectives with means for varying the magnification
    • G02B15/14 Optical objectives with means for varying the magnification by axial movement of one or more lenses or groups of lenses relative to the image plane for continuously varying the equivalent focal length of the objective
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G02B7/36 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32 Means for focusing
    • G03B13/34 Power focusing
    • G03B13/36 Autofocus systems
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00 Stereoscopic photography
    • G03B35/16 Stereoscopic photography by sequential viewing
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/571 Depth or shape recovery from multiple images from focus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules

Abstract

A depth map is generated while suppressing the amount of movement of a focus lens. May comprise: a control unit that causes the image pickup device to capture an image of a first image capture range in a state where an image pickup surface of the image pickup device and a focus lens of the image pickup device are in a first positional relationship, and causes the image pickup device to capture an image of a second image capture range that is different from the first image capture range and includes a first overlapping range overlapping the first image capture range in a state where the image pickup surface of the image pickup device and the focus lens of the image pickup device are in a second positional relationship; an acquisition unit that acquires a first captured image of a first imaging range and a second captured image of a second imaging range captured by an imaging device; a calculation unit that calculates blur amounts of a first image corresponding to a first overlap range included in the first captured image and a second image corresponding to the first overlap range included in the second captured image; and a generation unit that generates a depth map including depth information corresponding to the first overlap range, based on the blur amounts of the first image and the second image.

Description

Control device, imaging device, mobile body, control method, and program Technical Field
The invention relates to a control device, an imaging device, a mobile body, a control method, and a program.
Background
Patent Document 1 discloses an imaging device which, while moving the focus position of an optical system from the near end side to the infinity end side, causes an image processing unit to generate moving image data and extracts, from a plurality of frame images included in the moving image data, a still image focused on a specified area.
Patent Document 1: International Publication No. WO 2017/006538
Disclosure of Invention
It is desirable to be able to generate a depth map while suppressing the amount of movement of the focus lens.
The control device according to one aspect of the present invention may include a control unit that causes the image pickup device to capture an image of a first imaging range in a state where an image pickup surface of the image pickup device and a focus lens of the image pickup device are in a first positional relationship, and causes the image pickup device to capture an image of a second imaging range, which is different from the first imaging range and includes a first overlapping range overlapping with the first imaging range, in a state where the image pickup surface of the image pickup device and the focus lens of the image pickup device are in a second positional relationship. The control device may include an acquisition section that acquires a first captured image of the first imaging range and a second captured image of the second imaging range captured by the image pickup device. The control device may include a calculation section that calculates respective blur amounts of a first image corresponding to the first overlapping range included in the first captured image and a second image corresponding to the first overlapping range included in the second captured image. The control device may include a generation unit that generates a depth map including depth information corresponding to the first overlapping range based on the respective blur amounts of the first image and the second image.
The first imaging range and the second imaging range may overlap by more than half.
The control section may cause the image pickup device to capture an image of a third imaging range, which is different from the second imaging range and includes a second overlapping range that overlaps with the second imaging range, in a state where the image pickup surface of the image pickup device and the focus lens of the image pickup device are in a third positional relationship. The acquisition unit may acquire a third captured image of the third imaging range captured by the image pickup device. The calculation section may calculate respective blur amounts of a third image corresponding to the second overlapping range included in the second captured image and a fourth image corresponding to the second overlapping range included in the third captured image. The generation section may generate the depth map further including depth information corresponding to the second overlapping range based on the respective blur amounts of the third image and the fourth image.
The second imaging range may overlap with the third imaging range by more than half.
The control unit may cause the image pickup device to capture a first image capture range in a state where an image pickup surface of the image pickup device and a focus lens of the image pickup device are in a first positional relationship and cause the image pickup device to capture a second image capture range in a state where the image pickup surface of the image pickup device and the focus lens of the image pickup device are in a second positional relationship, while an image pickup direction of the image pickup device changes.
The control section may cause the image pickup device to capture the first captured image and the second captured image while causing the image pickup device to make a first rotation around a first point so that the image pickup direction of the image pickup device changes. The control section may control the position of the focus lens of the image pickup device in accordance with the depth map and cause the image pickup device to capture a plurality of captured images again while causing the image pickup device to make a second rotation around the first point so that the image pickup direction of the image pickup device changes.
The control section may store the depth map and the plurality of captured images in the storage section in association with each other.
The control unit may cause the image pickup device to capture a first image capture range in a state where an image pickup surface of the image pickup device and a focus lens of the image pickup device are in a first positional relationship and cause the image pickup device to capture a second image capture range in a state where the image pickup surface of the image pickup device and the focus lens of the image pickup device are in a second positional relationship, while the image pickup device is moving along the first trajectory.
The control section may cause the image pickup device to capture the first captured image and the second captured image while the image pickup device moves along the first trajectory for the first time, and may control the position of the focus lens of the image pickup device in accordance with the depth map and cause the image pickup device to capture a plurality of captured images again while the image pickup device moves along the first trajectory for the second time.
The control section may store the depth map and the plurality of captured images in the storage section in association with each other.
The imaging apparatus according to an aspect of the present invention may include the control device. The image pickup device may include a focus lens.
The moving object according to one aspect of the present invention may be a moving object that includes the imaging device and moves.
A control method according to an aspect of the present invention may include: a stage of causing the image pickup device to capture an image of a first imaging range in a state where an image pickup surface of the image pickup device and a focus lens of the image pickup device are in a first positional relationship, and causing the image pickup device to capture an image of a second imaging range, which is different from the first imaging range and includes a first overlapping range that overlaps with the first imaging range, in a state where the image pickup surface of the image pickup device and the focus lens of the image pickup device are in a second positional relationship. The control method may include: a stage of acquiring a first captured image of the first imaging range and a second captured image of the second imaging range captured by the image pickup device. The control method may include: a stage of calculating respective blur amounts of a first image corresponding to the first overlapping range included in the first captured image and a second image corresponding to the first overlapping range included in the second captured image. The control method may include: a stage of generating a depth map including depth information corresponding to the first overlapping range based on the respective blur amounts of the first image and the second image.
The program according to one aspect of the present invention may be a program for causing a computer to function as the control device.
According to an aspect of the present invention, a depth map can be generated while suppressing the amount of movement of the focus lens.
Moreover, the above summary of the present invention is not exhaustive of all of the necessary features of the present invention. In addition, subsets of these feature groups may also form the invention.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
fig. 1 is a diagram showing an example of the external appearance of an unmanned aerial vehicle and a remote operation device.
Fig. 2 is a diagram showing one example of functional blocks of an unmanned aerial vehicle.
Fig. 3 is a diagram showing an example of a curve representing the relationship between the blur amount and the lens position.
Fig. 4 is a diagram showing one example of a process of calculating the distance to the object based on the blur amount.
Fig. 5 is a diagram for explaining the relationship among the object position, the lens position, and the focal length.
Fig. 6 is a diagram for explaining a mode in which the imaging device performs imaging while the unmanned aerial vehicle is rotating.
Fig. 7A is a diagram for explaining a mode in which the image pickup device performs image pickup while the unmanned aerial vehicle is rotating.
Fig. 7B is a diagram illustrating one example of a relationship between a captured image and a focus distance of the focus lens.
Fig. 8 is a diagram showing an example of a panoramic image generated from a plurality of captured images.
Fig. 9 is a flowchart showing one example of an imaging process by the imaging apparatus mounted on the UAV.
Fig. 10 is a diagram showing an example of the hardware configuration.
[ description of reference ]
10 UAV
20 UAV body
30 UAV control section
36 communication interface
37 memory
40 propulsion section
41 GPS receiver
42 inertial measurement unit
43 magnetic compass
44 barometric altimeter
45 temperature sensor
46 humidity sensor
50 gimbal
60 image pickup device
100 image pickup device
102 image pickup part
110 image pickup control unit
112 acquisition part
114 calculation unit
116 generation unit
118 synthesis part
120 image sensor
130 memory
200 lens part
210 lens
212 lens driving unit
214 position sensor
220 lens control part
222 memory
300 remote operation device
1200 computer
1210 host controller
1212 CPU
1214 RAM
1220 input/output controller
1222 communication interface
1230 ROM
Detailed Description
The present invention will be described below with reference to embodiments thereof, but the following embodiments do not limit the invention according to the claims. In addition, not all combinations of features described in the embodiments are essential to the solution of the invention. It will be apparent to those skilled in the art that various changes and modifications can be made in the following embodiments. As is apparent from the description of the claims, the embodiments to which such changes or improvements are made are included in the technical scope of the present invention.
The contents of the claims, the specification, the drawings, and the abstract include matter subject to copyright protection. The copyright owner has no objection to the facsimile reproduction of these documents by anyone, as they appear in the patent office files or records. However, in all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is executed or (2) a "section" of an apparatus having the role of executing an operation. Certain stages and "sections" may be implemented by dedicated circuits, programmable circuits, and/or processors. The dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. The programmable circuits may include reconfigurable hardware circuits. The reconfigurable hardware circuits may include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGAs), and programmable logic arrays (PLAs).
The computer-readable medium may include any tangible device that can store instructions for execution by a suitable device. As a result, a computer-readable medium having instructions stored thereon constitutes an article of manufacture including instructions that can be executed to create means for implementing the operations specified in the flowcharts or block diagrams. Examples of the computer-readable medium may include an electronic storage medium, a magnetic storage medium, an optical storage medium, an electromagnetic storage medium, a semiconductor storage medium, and the like. More specific examples of the computer-readable medium may include a floppy (registered trademark) disk, a diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (RTM) disc, a memory stick, an integrated circuit card, and the like.
The computer-readable instructions may include assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, JAVA (registered trademark), or C++, and a conventional procedural programming language such as the "C" programming language or similar programming languages. The computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing apparatus, either locally or via a local area network (LAN), a wide area network (WAN) such as the Internet, or the like. The processor or programmable circuit may execute the computer-readable instructions to create means for implementing the operations specified in the flowcharts or block diagrams. Examples of the processor include a computer processor, a processing unit, a microprocessor, a digital signal processor, a controller, a microcontroller, and the like.
Fig. 1 shows an example of the appearance of an unmanned aerial vehicle (UAV) 10 and a remote operation device 300. The UAV 10 includes a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and an imaging device 100. The gimbal 50 and the imaging device 100 are one example of an imaging system. The UAV 10 is an example of a mobile body. A mobile body is a concept that includes a flying body moving in the air, a vehicle moving on the ground, a ship moving on water, and the like. A flying body moving in the air is a concept that includes not only the UAV but also other aircraft, airships, helicopters, and the like that move in the air.
The UAV body 20 contains a plurality of rotors. Multiple rotors are one example of a propulsion section. The UAV body 20 flies the UAV10 by controlling the rotation of the plurality of rotors. The UAV body 20 uses, for example, four rotors to fly the UAV 10. The number of rotors is not limited to four. In addition, the UAV10 may also be a fixed-wing aircraft without a rotor.
The imaging apparatus 100 is an imaging camera that images an object included in a desired imaging range. The gimbal 50 rotatably supports the image pickup apparatus 100. The gimbal 50 is an example of a support mechanism. For example, the gimbal 50 rotatably supports the image pickup apparatus 100 around the pitch axis using an actuator. The gimbal 50 further rotatably supports the image pickup apparatus 100 centered on the roll axis and the yaw axis, respectively, using the actuators. The gimbal 50 can change the attitude of the imaging apparatus 100 by rotating the imaging apparatus 100 about at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of imaging devices 60 are sensing cameras that capture images of the surroundings of the UAV10 in order to control the flight of the UAV 10. Two cameras 60 may be provided at the nose, i.e., the front, of the UAV 10. Also, two other cameras 60 may be provided on the bottom surface of the UAV 10. The two image pickup devices 60 on the front side may be paired to function as a so-called stereo camera. The two imaging devices 60 on the bottom surface side may also be paired to function as a stereo camera. Three-dimensional spatial data around the UAV10 may be generated from images taken by a plurality of cameras 60. The number of cameras 60 included in the UAV10 is not limited to four. The UAV10 may include at least one camera 60. The UAV10 may also include at least one camera 60 at the nose, tail, sides, bottom, and top of the UAV 10. The angle of view settable in the image pickup device 60 may be larger than the angle of view settable in the image pickup device 100. The imaging device 60 may also have a single focus lens or a fisheye lens.
The remote operation device 300 communicates with the UAV10 to remotely operate the UAV 10. The remote operation device 300 may wirelessly communicate with the UAV 10. The remote operation device 300 transmits instruction information indicating various instructions related to the movement of the UAV10, such as ascending, descending, accelerating, decelerating, advancing, retreating, and rotating, to the UAV 10. The indication information includes, for example, indication information to raise the altitude of the UAV 10. The indication may indicate an altitude at which the UAV10 should be located. The UAV10 moves to be located at an altitude indicated by the instruction information received from the remote operation device 300. The indication may include a lift instruction to lift the UAV 10. The UAV10 ascends while receiving the ascending instruction. When the height of the UAV10 has reached an upper limit height, the UAV10 may limit ascent even if an ascent command is accepted.
Figure 2 shows one example of the functional blocks of the UAV 10. The UAV10 includes a UAV control 30, a memory 37, a communication interface 36, a propulsion 40, a GPS receiver 41, an inertial measurement device 42, a magnetic compass 43, a barometric altimeter 44, a temperature sensor 45, a humidity sensor 46, a gimbal 50, an imaging device 60, and an imaging device 100.
The communication interface 36 communicates with other devices such as the remote operation device 300. The communication interface 36 may receive instruction information including various instructions to the UAV control unit 30 from the remote operation device 300. The memory 37 stores programs and the like necessary for the UAV control unit 30 to control the propulsion unit 40, the GPS receiver 41, the inertial measurement unit (IMU) 42, the magnetic compass 43, the barometric altimeter 44, the temperature sensor 45, the humidity sensor 46, the gimbal 50, the imaging device 60, and the imaging device 100. The memory 37 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, USB memory, and flash memory such as a solid-state drive (SSD). The memory 37 may be disposed inside the UAV body 20. It may also be configured to be detachable from the UAV body 20.
The UAV control unit 30 controls the flight and shooting of the UAV10 according to a program stored in the memory 37. The UAV control unit 30 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The UAV control unit 30 controls the flight and shooting of the UAV10 in accordance with an instruction received from the remote operation device 300 via the communication interface 36. The propulsion section 40 propels the UAV 10. The propulsion unit 40 includes a plurality of rotors and a plurality of drive motors for rotating the rotors. The propulsion unit 40 rotates the plurality of rotors via the plurality of drive motors in accordance with instructions from the UAV control unit 30 to fly the UAV 10.
The GPS receiver 41 receives a plurality of signals indicating times transmitted from a plurality of GPS satellites. The GPS receiver 41 calculates the position (latitude and longitude) of the GPS receiver 41, that is, the position (latitude and longitude) of the UAV 10, from the plurality of received signals. The IMU 42 detects the attitude of the UAV 10. The IMU 42 detects, as the attitude of the UAV 10, the accelerations in the three axial directions of front-rear, left-right, and up-down of the UAV 10, and the angular velocities about the three axes of pitch, roll, and yaw. The magnetic compass 43 detects the orientation of the nose of the UAV 10. The barometric altimeter 44 detects the flight altitude of the UAV 10. The barometric altimeter 44 detects the barometric pressure around the UAV 10 and converts the detected barometric pressure into an altitude to detect the altitude. The temperature sensor 45 detects the temperature around the UAV 10. The humidity sensor 46 detects the humidity around the UAV 10.
The imaging device 100 includes an imaging unit 102 and a lens unit 200. The lens part 200 is one example of a lens apparatus. The imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130. The image sensor 120 may be formed of a CCD or a CMOS. The image sensor 120 captures an optical image formed via the plurality of lenses 210, and outputs the captured image to the image capture control section 110. The imaging control unit 110 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The imaging control unit 110 may control the imaging apparatus 100 according to an operation instruction of the imaging apparatus 100 from the UAV control unit 30. The imaging control unit 110 is an example of a first control unit and a second control unit. The memory 130 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, USB memory, and flash memory such as Solid State Disk (SSD). The memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like. The memory 130 may be provided inside the housing of the image pickup apparatus 100. The memory 130 may be configured to be detachable from the housing of the image pickup apparatus 100.
The lens section 200 has a plurality of lenses 210, a plurality of lens driving sections 212, and a lens control section 220. The plurality of lenses 210 may function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 210 are configured to be movable along the optical axis. The lens section 200 may be an interchangeable lens provided so as to be attachable to and detachable from the image pickup section 102. The lens driving section 212 moves at least some or all of the plurality of lenses 210 along the optical axis via a mechanism member such as a cam ring. The lens driving section 212 may include an actuator. The actuator may include a stepping motor. The lens control section 220 drives the lens driving section 212 in accordance with a lens control instruction from the image pickup section 102 to move the one or more lenses 210 in the optical axis direction via the mechanism member. The lens control instruction is, for example, a zoom control instruction or a focus control instruction.
The lens section 200 also has a memory 222 and a position sensor 214. The lens control unit 220 controls the movement of the lens 210 in the optical axis direction via the lens driving unit 212 in accordance with a lens operation command from the image pickup unit 102. A part or all of the lenses 210 move along the optical axis. The lens control section 220 performs at least one of a zoom operation and a focus operation by moving at least one of the lenses 210 along the optical axis. The position sensor 214 detects the position of the lens 210. The position sensor 214 may detect the current zoom position or focus position.
The lens driving part 212 may include a shake correction mechanism. The lens control section 220 may perform shake correction by moving the lens 210 in a direction along the optical axis or a direction perpendicular to the optical axis via the shake correction mechanism. The lens driving section 212 may drive the shake correction mechanism by a stepping motor to perform shake correction. In addition, the shake correction mechanism may be driven by a stepping motor to move the image sensor 120 in a direction along the optical axis or a direction perpendicular to the optical axis to perform shake correction.
The memory 222 stores control values of the plurality of lenses 210 moved via the lens driving part 212. The memory 222 may include at least one of SRAM, DRAM, EPROM, EEPROM, USB memory, and other flash memories.
The imaging apparatus 100 configured as described above has a function of determining the distance from the lens to the object (object distance) in order to execute autofocus processing (AF processing) or the like. As a method of determining the object distance, there is a method of determining it based on the blur amounts of a plurality of images captured with different positional relationships between the lens and the imaging surface. This method is referred to herein as the Bokeh Detection Auto Focus (BDAF) method.
For example, the blur amount (Cost) of an image can be expressed by the following equation (1) using a Gaussian function. In equation (1), x represents a pixel position in the horizontal direction, and σ denotes the standard deviation.
[Equation (1): the blur amount (Cost) expressed as a Gaussian function of the horizontal pixel position x with standard deviation σ; the equation is rendered as an image in the original publication and is not reproduced here.]
Fig. 3 shows an example of the curve represented by equation (1). By moving the focus lens to the lens position corresponding to the lowest point 502 of the curve 500, it is possible to focus on the object contained in the image I.
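As an illustration of how the curve of fig. 3 is used, the minimal sketch below simply picks the lens position at the lowest point of a sampled blur-cost curve. The function name and the assumption that the costs have already been evaluated at discrete candidate lens positions are illustrative, not taken from the patent.

```python
import numpy as np

def best_lens_position(lens_positions, costs):
    # Return the lens position at the lowest point of the blur-cost curve
    # (point 502 of curve 500 in Fig. 3).  `costs` is assumed to hold the
    # blur amount evaluated at each candidate lens position.
    return lens_positions[int(np.argmin(costs))]
```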
Fig. 4 is a flowchart showing one example of the distance calculation process of the BDAF method. First, the imaging apparatus 100 captures a first image I1 in a state where the lens and the imaging surface are in a first positional relationship and stores it in the memory 130. Then, the focus lens or the imaging surface of the image sensor 120 is moved in the optical axis direction to bring the lens and the imaging surface into a second positional relationship, and the imaging apparatus 100 captures a second image I2 and stores it in the memory 130 (S101). For example, as in so-called hill-climbing AF, the focus lens or the imaging surface of the image sensor 120 is moved in the optical axis direction without passing the in-focus point. The amount of movement of the focus lens or the imaging surface of the image sensor 120 may be, for example, 10 μm.
Next, the imaging apparatus 100 divides the image I1 into a plurality of regions (S102). A feature amount may be calculated for each pixel of the image, and pixel groups having similar feature amounts may be treated as one region, thereby dividing the image I1 into the plurality of regions. Alternatively, the pixel group within the range of the AF processing frame in the image I1 may be divided into the plurality of regions. The imaging apparatus 100 divides the image I2 into a plurality of regions corresponding to the plurality of regions of the image I1. Based on the blur amount of each of the plurality of regions of the image I1 and the blur amount of each of the plurality of regions of the image I2, the imaging apparatus 100 calculates, for each of the plurality of regions, the distance to the object included in that region (S103).
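A minimal sketch of the region-wise processing of S102–S103 is shown below. The grid split and the Laplacian-variance blur score are stand-ins chosen for illustration; the patent's own blur amount of equation (1) could be substituted, and all names are hypothetical. The distance of S103 would then be derived from the pair of per-region blur amounts of I1 and I2, as described with formulas (2) to (4) below.

```python
import numpy as np

def region_blur_amounts(img, grid=(8, 8)):
    # Split a 2-D grayscale image into grid regions and return a blur score
    # per region.  A simple 4-neighbour Laplacian is used as a stand-in
    # sharpness measure: low Laplacian variance means a blurrier region.
    img = img.astype(np.float64)
    h, w = img.shape
    gh, gw = grid
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img)
    scores = np.zeros(grid)
    for i in range(gh):
        for j in range(gw):
            block = lap[i * h // gh:(i + 1) * h // gh,
                        j * w // gw:(j + 1) * w // gw]
            scores[i, j] = 1.0 / (block.var() + 1e-9)  # higher = blurrier
    return scores
```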
The calculation of the distance is further explained with reference to fig. 5. Let A be the distance from the lens L (principal point) to the object 510 (object plane), B be the distance from the lens L (principal point) to the position (image plane) where the object 510 forms an image on the imaging surface, and F be the focal length. In this case, the relationship among the distance A, the distance B, and the focal length F can be expressed by the following formula (2) according to the lens formula.
1/A + 1/B = 1/F    Formula (2)
The focal length F is determined by the lens position. Therefore, if the distance B at which the object 510 forms an image on the imaging surface can be specified, the distance A from the lens L to the object 510 can be specified using formula (2).
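The use of formula (2) can be sketched directly; the function below simply solves the thin-lens relation for the object distance A, given the image distance B and the focal length F (the function and parameter names are illustrative).

```python
def object_distance(image_distance_b, focal_length_f):
    # Formula (2): 1/A + 1/B = 1/F  =>  A = 1 / (1/F - 1/B).
    # Distances are in the same unit (e.g. millimetres); B must exceed F
    # for a real object in front of the lens.
    return 1.0 / (1.0 / focal_length_f - 1.0 / image_distance_b)
```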
As shown in fig. 5, the distance B, and in turn the distance A, can be specified by calculating the position where the object 510 forms an image based on the size of the blur of the object 510 projected on the imaging surface (the circles of confusion 512 and 514). That is, the imaging position can be specified by using the fact that the size of the blur (blur amount) is proportional to the distance between the imaging surface and the imaging position.
Here, let D1 be the distance from the lens L to the position of the image I1, which is closer to the imaging surface, and let D2 be the distance from the lens L to the position of the image I2, which is farther from the imaging surface. Each image is blurred. Let the point spread function at this time be PSF, and let the image data at D1 and D2 be Id1 and Id2, respectively. In this case, the image I1, for example, can be expressed by the following equation (3) using a convolution operation.
I1 = PSF * Id1    Formula (3)
Further, let f be the Fourier transform of the image data Id1 and Id2, and let OTF1 and OTF2 be the optical transfer functions (OTFs) obtained by Fourier-transforming the point spread functions PSF1 and PSF2 for the images Id1 and Id2. The ratio is then obtained as in the following formula (4).
C = (OTF1 · f(Id1)) / (OTF2 · f(Id2))    Formula (4)
The value C represented by formula (4) is the amount of change between the respective blur amounts of the images Id1 and Id2; that is, the value C corresponds to the difference between the blur amount of the image Id1 and the blur amount of the image Id2.
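A rough sketch of evaluating the frequency-domain ratio behind formula (4) follows. It simply divides the Fourier transforms of two crops of the same scene taken at the two lens positions, which is the standard depth-from-defocus way of exposing the change in blur; the regularisation constant and the function name are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def blur_change_ratio(img1, img2, eps=1e-6):
    # Element-wise ratio of the Fourier transforms of two captures of the
    # same (overlapping) region taken with different lens positions.  Since
    # both share the same underlying scene, the ratio reflects the ratio of
    # the two optical transfer functions, i.e. the C value of formula (4).
    F1 = np.fft.fft2(img1.astype(np.float64))
    F2 = np.fft.fft2(img2.astype(np.float64))
    return F1 / (F2 + eps)
```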
Based on the captured image captured by the imaging apparatus 100 configured as described above, a depth map is generated while suppressing the amount of movement of the focus lens. The depth map is data representing the distance to a subject for each pixel or for each block including a plurality of pixels.
The imaging control unit 110 includes an acquisition unit 112, a calculation unit 114, a generation unit 116, and a synthesis unit 118.
The imaging control unit 110 causes the imaging apparatus 100 to capture a plurality of captured images of a plurality of imaging ranges including overlapping ranges, in states where the positional relationship between the imaging surface of the imaging apparatus 100 and the focus lens of the imaging apparatus 100 differs, during a period in which the imaging direction of the imaging apparatus 100 is changed or during a period in which the imaging apparatus 100 moves along a first trajectory. The imaging control unit 110 may cause the imaging apparatus 100 to capture a plurality of captured images of a plurality of imaging ranges including overlapping ranges, in states where the positional relationship between the imaging surface of the imaging apparatus 100 and the focus lens of the imaging apparatus 100 differs, while the UAV 10 rotates while hovering at a predetermined location. The imaging range is the range of the space imaged by the imaging apparatus 100.
The imaging control unit 110 may cause the imaging apparatus 100 to capture a plurality of captured images in a plurality of imaging ranges including the overlap range in a state where the positional relationship between the imaging plane of the imaging apparatus 100 and the focus lens of the imaging apparatus 100 is different while the UAV10 is moving. The imaging control unit 110 may cause the imaging apparatus 100 to capture a plurality of captured images in a plurality of imaging ranges including the overlap range, while the UAV10 moves in a direction different from the imaging direction of the imaging apparatus 100, in a state where the positional relationship between the imaging plane of the imaging apparatus 100 and the focus lens of the imaging apparatus 100 is different.
The imaging control section 110 may move the focus lens while the UAV 10 rotates while hovering at a predetermined location, and cause the imaging apparatus 100 to capture a plurality of captured images of a plurality of imaging ranges including overlapping ranges. The imaging control section 110 may move the focus lens while the UAV 10 moves along the first trajectory, and cause the imaging apparatus 100 to capture a plurality of captured images of a plurality of imaging ranges including overlapping ranges.
The imaging control section 110 may alternately switch the position of the focus lens between a first position and a second position while causing the imaging apparatus 100 to capture a plurality of captured images of a plurality of imaging ranges including overlapping ranges, while the UAV 10 rotates while hovering at a predetermined location. The imaging control section 110 may alternately switch the position of the focus lens between the first position and the second position while causing the imaging apparatus 100 to capture a plurality of captured images of a plurality of imaging ranges including overlapping ranges, while the UAV 10 moves along the first trajectory.
As shown in fig. 6, the imaging control unit 110 may cause the imaging apparatus 100 to capture an image of a first imaging range 601 while the UAV 10 rotates while hovering at a predetermined location, in a state where the imaging surface of the imaging apparatus 100 and the focus lens of the imaging apparatus 100 are in a first positional relationship. The imaging control section 110 may cause the imaging apparatus 100 to capture an image of a second imaging range 602 in a state where the imaging surface of the imaging apparatus 100 and the focus lens of the imaging apparatus 100 are in a second positional relationship, the second imaging range 602 being different from the first imaging range 601 and including a first overlapping range 611 that overlaps with the first imaging range 601. The imaging control section 110 may cause the imaging apparatus 100 to capture an image of a third imaging range 603 in a state where the imaging surface of the imaging apparatus 100 and the focus lens of the imaging apparatus 100 are in a third positional relationship, the third imaging range 603 being different from the second imaging range 602 and including a second overlapping range 612 that overlaps with the second imaging range 602. The first imaging range 601 is different from the second imaging range 602, but may overlap with the second imaging range 602 by more than half. The second imaging range 602 is different from the third imaging range 603, but may overlap with the third imaging range 603 by more than half. The second imaging range 602 is different from both the first imaging range 601 and the third imaging range 603, but may overlap with the third imaging range 603 by more than half.
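The capture pattern of fig. 6 can be sketched as a simple plan that steps the yaw so that consecutive imaging ranges overlap by a chosen fraction and alternates the focus lens between two positions. All numbers and names here are illustrative assumptions, not values from the patent.

```python
def plan_captures(total_yaw_deg=360.0, hfov_deg=60.0, overlap=0.5,
                  focus_positions=(1.0, 0.5)):
    # Yaw step that leaves `overlap` of the horizontal field of view shared
    # between consecutive captures (overlap=0.5 -> ranges overlap by half).
    step = hfov_deg * (1.0 - overlap)
    plan, yaw, i = [], 0.0, 0
    while yaw < total_yaw_deg:
        plan.append({"yaw_deg": yaw,
                     "focus_m": focus_positions[i % len(focus_positions)]})
        yaw += step
        i += 1
    return plan
```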
The acquisition section 112 acquires a plurality of captured images captured by the imaging apparatus 100 during a change in the imaging direction of the imaging apparatus 100 or during movement of the imaging apparatus 100 along the first trajectory. The acquisition section 112 may acquire captured images of a plurality of imaging ranges captured by the imaging apparatus 100 in a state where the positional relationship between the imaging plane of the imaging apparatus 100 and the focus lens of the imaging apparatus 100 is different during rotation of the UAV10 while hovering at a predetermined location. The acquisition unit 112 may acquire captured images of a plurality of imaging ranges captured by the imaging apparatus 100 while the UAV10 moves along the first trajectory in a state where a positional relationship between an imaging plane of the imaging apparatus 100 and a focus lens of the imaging apparatus 100 is different.
The calculation unit 114 calculates the blur amount of each of the images in the overlapping range included in each of the plurality of captured images. The calculation unit 114 may calculate the blur amount (Cost) of each image based on equation (1) using a gaussian function.
The generation unit 116 generates a depth map including depth information corresponding to each of the plurality of overlapping ranges based on the blur amounts of the images of the plurality of overlapping ranges. The generation section 116 may generate, by the BDAF method, a depth map including depth information indicating the distance to the object for each pixel or for each block including a plurality of pixels of the images of the plurality of overlapping ranges, based on the respective blur amounts of the images of the plurality of overlapping ranges.
As shown in fig. 7A and 7B, the imaging control section 110 may set the focus lens to a first position (focus distance 1.0 m) and cause the imaging apparatus 100 to capture an image of the first imaging range 601 while the UAV 10 rotates while hovering at a predetermined location. The acquisition section 112 can acquire the first captured image 701 of the first imaging range 601. The imaging control unit 110 may set the focus lens to a second position (focus distance 0.5 m) and cause the imaging device 100 to capture an image of the second imaging range 602 while the UAV 10 rotates while hovering at the predetermined location. The acquisition section 112 can acquire the second captured image 702 of the second imaging range 602. Further, the imaging control section 110 may set the focus lens to the first position (focus distance 1.0 m) and cause the imaging apparatus 100 to capture an image of the third imaging range 603 while the UAV 10 rotates while hovering at the predetermined location. The acquisition unit 112 can acquire the third captured image 703 of the third imaging range 603.
The calculation section 114 may calculate the respective blur amounts of the first image 710 corresponding to the first overlapping range 611 included in the first captured image 701 and the second image 711 corresponding to the first overlapping range 611 included in the second captured image 702. The calculation section 114 may calculate the respective blur amounts of the third image 712 corresponding to the second overlapping range 612 included in the second captured image 702 and the fourth image 713 corresponding to the second overlapping range 612 included in the third captured image 703.
The generation section 116 may generate a depth map including depth information corresponding to the first overlapping range 611 based on the respective blur amounts of the first image 710 and the second image 711. The generation section 116 may generate a depth map further including depth information corresponding to the second overlapping range 612 based on the respective blur amounts of the third image 712 and the fourth image 713.
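Assembling the depth map from the pairs of overlap images (710/711, 712/713, and so on) can be sketched as below; `depth_from_blur` stands in for the BDAF distance estimation built on formulas (2)–(4), and the dictionary keyed by an overlap-range identifier is an illustrative representation, not the patent's data format.

```python
def build_depth_map(overlap_pairs, depth_from_blur):
    # overlap_pairs: iterable of (range_id, img_a, img_b), where img_a and
    # img_b are the two crops of the same overlapping range captured at the
    # two focus-lens positions.
    depth_map = {}
    for range_id, img_a, img_b in overlap_pairs:
        depth_map[range_id] = depth_from_blur(img_a, img_b)
    return depth_map
```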
As described above, the imaging device 100 according to the present embodiment generates a depth map corresponding to the overlapping ranges by the BDAF method based on a plurality of captured images whose imaging ranges are different from each other and partially overlap. With the BDAF method, the distance to the object can be specified without moving the focus lens from the nearest end side to the infinity end side. Further, the depth map is generated based on the captured images captured while the imaging direction of the imaging apparatus 100 changes or while the imaging apparatus 100 moves along the first trajectory. Accordingly, a depth map including depth information over a wide range can be generated in a relatively short time.
The imaging control unit 110 may control the position of the focus lens of the imaging apparatus 100 based on the depth map, and further cause the imaging apparatus 100 to capture a plurality of captured images.
The imaging control section 110 may cause the imaging device 100 to capture the first captured image and the second captured image while causing the imaging device 100 to make the first rotation around the first point so that the imaging direction of the imaging device 100 is changed. The imaging control unit 110 may control the position of the focus lens of the imaging apparatus 100 in accordance with the depth map and cause the imaging apparatus 100 to capture a plurality of captured images again while causing the imaging apparatus to perform the second rotation around the first point so that the imaging direction of the imaging apparatus 100 is changed.
The imaging control section 110 may cause the imaging apparatus 100 to capture the first captured image and the second captured image while the imaging apparatus 100 moves along the first trajectory for the first time, and may control the position of the focus lens of the imaging apparatus 100 in accordance with the depth map and cause the imaging apparatus 100 to capture a plurality of captured images again while the imaging apparatus 100 moves along the first trajectory for the second time.
The imaging control unit 110 may cause the imaging device 100 to capture a plurality of captured images whose imaging ranges are different from each other and partially overlap while the UAV 10 hovers and makes a first rotation around a first point. The acquisition section 112 may acquire the plurality of captured images whose imaging ranges are different from each other and partially overlap. The calculation section 114 may calculate the blur amounts of the images of the overlapping ranges. Further, the generation section 116 may generate a depth map including depth information corresponding to the overlapping ranges based on the blur amounts of the images of the overlapping ranges.
The imaging control section 110 may specify a distance to a desired object in accordance with the depth map. The imaging control unit 110 may control the position of the focus lens in accordance with the distance to a predetermined desired object while hovering the UAV10 and performing a second rotation around the first point, and may cause the imaging apparatus 100 to capture a plurality of captured images. The imaging control unit 110 may control the focus lens to a predetermined focus distance without following the depth map and cause the imaging device 100 to capture a plurality of captured images while making the UAV10 hover and making a first rotation around the first point.
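For the second rotation, the focus position can be chosen by looking up the depth recorded for the overlapping range nearest to the current imaging direction. The sketch below assumes, purely for illustration, that the depth map is keyed by yaw angle in degrees; the patent does not specify such a representation.

```python
def focus_for_direction(depth_map, yaw_deg, default_focus_m=1.0):
    # depth_map: {yaw_deg_of_overlap_center: distance_to_object_m}
    if not depth_map:
        return default_focus_m
    nearest = min(depth_map, key=lambda k: abs(k - yaw_deg))
    return depth_map[nearest]
```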
The imaging control section 110 may store the depth map and the plurality of captured images in association in a storage section such as the memory 130.
The synthesizing section 118 may synthesize the plurality of captured images to generate a panoramic image as shown in fig. 8. The synthesizing section 118 may associate the panoramic image with the depth map and store them in a storage section such as the memory 130.
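The panorama synthesis of fig. 8 is not spelled out in the patent; assuming OpenCV is available, a minimal stitching sketch could look like this.

```python
import cv2

def stitch_panorama(images):
    # images: list of BGR arrays captured during one rotation.
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(images)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama
```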
The ratio of the overlapping range between adjacent imaging ranges may be any ratio as long as the imaging ranges are different from each other. When generating a depth map including depth information for the entire imaging range captured by the imaging apparatus 100, the proportion of the overlapping range between adjacent imaging ranges is less than 1 and 1/2 or more. On the other hand, when generating a depth map including depth information for only part of the imaging range captured by the imaging apparatus 100, the proportion of the overlapping range between adjacent imaging ranges may be smaller than 1/2 and larger than 0.
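The overlap-ratio condition stated above can be captured in a one-line check; `full_coverage` selects between a depth map covering the entire imaging range and one covering only part of it (the names are illustrative).

```python
def overlap_ratio_valid(ratio, full_coverage=True):
    # Full coverage: 1/2 <= ratio < 1.  Partial coverage: 0 < ratio < 1/2.
    return (0.5 <= ratio < 1.0) if full_coverage else (0.0 < ratio < 0.5)
```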
Fig. 9 is a flowchart showing one example of an imaging process of the imaging apparatus 100 mounted on the UAV 10.
The UAV 10 starts flying (S200). The imaging mode of the imaging apparatus 100 is set to the panoramic mode with a depth map in accordance with an instruction from the user via the remote operation apparatus 300 (S202). The UAV control unit 30 moves the UAV 10 to a desired first point. Next, the imaging control unit 110 causes the UAV 10, via the UAV control unit 30, to rotate while hovering at the first point, moves the focus lens, and causes the imaging device 100 to capture a plurality of captured images whose imaging ranges are different from each other and partially overlap (S204). The imaging control section 110 may alternately switch the position of the focus lens between the first position and the second position while causing the imaging device 100 to capture the plurality of captured images whose imaging ranges are different from each other and partially overlap.
The calculation unit 114 calculates the blur amount of the image in the overlap range, and the generation unit 116 generates a depth map including depth information corresponding to the overlap range based on the blur amount (S206).
Next, the imaging control unit 110 causes the UAV10 to hover at the first point and rotate again via the UAV control unit 30, and causes the imaging apparatus 100 to capture a plurality of captured images while controlling the focus lens according to the depth map (S208). The imaging control unit 110 may specify a distance to a desired object in accordance with the depth map, and adjust the position of the focus lens in accordance with the distance. Then, the imaging control part 110 may rotate the UAV10 while hovering at the first location through the UAV control part 30, and capture a plurality of captured images at the adjusted position of the focus lens. For example, when the imaging apparatus 100 is caused to perform imaging at each of a plurality of focal distances, the imaging control unit 110 may cause the imaging apparatus 100 to perform imaging by rotating the UAV10 for each focal distance. When the UAV10 is mounted with a plurality of imaging apparatuses 100, the imaging control unit 110 may cause each of the plurality of imaging apparatuses 100 to take a plurality of captured images at different focal distances while the UAV10 makes one rotation.
The imaging control unit 110 stores a plurality of captured images and a depth map in the memory 130 in association with each other (S210). The synthesizing section 118 synthesizes a plurality of captured images to generate a panoramic image, and the imaging control section 110 may store this panoramic image in the memory 130 in association with the depth map.
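Putting the stages of fig. 9 together, an end-to-end sketch might look like the following. `camera`, `uav`, `estimate_depth`, and `synthesize` are hypothetical interfaces standing in for the imaging device 100, the UAV control section 30, the BDAF depth estimation, and the panorama synthesis; none of these names come from the patent, and the helpers `plan_captures` and `focus_for_direction` refer to the sketches earlier in this description.

```python
def panorama_with_depth_map(camera, uav, estimate_depth, synthesize):
    uav.hover()
    plan = plan_captures()                       # alternating focus positions

    # S204: first rotation, capturing partially overlapping imaging ranges.
    first_pass = []
    for shot in plan:
        uav.set_yaw(shot["yaw_deg"])
        camera.set_focus_distance(shot["focus_m"])
        first_pass.append(camera.capture())

    # S206: depth map from the blur amounts of the overlapping ranges.
    depth_map = estimate_depth(first_pass)

    # S208: second rotation, focusing according to the depth map.
    second_pass = []
    for shot in plan:
        uav.set_yaw(shot["yaw_deg"])
        camera.set_focus_distance(focus_for_direction(depth_map, shot["yaw_deg"]))
        second_pass.append(camera.capture())

    # S210: store/return the captured images together with the depth map.
    return synthesize(second_pass), depth_map
```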
The above has described an example in which the UAV 10 rotates 360 degrees and the generation section 116 generates a depth map including depth information over a range of 360 degrees. However, the UAV 10 may rotate by a rotation angle of less than 360 degrees, and the generation unit 116 may generate a depth map including depth information corresponding to the range of that rotation angle. An example in which the imaging device 100 of the UAV 10 rotates 360 degrees about the yaw axis has been described. However, the imaging apparatus 100 may also perform imaging in a three-dimensional space. When the imaging apparatus 100 performs imaging in a three-dimensional space, the orientation of the imaging apparatus 100 is adjusted by controlling the gimbal 50 so that the imaging apparatus 100 can perform imaging. For example, first the pitch axis and roll axis of the gimbal 50 are fixed and the imaging apparatus 100 is rotated about the yaw axis, and then the imaging apparatus 100 is tilted in the positive or negative direction about the pitch axis and rotated about the yaw axis again. Instead of controlling the attitude of the imaging device 100 through the gimbal, the attitude of the imaging device 100 may also be controlled by controlling the attitude of the UAV 10, provided the UAV 10 can be kept balanced. This enables the acquisition of a depth map in three-dimensional space. If a plurality of imaging apparatuses 100 are provided so that other overlapping ranges can also be obtained, a depth map can be obtained in less time than with a single imaging apparatus 100.
FIG. 10 illustrates one example of a computer 1200 in which aspects of the invention may be embodied, in whole or in part. The program installed on the computer 1200 can cause the computer 1200 to function as an operation associated with the apparatus according to the embodiment of the present invention or as one or more "sections" of the apparatus. Alternatively, the program can cause the computer 1200 to execute the operation or the one or more "sections". The program enables the computer 1200 to execute the processes or the stages of the processes according to the embodiments of the present invention. Such programs may be executed by CPU 1212 to cause computer 1200 to perform certain operations associated with some or all of the blocks in the flowchart and block diagrams described herein.
The computer 1200 of the present embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other via a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates in accordance with programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program or the like executed by the computer 1200 at startup, and/or a program that depends on the hardware of the computer 1200. The program is provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network. The program is installed in the RAM 1214 or the ROM 1230, which are also examples of a computer-readable recording medium, and is executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. An apparatus or method may be constituted by realizing operations or processing of information in accordance with the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded into the RAM 1214 and instruct the communication interface 1222 to carry out communication processing in accordance with the processing described in the communication program. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and transmits the read transmission data to the network, or writes reception data received from the network into a reception buffer or the like provided in the recording medium.
Further, the CPU 1212 may read all or a necessary portion of a file or database stored in an external recording medium such as a USB memory into the RAM 1214, and execute various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information, such as various types of programs, data, tables, and databases, may be stored in the recording medium and subjected to information processing. For data read from the RAM 1214, the CPU 1212 may execute various types of processing described throughout the present disclosure, including various types of operations specified by an instruction sequence of a program, information processing, condition judgment, conditional branching, unconditional branching, and retrieval/replacement of information, and write the results back to the RAM 1214. Further, the CPU 1212 can retrieve information in files, databases, and the like within the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve from the plurality of entries an entry matching a condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby acquire the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
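As a concrete illustration of the entry retrieval described above, the following is a minimal sketch; the table layout, variable names, and function name are assumptions made for illustration and are not part of the embodiment.

```python
def find_second_attribute(entries, condition):
    """Return the second-attribute value of the first entry whose
    first-attribute value satisfies the given condition, or None if
    no entry matches."""
    for first_value, second_value in entries:
        if condition(first_value):
            return second_value
    return None

# Example: look up the second attribute associated with a first attribute of 20 or more.
table = [(10, "near"), (20, "middle"), (30, "far")]
print(find_second_attribute(table, lambda value: value >= 20))  # -> "middle"
```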
The programs or software modules described above may be stored on the computer 1200 or in a computer-readable storage medium near the computer 1200. Further, a recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can be used as the computer-readable storage medium, whereby the program is provided to the computer 1200 via the network.
It should be noted that the order of execution of operations, procedures, steps, stages, and the like in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings may be implemented in any order, as long as the order is not explicitly indicated by expressions such as "before …" or "in advance", and as long as the output of a preceding process is not used in a subsequent process. Even if the operational flow in the claims, the specification, and the drawings is described using expressions such as "first" and "next" for convenience, this does not mean that the flow must be performed in this order.
The present invention has been described above using the embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes and modifications can be made to the above embodiments. As is apparent from the claims, embodiments to which such changes or modifications are made are also included in the technical scope of the present invention.

Claims (14)

  1. A control device, comprising:
    a control unit that causes an imaging device to capture an image of a first imaging range in a state where an imaging surface of the imaging device and a focus lens of the imaging device are in a first positional relationship, and causes the imaging device to capture an image of a second imaging range, which is different from the first imaging range and includes a first overlapping range that overlaps with the first imaging range, in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in a second positional relationship;
    an acquisition unit that acquires a first captured image of the first imaging range and a second captured image of the second imaging range captured by the imaging device;
    a calculation unit that calculates blur amounts of a first image corresponding to the first overlapping range included in the first captured image and a second image corresponding to the first overlapping range included in the second captured image; and
    a generation unit that generates a depth map including depth information corresponding to the first overlapping range, based on the blur amount of each of the first image and the second image.
  2. The control device according to claim 1, wherein the first imaging range and the second imaging range overlap by half or more.
  3. The control device according to claim 1, wherein:
    the control unit causes the imaging device to capture an image of a third imaging range, which is different from the second imaging range and includes a second overlapping range that overlaps with the second imaging range, in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in a third positional relationship;
    the acquisition unit further acquires a third captured image of the third imaging range captured by the imaging device;
    the calculation unit further calculates blur amounts of a third image corresponding to the second overlapping range included in the second captured image and a fourth image corresponding to the second overlapping range included in the third captured image;
    the generation unit generates the depth map further including depth information corresponding to the second overlapping range, based on the blur amount of each of the third image and the fourth image.
  4. The control device according to claim 3, wherein the second imaging range and the third imaging range overlap by half or more.
  5. The control device according to claim 1, wherein the control unit causes the imaging device to capture an image of the first imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in the first positional relationship, and causes the imaging device to capture an image of the second imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in the second positional relationship, while an imaging direction of the imaging device is changed.
  6. The control device according to claim 5, wherein the control unit causes the imaging device to capture the first captured image and the second captured image while causing the imaging device to make a first rotation around a first point so that the imaging direction of the imaging device is changed, and controls a position of the focus lens of the imaging device in accordance with the depth map and causes the imaging device to capture a plurality of captured images again while causing the imaging device to make a second rotation around the first point so that the imaging direction of the imaging device is changed.
  7. The control device according to claim 6, wherein the control unit stores the depth map and the plurality of captured images in a storage unit in association with each other.
  8. The control device according to claim 1, wherein the control unit causes the imaging device to capture an image of the first imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in the first positional relationship, and causes the imaging device to capture an image of the second imaging range in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in the second positional relationship, while the imaging device is moving along a first trajectory.
  9. The control device according to claim 8, wherein the control unit causes the imaging device to capture the first captured image and the second captured image during a first movement of the imaging device along the first trajectory, and controls a position of the focus lens of the imaging device in accordance with the depth map and causes the imaging device to capture a plurality of captured images again during a second movement of the imaging device along the first trajectory.
  10. The control device according to claim 9, wherein the control unit stores the depth map and the plurality of captured images in a storage unit in association with each other.
  11. An imaging device, comprising:
    the control device according to any one of claims 1 to 10; and
    the focus lens.
  12. A mobile body which includes the imaging device according to claim 11 and moves.
  13. A control method, comprising:
    a stage of causing an imaging device to capture an image of a first imaging range in a state where an imaging surface of the imaging device and a focus lens of the imaging device are in a first positional relationship, and causing the imaging device to capture an image of a second imaging range, which is different from the first imaging range and includes a first overlapping range that overlaps with the first imaging range, in a state where the imaging surface of the imaging device and the focus lens of the imaging device are in a second positional relationship;
    a stage of acquiring a first captured image of the first imaging range and a second captured image of the second imaging range captured by the imaging device;
    a stage of calculating blur amounts of a first image corresponding to the first overlapping range included in the first captured image and a second image corresponding to the first overlapping range included in the second captured image; and
    a stage of generating a depth map including depth information corresponding to the first overlapping range, based on the blur amount of each of the first image and the second image.
  14. A program for causing a computer to function as the control device according to any one of claims 1 to 10.
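
As an informal, non-limiting illustration only, the stages recited in claim 13 might be sketched as follows in Python. The camera object and its capture signature, the crop windows, the gradient-based blur measure, and the final blur-to-depth conversion are all assumptions made for this sketch and are not specified by the claims.

```python
import numpy as np

def blur_amount(image):
    """Illustrative blur measure: lower gradient energy is treated as more blur."""
    gy, gx = np.gradient(image.astype(float))
    return 1.0 / (np.mean(gx ** 2 + gy ** 2) + 1e-9)

def depth_for_overlapping_range(camera, first_range, second_range,
                                window_in_first, window_in_second,
                                first_focus_position, second_focus_position):
    # Stage 1: capture the first imaging range in the first positional
    # relationship and the second imaging range in the second one.
    first_image = camera.capture(first_range, focus_position=first_focus_position)
    second_image = camera.capture(second_range, focus_position=second_focus_position)

    # Stage 2: crop, from each captured image, the pixels corresponding to
    # the first overlapping range (windows are (y0, y1, x0, x1) tuples).
    y0, y1, x0, x1 = window_in_first
    first_crop = first_image[y0:y1, x0:x1]
    y0, y1, x0, x1 = window_in_second
    second_crop = second_image[y0:y1, x0:x1]

    # Stage 3: calculate a blur amount for each cropped image.
    first_blur = blur_amount(first_crop)
    second_blur = blur_amount(second_crop)

    # Stage 4: convert the blur-amount pair into depth information for the
    # overlapping range (placeholder linear model, purely illustrative).
    return (first_blur - second_blur) / (second_focus_position - first_focus_position)
```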
CN201980005098.0A 2018-06-27 2019-06-18 Control device, imaging device, mobile body, control method, and program Pending CN111226263A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018122419A JP6569157B1 (en) 2018-06-27 2018-06-27 Control device, imaging device, moving object, control method, and program
JP2018-122419 2018-06-27
PCT/CN2019/091780 WO2020001335A1 (en) 2018-06-27 2019-06-18 Control device, imaging device, moving object, control method and program

Publications (1)

Publication Number Publication Date
CN111226263A true CN111226263A (en) 2020-06-02

Family

ID=67844759

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980005098.0A Pending CN111226263A (en) 2018-06-27 2019-06-18 Control device, imaging device, mobile body, control method, and program

Country Status (3)

Country Link
JP (1) JP6569157B1 (en)
CN (1) CN111226263A (en)
WO (1) WO2020001335A1 (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0832865A (en) * 1994-07-13 1996-02-02 Nikon Corp Image pickup device
JP2003018438A (en) * 2001-07-05 2003-01-17 Fuji Photo Film Co Ltd Imaging apparatus
US20060198623A1 (en) * 2005-03-03 2006-09-07 Fuji Photo Film Co., Ltd. Image capturing apparatus, image capturing method, image capturing program, image recording output system and image recording output method
CN101673395A (en) * 2008-09-10 2010-03-17 深圳华为通信技术有限公司 Image mosaic method and image mosaic device
CN102223477A (en) * 2010-04-13 2011-10-19 索尼公司 Four-dimensional polynomial model for depth estimation based on two-picture matching
JP2013085291A (en) * 2012-12-28 2013-05-09 Canon Inc Image processing apparatus and image processing method
JP2013205516A (en) * 2012-03-27 2013-10-07 Nippon Hoso Kyokai <Nhk> Multi-focus camera
CN103793909A (en) * 2014-01-21 2014-05-14 东北大学 Single-vision overall depth information acquisition method based on diffraction blurring
CN104519328A (en) * 2013-10-02 2015-04-15 佳能株式会社 Image processing device, image capturing apparatus, and image processing method
JP2016066007A (en) * 2014-09-25 2016-04-28 キヤノン株式会社 Imaging apparatus and method for controlling the same
JP2017199958A (en) * 2016-04-25 2017-11-02 キヤノン株式会社 Imaging apparatus, control method thereof, and control program

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011259168A (en) * 2010-06-08 2011-12-22 Fujifilm Corp Stereoscopic panoramic image capturing device
JP2013044844A (en) * 2011-08-23 2013-03-04 Panasonic Corp Image processing device and image processing method
JP2015017999A (en) * 2011-11-09 2015-01-29 パナソニック株式会社 Imaging device
CN105472252B (en) * 2015-12-31 2018-12-21 天津远度科技有限公司 A kind of unmanned plane obtains the system and method for image
KR101694890B1 (en) * 2016-03-29 2017-01-13 주식회사 비젼인 Image processing system and method using 2D images from multiple points
WO2018212008A1 (en) * 2017-05-16 2018-11-22 富士フイルム株式会社 Image capturing device and image compositing device

Also Published As

Publication number Publication date
JP2020005108A (en) 2020-01-09
JP6569157B1 (en) 2019-09-04
WO2020001335A1 (en) 2020-01-02

Similar Documents

Publication Publication Date Title
CN108235815B (en) Imaging control device, imaging system, moving object, imaging control method, and medium
CN111567032B (en) Specifying device, moving body, specifying method, and computer-readable recording medium
CN110383812B (en) Control device, system, control method, and program
CN111356954B (en) Control device, mobile body, control method, and program
US20210014427A1 (en) Control device, imaging device, mobile object, control method and program
US20210105411A1 (en) Determination device, photographing system, movable body, composite system, determination method, and program
US10942331B2 (en) Control apparatus, lens apparatus, photographic apparatus, flying body, and control method
CN109844634B (en) Control device, imaging device, flight object, control method, and program
CN110785997B (en) Control device, imaging device, mobile body, and control method
CN111357271B (en) Control device, mobile body, and control method
CN111602385B (en) Specifying device, moving body, specifying method, and computer-readable recording medium
US11066182B2 (en) Control apparatus, camera apparatus, flying object, control method and program
CN110770667A (en) Control device, mobile body, control method, and program
CN111226170A (en) Control device, mobile body, control method, and program
CN112154371A (en) Control device, imaging device, mobile body, control method, and program
JP6569157B1 (en) Control device, imaging device, moving object, control method, and program
CN111213369B (en) Control device, control method, imaging device, mobile object, and computer-readable storage medium
CN112166374B (en) Control device, imaging device, mobile body, and control method
CN110383815B (en) Control device, imaging device, flying object, control method, and storage medium
CN114600446A (en) Control device, imaging device, mobile body, control method, and program
JP2021128208A (en) Control device, imaging system, mobile entity, control method, and program
CN111615663A (en) Control device, imaging system, mobile object, control method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200602