WO2022001561A1 - Control device, imaging device, control method, and program - Google Patents

Control device, imaging device, control method, and program

Info

Publication number
WO2022001561A1
Authority
WO
WIPO (PCT)
Prior art keywords
distance
distance information
imaging
image
subject
Prior art date
Application number
PCT/CN2021/097830
Other languages
English (en)
French (fr)
Inventor
周长波
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN202180006044.3A priority Critical patent/CN114586336A/zh
Publication of WO2022001561A1 publication Critical patent/WO2022001561A1/zh

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 Systems for automatic generation of focusing signals
    • G02B 7/36 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B 7/40 Systems for automatic generation of focusing signals using time delay of the reflected waves, e.g. of ultrasonic waves
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 13/32 Means for focusing
    • G03B 13/34 Power focusing
    • G03B 13/36 Autofocus systems
    • G03B 17/00 Details of cameras or camera bodies; Accessories therefor
    • G03B 17/18 Signals indicating condition of a camera member or suitability of light
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules

Definitions

  • the present invention relates to a control device, a camera device, a control method and a program.
  • Patent Document 1 describes "focus peaking that emphasizes the outline of the in-focus portion in the display of a live view image".
  • Patent Document 1: Japanese Re-publication of PCT International Publication No. 2017/188150
  • the accuracy of judging the in-focus state of the subject based on the contrast value may decrease. As a result, the accuracy of the position where the focus peaking should be determined based on the contrast value may be reduced.
  • the control device may be a control device for controlling an imaging device including a ranging sensor and a focusing lens.
  • the control device may include a circuit configured to acquire the contrast value of the first imaging region in the image captured by the imaging device.
  • the circuit may be configured to acquire first distance information from the ranging sensor, where the first distance information represents the distance to the subject associated with the first ranging area corresponding to the first imaging area.
  • the circuit may be configured to acquire second distance information, wherein the second distance information represents at least one of a distance or a range of distances to the subject in the in-focus state based on the position of the focus lens.
  • the circuit may be configured as follows: when the contrast value of the first imaging area is greater than or equal to a preset threshold, and the relationship between the distance represented by the first distance information and at least one of the distance or distance range represented by the second distance information satisfies a preset condition, an indicator indicating that the subject existing in the first imaging area is in focus is superimposed on the first imaging area of the image and displayed on the display unit together with the image.
  • the circuit may be configured to determine that the relationship between the distance represented by the first distance information and the distance represented by the second distance information satisfies the preset condition when the difference between the two distances is less than a preset threshold.
  • the circuit may be configured to determine that the relationship satisfies the preset condition when the distance represented by the first distance information is included in the distance range represented by the second distance information.
  • the circuit may be configured to determine that the relationship satisfies the preset condition when the difference between the distance represented by the first distance information and the distance represented by the second distance information is less than a preset threshold, and the distance represented by the first distance information is also included in the distance range represented by the second distance information.
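The three decision variants above can be sketched as one hedged helper. The function name, parameter names, and default threshold below are illustrative assumptions for this sketch, not values from the claims:

```python
def satisfies_condition(measured_dist, focus_dist, focus_range=None, threshold=1.0):
    """Check whether the TOF-measured distance (first distance information)
    agrees with the distance, and optionally the distance range, implied by
    the focus lens position (second distance information).

    measured_dist: distance from the ranging sensor, metres
    focus_dist:    in-focus distance derived from the lens position, metres
    focus_range:   optional (near, far) depth-of-field range, metres
    threshold:     maximum allowed difference between the two distances
    """
    close_enough = abs(measured_dist - focus_dist) < threshold
    if focus_range is None:
        return close_enough                 # variant 1: difference only
    near, far = focus_range
    in_range = near <= measured_dist <= far  # variant 2: range membership
    return close_enough and in_range         # variant 3: both must hold
```

Variant 2 alone corresponds to calling the helper with a very large `threshold`; the patent leaves the choice of variant to the implementation.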
  • the circuit may be configured to superimpose the index on the edge portion of the subject existing in the first imaging area of the image, and display the index on the display unit together with the image.
  • the ranging sensor may be a TOF sensor.
  • An imaging device may include the above-mentioned control device, a distance measuring sensor, and a focusing lens.
  • the control method involved in one aspect of the present invention may be a control method for controlling an imaging device including a ranging sensor and a focusing lens.
  • the control method may include: a stage of acquiring a contrast value of the first imaging region within the image captured by the imaging device.
  • the control method may include: a stage of acquiring first distance information from a ranging sensor, wherein the first distance information represents a distance to the subject associated with the first ranging area corresponding to the first imaging area.
  • the control method may include a stage of acquiring second distance information based on the position of the focusing lens, wherein the second distance information represents at least one of a distance or a distance range to a subject that becomes in-focus.
  • the control method may include: when the contrast value of the first imaging area is greater than or equal to a preset threshold, and the relationship between the distance represented by the first distance information and at least one of the distance or distance range represented by the second distance information satisfies a preset condition, a stage of superimposing an indicator indicating that the subject existing in the first imaging area is in focus on the first imaging area of the image and displaying it on the display unit together with the image.
  • the program according to one aspect of the present invention may be a program for causing a computer to function as the above-mentioned control device.
  • FIG. 1 is a diagram showing functional blocks of an imaging apparatus.
  • FIG. 2 is a diagram illustrating a display example of focus peaking.
  • FIG. 3 is a diagram for explaining an imaging area.
  • FIG. 4 is a diagram for explaining a ranging area.
  • FIG. 5 is a flowchart showing an example of a processing procedure of focus peaking.
  • FIG. 6 is a diagram showing an example of the appearance of the unmanned aerial vehicle and the remote control device.
  • FIG. 7 is a diagram showing an example of a hardware configuration.
  • blocks may represent (1) stages of processes for performing operations or (2) "portions" of means that function to perform operations.
  • Certain stages and “sections” may be implemented by programmable circuits and/or processors.
  • Dedicated circuits may include digital and/or analog hardware circuits.
  • Integrated circuits (ICs) and/or discrete circuits may be included.
  • Programmable circuits may include reconfigurable hardware circuits.
  • Reconfigurable hardware circuits may include logic elements such as logical AND, logical OR, logical XOR, logical NAND, logical NOR and other logical operations, as well as memory elements such as flip-flops and registers, field-programmable gate arrays (FPGA), programmable logic arrays (PLA), and the like.
  • Computer-readable media can include any tangible device that can store instructions for execution by a suitable device.
  • a computer-readable medium having instructions stored thereon includes an article of manufacture comprising the instructions executable to create means for performing the operations specified in the flowchart or block diagram.
  • electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like may be included.
  • A floppy (registered trademark) disk, hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile disc (DVD), Blu-ray (registered trademark) disc, memory stick, integrated circuit card, etc. may be included.
  • Computer readable instructions may include either source code or object code described in any combination of one or more programming languages.
  • Source code or object code includes conventional procedural programming languages.
  • Conventional procedural programming languages may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark) and C++, and the "C" programming language or similar programming languages.
  • Computer readable instructions may be provided to the processor or programmable circuitry of a general purpose computer, special purpose computer or other programmable data processing apparatus locally or via a local area network (LAN), a wide area network (WAN) such as the Internet.
  • a processor or programmable circuit may execute computer readable instructions to create means for performing the operations specified in the flowcharts or block diagrams.
  • Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
  • FIG. 1 shows an example of functional blocks of the imaging apparatus 100 according to the present embodiment.
  • the imaging device 100 includes an imaging unit 102, a TOF sensor 160, and a lens unit 200.
  • the imaging unit 102 includes an image sensor 120, an imaging control unit 110, a memory 170, a display unit 180, and an operation unit 182.
  • the image sensor 120 may be composed of CCD or CMOS.
  • the image sensor 120 outputs image data of the optical image formed by the plurality of lenses 154 to the imaging control unit 110 .
  • the imaging control unit 110 may be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • the imaging control unit 110 performs demosaic processing on the image signal output from the image sensor 120 in accordance with an operation command of the imaging device 100 from the operation unit 182 to generate image data.
  • the imaging control unit 110 stores the image data in the memory 170 .
  • the imaging control unit 110 controls the TOF sensor 160 .
  • the imaging control unit 110 is an example of a circuit.
  • the TOF sensor 160 is a time-of-flight sensor that measures the distance to an object.
  • the imaging apparatus 100 performs focus control by adjusting the position of the focus lens based on the distance measured by the TOF sensor 160 .
  • the memory 170 may be a computer-readable storage medium, and may include at least one of flash memory such as SRAM, DRAM, EPROM, EEPROM, and USB memory.
  • the memory 170 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • the memory 170 may be provided inside the casing of the camera device 100 .
  • the plurality of lenses 154 may function as zoom lenses, variable focal length lenses, and focus lenses. At least some or all of the plurality of lenses 154 are configured to be movable along the optical axis.
  • the lens control unit 150 drives the lens driving unit 152 according to the lens control command from the imaging control unit 110 to move one or more lenses 154 in the optical axis direction.
  • the lens control commands are, for example, zoom control commands and focus control commands.
  • the lens driving unit 152 may include a voice coil motor (VCM) that moves at least a part or all of the plurality of lenses 154 in the optical axis direction.
  • the lens driving unit 152 may include a motor such as a DC motor, a hollow cup motor, or an ultrasonic motor.
  • the lens driving unit 152 may transmit power from the motor to at least a part or all of the plurality of lenses 154 via mechanical components such as a cam ring and a guide shaft to move at least a part or all of the plurality of lenses 154 along the optical axis.
  • An example in which the plurality of lenses 154 and the imaging device 100 are integrated will be described.
  • the plurality of lenses 154 may be interchangeable lenses, and may be configured separately from the imaging apparatus 100 .
  • the display part 180 can display the image output from the image sensor 120 .
  • the display part 180 can display various setting information of the camera 100 .
  • the display part 180 may be a liquid crystal display, a touch screen display, or the like.
  • the display part 180 may include a plurality of liquid crystal displays or touch screen displays.
  • the TOF sensor 160 includes a light-emitting unit 162, a light-receiving unit 164, a light-emitting control unit 166, a light-receiving control unit 167, and a memory 168.
  • TOF sensor 160 is an example of a ranging sensor.
  • the camera 100 may include other ranging sensors, such as a stereo camera that performs ranging based on parallax, instead of the TOF sensor 160 .
  • the light-emitting portion 162 includes at least one light-emitting element 163 .
  • the light-emitting element 163 is a device, such as an LED or a laser, that repeatedly emits high-speed modulated pulsed light.
  • the light emitting element 163 may emit pulsed light which is infrared light.
  • the light emission control unit 166 controls light emission of the light emitting element 163 .
  • the light emission control section 166 can control the pulse width of the pulsed light emitted from the light emitting element 163 .
  • the light-receiving section 164 includes a plurality of light-receiving elements 165 that measure the distance to the subject associated with each of the plurality of ranging areas.
  • the plurality of light-receiving elements 165 respectively correspond to the plurality of ranging areas.
  • the light receiving element 165 repeatedly receives the reflected light of the pulsed light from the object.
  • the light receiving control unit 167 controls the light receiving element 165 to receive light.
  • the light receiving control unit 167 measures the distance to the subject associated with each of the plurality of ranging areas based on the amount of reflected light repeatedly received by the light receiving element 165 during the preset light receiving period.
  • the light receiving control unit 167 can measure the distance to the subject by determining the phase difference between the pulsed light and the reflected light based on the amount of the reflected light repeatedly received by the light receiving element 165 during the preset light receiving period.
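As a hedged numeric illustration of the phase-difference principle described above (the helper name and the example modulation frequency are assumptions; the sensor's actual algorithm is not specified in the text):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_rad, mod_freq_hz):
    """Indirect time-of-flight: a phase shift `phase_rad` between the emitted
    pulsed light and its received reflection corresponds to a round-trip
    delay of phase / (2 * pi * f); halving gives the one-way distance."""
    round_trip_s = phase_rad / (2 * math.pi * mod_freq_hz)
    return C * round_trip_s / 2.0
```

For example, a quarter-cycle phase shift at 10 MHz modulation corresponds to roughly 3.75 m; note that the unambiguous range of this method is limited to c / (2f), i.e. 15 m at 10 MHz.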
  • the light receiving unit 164 can also measure the distance to the subject by reading the frequency change of the reflected wave; this is called the FMCW (Frequency Modulated Continuous Wave) method.
  • the memory 168 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, and EEPROM.
  • the memory 168 stores programs required by the light emission control unit 166 to control the light emission unit 162 , programs required by the light reception control unit 167 to control the light reception unit 164 , and the like.
  • the TOF sensor 160 can measure the distance to the subject associated with each of the plurality of ranging areas corresponding to the number of pixels of the light receiving unit 164 .
  • the imaging apparatus 100 configured in this way adjusts the position of the focus lens in accordance with an instruction from the user in the manual focus mode.
  • When operating in the manual focus mode, the imaging apparatus 100 performs focus peaking. Focus peaking is a function that assists in adjusting the position of the focusing lens by visually showing the user the degree to which the subject is in focus. For example, as shown in FIG. 2, the imaging apparatus 100 may emphasize the in-focus portion by superimposing the index 184 on the edge portion of the in-focus subject in the image 185. When operating in the auto-focus mode, the imaging apparatus 100 may also perform focus peaking to visually show the user which area is in focus.
  • the imaging apparatus 100 determines an area in the image where the contrast value is greater than or equal to a preset threshold, superimposes an index indicating that the subject included in the determined area is in focus on the image captured by the imaging apparatus 100, and displays it on the display unit 180.
  • In a noisy image, however, the contrast value of the in-focus subject and the contrast value of the out-of-focus subject may be almost the same. Therefore, even when the in-focus state is judged from the difference in the magnitude of the contrast value of each area in the image, the imaging apparatus 100 may not be able to identify the edge portion of the subject that is actually in focus.
  • For example, when the imaging apparatus 100 shoots in low light with a small aperture and a long focal length lens, the captured image is unclear and noisy. In such an image there is almost no difference between the contrast value of the in-focus subject and the contrast value of the out-of-focus subject. In this case, the entire edge portion, including the edge portion of the subject that is not actually in focus, is emphasized and displayed on the display unit 180 together with the image captured by the imaging device 100.
  • It is conceivable to adjust the threshold according to the resolution of the imaging apparatus 100 and the noise level of the image it captures.
  • the image pickup apparatus 100 with the interchangeable lens mounted thereon may not be able to appropriately set the threshold value.
  • the imaging apparatus 100 may not be able to distinguish the contrast value of the in-focus subject from the contrast value of the out-of-focus subject only by adjusting the magnitude of the threshold value.
  • It is also conceivable for the user to set the threshold manually according to the shooting environment. For example, three threshold levels of "high", "medium" and "low" could be stored in the memory in advance, and the user selects one according to the shooting environment. However, even in this case, in a shooting environment with a high noise level, the imaging apparatus 100 may still be unable to distinguish the contrast value of the in-focus subject from that of the out-of-focus subject merely by adjusting the magnitude of the threshold.
  • the imaging apparatus 100 uses the distance measurement result of the TOF sensor 160 as a determination criterion to specify an area including the subject in the in-focus state in the image.
  • the imaging control unit 110 acquires the contrast value of the first imaging region in the image captured by the imaging device 100 .
  • the imaging control unit 110 acquires first distance information from the TOF sensor 160 , where the first distance information represents the distance to the subject associated with the first ranging area corresponding to the first imaging area.
  • the imaging control unit 110 acquires second distance information indicating at least one of a distance or a distance range to the subject in the in-focus state based on the position of the focus lens.
  • a table in which the position of the focus lens corresponds to the distance to the subject in the in-focus state may be stored in the memory 170 . Accordingly, the imaging control unit 110 can specify the current position of the focus lens, and by referring to the table, can specify the distance to the subject corresponding to the current position of the focus lens.
  • a table in which the position of the focus lens corresponds to the depth of field representing the distance range to the subject in the in-focus state may be stored in the memory 170 . Accordingly, the imaging control unit 110 can determine the current position of the focus lens, and by referring to the table, can determine the depth of field corresponding to the current position of the focus lens. The imaging control unit 110 can derive the depth of field at the current focus lens position as the distance range to the in-focus subject based on a preset function including a value corresponding to the focus lens position as a variable.
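One plausible sketch of the table lookup described above. The table entries, encoder-step positions, and nearest-entry lookup are illustrative assumptions; a real device would use its own calibration data and likely interpolate:

```python
# Hypothetical calibration table: focus lens position (encoder steps)
# -> (in-focus distance in metres, (near, far) depth-of-field range).
FOCUS_TABLE = {
    0:   (0.5, (0.45, 0.55)),
    100: (1.0, (0.9, 1.2)),
    200: (2.0, (1.7, 2.5)),
    300: (5.0, (3.5, 9.0)),
}

def second_distance_info(lens_position):
    """Return (distance, (near, far)) for the nearest tabulated lens
    position; this is the 'second distance information' of the text."""
    nearest = min(FOCUS_TABLE, key=lambda p: abs(p - lens_position))
    return FOCUS_TABLE[nearest]
```

The text also allows deriving the depth of field from a preset function of the lens position instead of a table; the table form is simply easier to store in the memory 170.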
  • the imaging control unit 110 When the contrast value of the first imaging area is greater than or equal to the preset threshold, and the relationship between the distance indicated by the first distance information and at least one of the distance or distance range indicated by the second distance information satisfies the preset condition, the imaging control unit 110 superimposes an indicator indicating that the subject existing in the first imaging area is in focus on the first imaging area of the image and displays it on the display unit 180 together with the image.
  • the imaging control unit 110 may superimpose a line along the edge portion of the subject in the in-focus state on the image as an index and display it on the display unit 180 .
  • the imaging control unit 110 may superimpose the set of points along the edge portion of the subject in the in-focus state on the image as an index, and display it on the display unit 180 .
  • When the difference between the distance indicated by the first distance information and the distance indicated by the second distance information is less than a preset threshold, the imaging control unit 110 may determine that the relationship between the two distances satisfies the preset condition.
  • When the distance indicated by the first distance information is included in the distance range indicated by the second distance information, the imaging control unit 110 may determine that the relationship between the distance indicated by the first distance information and the distance range indicated by the second distance information satisfies the preset condition.
  • When both of the above hold, that is, when the difference is less than the preset threshold and the distance indicated by the first distance information also falls within the distance range indicated by the second distance information, the imaging control section 110 may determine that the preset condition is satisfied.
  • the imaging control unit 110 divides the image captured by the imaging device 100 into a plurality of imaging regions, and calculates a contrast value for each of the plurality of imaging regions.
  • the contrast value may be a value representing the spread of the distribution of pixel values included in the imaging area.
  • the imaging control unit 110 may calculate the contrast value by (Lmax ⁇ Lmin)/(Lmax+Lmin), with the maximum luminance in the imaging area being Lmax and the minimum luminance being Lmin.
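A minimal sketch of this contrast computation. The x100 scaling is an assumption added so values are comparable to the example threshold of 60 used later in the text; the patent gives the formula only in its unscaled form:

```python
def contrast_value(region):
    """Contrast of an imaging region as described in the text:
    (Lmax - Lmin) / (Lmax + Lmin), where Lmax and Lmin are the maximum
    and minimum luminance in the region. `region` is any iterable of
    luminance values, e.g. a flattened pixel block."""
    lmax, lmin = max(region), min(region)
    if lmax + lmin == 0:
        return 0.0  # avoid division by zero for an all-black region
    return 100.0 * (lmax - lmin) / (lmax + lmin)
```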
  • the imaging control unit 110 determines the distance to the subject in relation to each ranging area measured by the TOF sensor 160 .
  • the imaging control unit 110 can determine a ranging area corresponding to each imaging area based on the preset correspondence between each imaging area and each ranging area in the image.
  • the imaging control unit 110 calculates the contrast value of each imaging area (1) to (9) in the image captured by the imaging device 100.
  • the imaging control section 110 determines the distance to the subject associated with each of the ranging areas (1) to (9), corresponding to the respective imaging areas (1) to (9), as measured by the TOF sensor 160.
  • the imaging control section 110 sets the threshold value of the contrast value to 60. Thereby, the imaging control unit 110 determines the imaging area (1) and the imaging area (5) as imaging areas having contrast values higher than the threshold value.
  • the imaging control unit 110 determines that the distance to the in-focus subject is 1 m based on the current position of the focus lens. Then, when the difference between the distance to the subject based on the current focus lens position and the distance to the subject based on the distance measurement result of the TOF sensor 160 is less than or equal to 1 m, the imaging control unit 110 determines that the preset condition is satisfied. The imaging control unit 110 determines the ranging area (5) as the ranging area satisfying the preset condition.
  • the imaging control unit 110 determines the imaging area (5) corresponding to the ranging area (5) as the imaging area including the in-focus subject.
  • the imaging control unit 110 determines the imaging area (1), which has a contrast value equal to or greater than the threshold, as an imaging area including noise, and excludes it from the imaging areas including the in-focus subject. Then, the imaging control section 110 emphasizes the edge portion of the subject included in the imaging area (5) and displays the image captured by the imaging device 100 on the display section 180.
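The worked example above can be sketched end to end. The per-area contrast values and distances below are invented stand-ins, chosen so that areas (1) and (5) clear the contrast threshold but only (5) also passes the TOF distance check, matching the text:

```python
# Invented stand-in data for the 3x3 grid of imaging/ranging areas (1)..(9).
contrast = {1: 75, 2: 20, 3: 30, 4: 10, 5: 90, 6: 25, 7: 15, 8: 40, 9: 35}
tof_distance = {1: 4.0, 2: 3.0, 3: 3.5, 4: 2.8, 5: 1.2, 6: 2.6, 7: 5.0, 8: 4.5, 9: 3.9}

CONTRAST_THRESHOLD = 60   # contrast threshold from the text's example
FOCUS_DISTANCE = 1.0      # in-focus distance from the current lens position, m
DISTANCE_THRESHOLD = 1.0  # allowed TOF/focus disagreement from the example, m

def in_focus_areas():
    """Areas to highlight: contrast at or above the threshold AND a TOF
    distance that agrees with the focus-lens distance. Area (1) has high
    contrast but a disagreeing distance, so it is rejected as noise."""
    return [a for a in sorted(contrast)
            if contrast[a] >= CONTRAST_THRESHOLD
            and abs(tof_distance[a] - FOCUS_DISTANCE) <= DISTANCE_THRESHOLD]
```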
  • FIG. 5 is a flowchart showing an example of a processing procedure of focusing peaking performed by the imaging control unit 110 .
  • the imaging control unit 110 determines the distance to the subject associated with each ranging area corresponding to each imaging area in the image based on the ranging result of the TOF sensor 160 ( S100 ).
  • the imaging control unit 110 specifies the current position of the focus lens.
  • the imaging control unit 110 determines the distance to the subject in the in-focus state based on the determined current focus lens position ( S102 ).
  • the imaging control unit 110 determines whether there is an imaging area that satisfies a preset condition in relation to the distance to the in-focus subject based on the current focus lens position (S104). For example, the imaging control unit 110 determines whether there is an imaging area for which the difference between the distance to the in-focus subject based on the current focus lens position and the distance to the subject based on the distance measurement result of the TOF sensor 160 is equal to or less than a preset threshold (e.g., 1 m).
  • the imaging control unit 110 determines the contrast value of the corresponding imaging area (S106). Next, the imaging control unit 110 determines whether the determined contrast value is equal to or greater than a preset threshold value (S108). If it is, the imaging control unit 110 highlights the subject existing in the corresponding imaging area on the display unit 180 (S110). The imaging control section 110 may emphasize the edge portion of the subject existing in the corresponding imaging area and display the image captured by the imaging device 100 on the display section 180.
  • Otherwise, the imaging control unit 110 displays the image captured by the imaging device 100 on the display unit 180 without highlighting the edge portions of the subjects existing in the imaging areas (S112).
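The flow of FIG. 5 (S100 to S112) can be sketched as a single filtering pass. The function name, the callable parameters, and the default thresholds are assumptions for illustration:

```python
def focus_peaking_pass(areas, tof_distance_of, focus_distance, contrast_of,
                       dist_threshold=1.0, contrast_threshold=60):
    """One pass of the FIG. 5 procedure: return the set of imaging areas
    whose edges should be highlighted; an empty set means plain display.

    S100: tof_distance_of(area) gives the TOF distance per ranging area.
    S102: focus_distance is derived from the current focus lens position.
    S104: keep areas whose two distances agree within dist_threshold.
    S106/S108: of those, keep areas whose contrast clears the threshold.
    S110/S112: the caller highlights the returned areas (or none).
    """
    candidates = [a for a in areas
                  if abs(tof_distance_of(a) - focus_distance) <= dist_threshold]
    return {a for a in candidates if contrast_of(a) >= contrast_threshold}
```

Note the order matches the flowchart: the distance check (S104) gates the contrast check (S106/S108), so noisy high-contrast areas at the wrong distance never reach the highlighting step.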
  • the range measurement result of the TOF sensor 160 is used to specify the area including the in-focus subject in the image.
  • the above-described imaging device 100 may be mounted on a mobile body.
  • the camera device 100 may also be mounted on an unmanned aerial vehicle (UAV) as shown in FIG. 6 .
  • the UAV 1000 may include a UAV body 20, a gimbal 50, a plurality of camera devices 60, and a camera device 100.
  • the gimbal 50 and the camera device 100 are an example of a camera system.
  • UAV 1000 is an example of a moving object propelled by a propelling unit.
  • the concept of a moving object includes, in addition to a UAV, a flying object such as an airplane moving in the air, a vehicle moving on the ground, and a ship moving on the water.
  • UAV body 20 includes a plurality of rotors.
  • a plurality of rotors is one example of a propulsion section.
  • the UAV main body 20 makes the UAV 1000 fly by controlling the rotation of the plurality of rotors.
  • the UAV main body 20 uses, for example, four rotors to make the UAV 1000 fly.
  • the number of rotors is not limited to four.
  • the UAV 1000 can also be a fixed-wing aircraft without rotors.
  • the imaging device 100 is an imaging camera that captures a subject included in a desired imaging range.
  • the gimbal 50 rotatably supports the camera device 100 .
  • The gimbal 50 is one example of a support mechanism.
  • the gimbal 50 rotatably supports the camera device 100 about the pitch axis using an actuator.
  • the gimbal 50 further supports the camera 100 rotatably around the roll axis and the yaw axis, respectively, using an actuator.
  • the gimbal 50 can change the posture of the camera 100 by rotating the camera 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensing cameras that capture images of the surroundings of the UAV 1000 in order to control the flight of the UAV 1000 .
  • the two camera devices 60 may be installed on the nose of the UAV 1000, that is, on the front. The other two camera devices 60 may be installed on the bottom surface of the UAV 1000.
  • the two imaging devices 60 on the front side may be paired to function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side may be paired to function as stereo cameras.
  • Three-dimensional space data around the UAV 1000 can be generated from images captured by the plurality of cameras 60 .
  • the number of camera devices 60 included in UAV 1000 is not limited to four.
  • the UAV 1000 only needs to include at least one camera device 60 .
  • the UAV 1000 may also include at least one camera device 60 on each of the nose, the tail, the sides, the bottom surface, and the top surface of the UAV 1000.
  • the angle of view that can be set in the camera device 60 may be larger than the angle of view that can be set in the camera device 100 .
  • the camera device 60 may also include a single focus lens or a fisheye lens.
  • the remote operation device 600 communicates with the UAV 1000 to remotely operate the UAV 1000 .
  • the remote operation device 600 can wirelessly communicate with the UAV 1000 .
  • the remote control device 600 transmits instruction information indicating various commands related to movement of the UAV 1000, such as ascending, descending, acceleration, deceleration, forward, backward, and rotation, to the UAV 1000 .
  • the instruction information includes, for example, instruction information to raise the altitude of UAV 1000 .
  • the instruction information may indicate the altitude at which the UAV 1000 should be located.
  • UAV 1000 moves so as to be located at the height indicated by the instruction information received from remote operation device 600 .
  • the instruction information may include an ascending instruction to ascend the UAV 1000 .
  • the UAV 1000 ascends while it is receiving the ascend command. When the altitude of the UAV 1000 has reached its upper limit, the UAV 1000 may be restricted from ascending even if it accepts the ascend command.
  • FIG. 7 illustrates one example of a computer 1200 that may embody, in whole or in part, aspects of the invention.
  • a program installed on the computer 1200 enables the computer 1200 to function as operations associated with the apparatus according to the embodiment of the present invention, or as one or more "sections" of the apparatus. Alternatively, the program can cause the computer 1200 to perform the operations or the one or more "sections".
  • the program enables the computer 1200 to execute the process or stages of the process involved in the embodiments of the present invention.
  • Such programs can be executed by CPU 1212 to cause computer 1200 to perform specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
  • the computer 1200 of the present embodiment includes a CPU 1212 and a RAM 1214 , which are connected to each other through the host controller 1210 .
  • the computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through the input/output controller 1220.
  • Computer 1200 also includes ROM 1230 .
  • the CPU 1212 operates according to programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
  • the communication interface 1222 communicates with other electronic devices through a network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200 .
  • the ROM 1230 stores therein a boot program and the like executed by the computer 1200 at the time of operation, and/or programs depending on the hardware of the computer 1200.
  • the program is provided through a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or through a network.
  • the program is installed in the RAM 1214 or the ROM 1230 , which are also examples of a computer-readable recording medium, and executed by the CPU 1212 .
  • the information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above.
  • the apparatus or method may be constituted by implementing operations or processing of information with the use of the computer 1200 .
  • the CPU 1212 may execute a communication program loaded in the RAM 1214, and instruct the communication interface 1222 to perform communication processing according to the processing described by the communication program.
  • under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and transmits the read data to the network, or writes reception data received from the network into a reception buffer provided in the recording medium.
  • the CPU 1212 can cause the RAM 1214 to read all or a necessary part of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data on the RAM 1214 . Next, the CPU 1212 can write the processed data back to the external recording medium.
  • on data read from the RAM 1214, the CPU 1212 can perform various types of processing described throughout this disclosure and specified by the program's instruction sequence, including operations, information processing, conditional judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214.
  • the CPU 1212 can retrieve information in files, databases, and the like within the recording medium.
  • when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve from the plurality of entries an entry matching the condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby acquire the attribute value of the second attribute associated with the first attribute satisfying the preset condition.
  • the programs or software modules described above may be stored on computer 1200 or on a computer-readable storage medium near computer 1200 .
  • a recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, thereby supplying the program to the computer 1200 through a network.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Automatic Focus Adjustment (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)
  • Indication In Cameras, And Counting Of Exposures (AREA)

Abstract

A control device that controls an imaging device including a ranging sensor and a focus lens. The control device may include a circuit configured to: acquire a contrast value of a first imaging area within an image captured by the imaging device; acquire first distance information from the ranging sensor, the first distance information indicating a distance to a subject associated with a first ranging area corresponding to the first imaging area; acquire second distance information indicating, based on a position of the focus lens, at least one of a distance or a distance range to a subject that is in focus; and, when the contrast value of the first imaging area is greater than or equal to a preset threshold and the relationship between the distance indicated by the first distance information and at least one of the distance or the distance range indicated by the second distance information satisfies a preset condition, superimpose an indicator showing that the subject present in the first imaging area is in focus on the first imaging area of the image and display it together with the image on a display unit.

Description

Control device, imaging device, control method, and program [Technical Field]
The present invention relates to a control device, an imaging device, a control method, and a program.
[Background Art]
Patent Document 1 describes "focus peaking, which emphasizes the outline of the in-focus portion when displaying a live-view image".
[Prior Art Documents]
[Patent Documents]
[Patent Document 1] Japanese Patent Republication No. WO2017/188150
[Summary of the Invention]
[Technical Problem to Be Solved by the Invention]
Depending on the shooting environment, the accuracy with which the in-focus state of a subject is judged from contrast values may decrease. As a result, the accuracy of the positions to be emphasized by focus peaking, when judged from contrast values, may also decrease.
[Technical Means for Solving the Problem]
A control device according to one aspect of the present invention may be a control device that controls an imaging device including a ranging sensor and a focus lens. The control device may include a circuit configured to acquire a contrast value of a first imaging area within an image captured by the imaging device. The circuit may be configured to acquire first distance information from the ranging sensor, the first distance information indicating the distance to a subject associated with a first ranging area corresponding to the first imaging area. The circuit may be configured to acquire second distance information indicating, based on the position of the focus lens, at least one of a distance or a distance range to a subject that is in focus. The circuit may be configured so that, when the contrast value of the first imaging area is greater than or equal to a preset threshold and the relationship between the distance indicated by the first distance information and at least one of the distance or the distance range indicated by the second distance information satisfies a preset condition, an indicator showing that the subject present in the first imaging area is in focus is superimposed on the first imaging area of the image and displayed together with the image on a display unit.
The circuit may be configured to judge that the relationship between the distance indicated by the first distance information and the distance indicated by the second distance information satisfies the preset condition when the difference between the distance indicated by the first distance information and the distance indicated by the second distance information is smaller than a preset threshold.
The circuit may be configured to judge that the relationship between the distance indicated by the first distance information and the distance indicated by the second distance information satisfies the preset condition when the distance indicated by the first distance information is contained within the distance range indicated by the second distance information.
The circuit may be configured to judge that the relationship between the distance indicated by the first distance information and the distance indicated by the second distance information satisfies the preset condition when the difference between the distance indicated by the first distance information and the distance indicated by the second distance information is smaller than a preset threshold and the distance indicated by the first distance information is contained within the distance range indicated by the second distance information.
The circuit may be configured to superimpose the indicator on the edge portion of the subject present in the first imaging area of the image and display it together with the image on the display unit.
The ranging sensor may be a TOF sensor.
An imaging device according to one aspect of the present invention may include the above control device, the ranging sensor, and the focus lens.
A control method according to one aspect of the present invention may be a method of controlling an imaging device including a ranging sensor and a focus lens. The control method may include a stage of acquiring a contrast value of a first imaging area within an image captured by the imaging device. The control method may include a stage of acquiring first distance information from the ranging sensor, the first distance information indicating the distance to a subject associated with a first ranging area corresponding to the first imaging area. The control method may include a stage of acquiring, based on the position of the focus lens, second distance information indicating at least one of a distance or a distance range to a subject that is in focus. The control method may include a stage of, when the contrast value of the first imaging area is greater than or equal to a preset threshold and the relationship between the distance indicated by the first distance information and at least one of the distance or the distance range indicated by the second distance information satisfies a preset condition, superimposing an indicator showing that the subject present in the first imaging area is in focus on the first imaging area of the image and displaying it together with the image on a display unit.
A program according to one aspect of the present invention may be a program for causing a computer to function as the above control device.
According to one aspect of the present invention, a decrease in the accuracy of the positions to be emphasized by focus peaking can be suppressed.
The above summary of the invention does not enumerate all the necessary features of the present invention. Sub-combinations of these feature groups may also constitute inventions.
[Brief Description of the Drawings]
FIG. 1 is a diagram showing functional blocks of an imaging device.
FIG. 2 is a diagram showing a display example of focus peaking.
FIG. 3 is a diagram for explaining imaging areas.
FIG. 4 is a diagram for explaining ranging areas.
FIG. 5 is a flowchart showing one example of a focus-peaking processing procedure.
FIG. 6 is a diagram showing one example of the appearance of an unmanned aerial vehicle and a remote operation device.
FIG. 7 is a diagram showing one example of a hardware configuration.
[Detailed Description]
Hereinafter, the present invention will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Moreover, not all combinations of features described in the embodiments are necessarily essential to the solution of the invention. It is apparent to persons skilled in the art that various modifications or improvements can be made to the following embodiments. It is apparent from the claims that forms with such modifications or improvements can also be included within the technical scope of the present invention.
The claims, the description, the drawings, and the abstract contain matters subject to copyright protection. The copyright owner will not object to anyone reproducing these documents as they appear in the files or records of the Patent Office. However, in all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is executed or (2) a "section" of a device responsible for executing the operation. Specific stages and "sections" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include memory elements and the like, such as logic AND, logic OR, logic XOR, logic NAND, logic NOR and other logic operations, flip-flops, registers, field-programmable gate arrays (FPGA), and programmable logic arrays (PLA).
A computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device. As a result, a computer-readable medium on which instructions are stored comprises a product that includes instructions that can be executed to create means for executing the operations specified in a flowchart or block diagram. Examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of computer-readable media may include floppy (registered trademark) disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), Blu-ray (registered trademark) discs, memory sticks, integrated circuit cards, and the like.
Computer-readable instructions may include either source code or object code described in any combination of one or more programming languages. The source code or object code includes conventional procedural programming languages, which may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, as well as the "C" programming language or similar programming languages. The computer-readable instructions may be provided, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet, to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device. The processor or programmable circuit may execute the computer-readable instructions to create means for executing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
FIG. 1 shows one example of the functional blocks of the imaging device 100 according to the present embodiment. The imaging device 100 includes an imaging unit 102, a TOF sensor 160, and a lens unit 200. The imaging unit 102 includes an image sensor 120, an imaging control unit 110, a memory 170, a display unit 180, and an operation unit 182.
The image sensor 120 may be constituted by a CCD or a CMOS. The image sensor 120 outputs image data of an optical image formed through a plurality of lenses 154 to the imaging control unit 110. The imaging control unit 110 may be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
The imaging control unit 110 operates according to operation instructions for the imaging device 100 from the operation unit 182, performs demosaic processing on the image signal output from the image sensor 120, and generates image data. The imaging control unit 110 stores the image data in the memory 170. The imaging control unit 110 controls the TOF sensor 160. The imaging control unit 110 is one example of a circuit. The TOF sensor 160 is a time-of-flight sensor that measures the distance to an object. The imaging device 100 performs focus control by adjusting the position of the focus lens based on the distance measured by the TOF sensor 160.
The memory 170 may be a computer-readable storage medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory. The memory 170 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and other units. The memory 170 may be provided inside the housing of the imaging device 100.
The plurality of lenses 154 may function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 154 are arranged to be movable along the optical axis. The lens control unit 150 drives the lens driving unit 152 according to lens control instructions from the imaging control unit 110 to move one or more lenses 154 in the optical axis direction. The lens control instructions are, for example, zoom control instructions and focus control instructions. The lens driving unit 152 may include a voice coil motor (VCM) that moves at least some or all of the plurality of lenses 154 in the optical axis direction. The lens driving unit 152 may include an electric motor such as a DC motor, a coreless motor, or an ultrasonic motor. The lens driving unit 152 may transmit power from the electric motor to at least some or all of the plurality of lenses 154 via mechanism members such as a cam ring and a guide shaft, to move at least some or all of the plurality of lenses 154 along the optical axis. In the present embodiment, an example in which the plurality of lenses 154 are integrated with the imaging device 100 is described. However, the plurality of lenses 154 may be an interchangeable lens and may be constituted separately from the imaging device 100.
The display unit 180 can display images output from the image sensor 120. The display unit 180 can display various setting information of the imaging device 100. The display unit 180 may be a liquid crystal display, a touch-panel display, or the like. The display unit 180 may include a plurality of liquid crystal displays or touch-panel displays.
The TOF sensor 160 includes a light-emitting unit 162, a light-receiving unit 164, a light-emission control unit 166, a light-reception control unit 167, and a memory 168. The TOF sensor 160 is one example of a ranging sensor. Instead of the TOF sensor 160, the imaging device 100 may include another ranging sensor, for example a stereo camera that measures distance based on parallax. The light-emitting unit 162 includes at least one light-emitting element 163. The light-emitting element 163 is a device, such as an LED or a laser, that repeatedly emits pulsed light modulated at high speed. The light-emitting element 163 may emit pulsed light that is infrared light. The light-emission control unit 166 controls the light emission of the light-emitting element 163. The light-emission control unit 166 may control the pulse width of the pulsed light emitted from the light-emitting element 163.
The light-receiving unit 164 includes a plurality of light-receiving elements 165, which measure the distances to the subjects associated with each of a plurality of ranging areas. The plurality of light-receiving elements 165 correspond to the plurality of ranging areas, respectively. A light-receiving element 165 repeatedly receives reflected light of the pulsed light from the object. The light-reception control unit 167 controls the light reception of the light-receiving elements 165. Based on the amount of reflected light repeatedly received by a light-receiving element 165 within a preset light-reception period, the light-reception control unit 167 measures the distance to the subject associated with each of the plurality of ranging areas. The light-reception control unit 167 may measure the distance to a subject by determining the phase difference between the pulsed light and its reflected light, based on the amount of reflected light repeatedly received within the preset light-reception period. The light-receiving unit 164 may also measure the distance to a subject by reading the frequency change of the reflected wave; this is known as the FMCW (Frequency Modulated Continuous Wave) method.
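The phase-difference principle mentioned above can be sketched numerically. This is a hedged illustration of the standard indirect time-of-flight relationship, not the sensor's actual computation (which this document does not specify); the modulation frequency and function names are assumptions.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_phase(phase_shift_rad: float, modulation_freq_hz: float) -> float:
    """Distance inferred from the phase shift between the emitted modulated
    light and its reflection. The light covers the round trip (2 * distance),
    and one 2*pi phase cycle corresponds to one modulation period, so:
        d = c * phase / (4 * pi * f_mod)
    """
    return SPEED_OF_LIGHT * phase_shift_rad / (4.0 * math.pi * modulation_freq_hz)

def max_unambiguous_range(modulation_freq_hz: float) -> float:
    """Beyond c / (2 * f_mod) the phase wraps around and the measured
    distance becomes ambiguous."""
    return SPEED_OF_LIGHT / (2.0 * modulation_freq_hz)
```

With a hypothetical 20 MHz modulation, a half-cycle phase shift (pi radians) corresponds to roughly 3.75 m, and the unambiguous range is about 7.5 m.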
The memory 168 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, and EEPROM. The memory 168 stores programs necessary for the light-emission control unit 166 to control the light-emitting unit 162, programs necessary for the light-reception control unit 167 to control the light-receiving unit 164, and the like.
The TOF sensor 160 can measure the distance to the subject associated with each of a plurality of ranging areas corresponding to the number of pixels of the light-receiving unit 164.
In a manual focus mode, the imaging device 100 configured in this way adjusts the position of the focus lens according to instructions from the user. When operating in the manual focus mode, the imaging device 100 performs focus peaking. Focus peaking is a function that visually shows the user the degree to which a subject is in focus, thereby assisting the adjustment of the focus lens position. For example, as shown in FIG. 2, the imaging device 100 can emphasize the in-focus portions by superimposing the indicator 184 on the edge portions of the in-focus subject in the image 185. The imaging device 100 may also perform focus peaking when operating in an autofocus mode, to visually show the user which areas are in focus.
Here, it is conceivable that the imaging device 100 identifies areas in the image whose contrast value is greater than or equal to a preset threshold, superimposes indicators showing that the subjects included in the identified areas are in focus on the image captured by the imaging device 100, and displays the result on the display unit 180.
However, depending on the shooting environment of the imaging device 100, there is sometimes almost no difference, as seen by the imaging control unit 110, between the contrast value of an in-focus subject and the contrast value of an out-of-focus subject. Therefore, even if whether each area is in focus is judged from differences in the contrast values of the areas within the image, the imaging device 100 sometimes cannot identify the edge portions of the subjects that are actually in focus.
For example, when the lighting is dim and the imaging device 100 is equipped with a lens having a small aperture and a long focal length, the image captured by the imaging device 100 is unsharp and noisy. In such an image, the contrast value of an in-focus subject is almost indistinguishable from that of an out-of-focus subject. In this case, all edge portions, including the edge portions of subjects that are not actually in focus, are emphasized and displayed on the display unit 180 together with the image captured by the imaging device 100.
It is also conceivable to store in memory the threshold that serves as the criterion for judging the in-focus state in association with the focal length and the ISO sensitivity. Furthermore, the imaging device 100 could adjust the threshold according to the resolution of the imaging device 100 and the noise level of the captured image. However, since the threshold changes with the optical characteristics of the lens, an imaging device 100 fitted with an interchangeable lens sometimes cannot set the threshold appropriately. In addition, as described above, in a shooting environment with a high noise level, adjusting only the magnitude of the threshold sometimes does not allow the imaging device 100 to distinguish the contrast value of an in-focus subject from that of an out-of-focus subject.
The user could also set the threshold manually according to the shooting environment. For example, three threshold levels, "high", "medium", and "low", could be stored in memory in advance, and the user could select one according to the shooting environment. Even in this case, however, in a shooting environment with a high noise level, adjusting only the magnitude of the threshold sometimes does not allow the imaging device 100 to distinguish the contrast value of an in-focus subject from that of an out-of-focus subject.
Therefore, in addition to the contrast value, the imaging device 100 according to the present embodiment also uses the ranging result of the TOF sensor 160 as a judgment criterion to identify the areas in the image that contain an in-focus subject.
The imaging control unit 110 acquires the contrast value of a first imaging area within the image captured by the imaging device 100. The imaging control unit 110 acquires first distance information from the TOF sensor 160, the first distance information indicating the distance to a subject associated with a first ranging area corresponding to the first imaging area. The imaging control unit 110 acquires second distance information indicating, based on the position of the focus lens, at least one of a distance or a distance range to a subject that is in focus.
A table associating focus lens positions with the distances to subjects that are in focus may be stored in the memory 170. Thus, the imaging control unit 110 determines the current position of the focus lens and, by referring to the table, can determine the subject distance corresponding to the current focus lens position.
A table associating focus lens positions with the depth of field, which represents the range of distances over which a subject is in focus, may be stored in the memory 170. Thus, the imaging control unit 110 determines the current position of the focus lens and, by referring to the table, can determine the depth of field corresponding to the current focus lens position. The imaging control unit 110 may also derive the depth of field at the current focus lens position, as the distance range over which a subject is in focus, from a preset function that includes a value corresponding to the focus lens position as a variable.
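A minimal sketch of how such tables might be consulted follows. The lens positions, subject distances, and depth-of-field limits below are invented calibration values for illustration; an actual table would come from the lens characteristics.

```python
# Hypothetical calibration table: focus-lens position ->
# (in-focus subject distance in m, (near limit, far limit) of the depth of field in m)
FOCUS_TABLE = {
    0:  (0.5,  (0.45, 0.6)),
    10: (1.0,  (0.9, 1.2)),
    20: (3.0,  (2.4, 4.5)),
    30: (10.0, (6.0, float("inf"))),
}

def in_focus_distance(lens_position: int) -> float:
    """Second distance information, variant 1: the in-focus subject distance
    looked up from the current focus-lens position."""
    return FOCUS_TABLE[lens_position][0]

def in_focus_range(lens_position: int):
    """Second distance information, variant 2: the in-focus distance range
    (depth of field) looked up from the current focus-lens position."""
    return FOCUS_TABLE[lens_position][1]
```

A real implementation would interpolate between table entries for lens positions that fall between the stored keys.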
When the contrast value of the first imaging area is greater than or equal to the preset threshold and the relationship between the distance indicated by the first distance information and at least one of the distance or the distance range indicated by the second distance information satisfies the preset condition, the imaging control unit 110 superimposes an indicator showing that the subject present in the first imaging area is in focus on the first imaging area of the image and displays it together with the image on the display unit 180.
The imaging control unit 110 may superimpose, as the indicator, a line along the edge portion of the in-focus subject on the image and display it on the display unit 180. The imaging control unit 110 may superimpose, as the indicator, a set of points along the edge portion of the in-focus subject on the image and display it on the display unit 180.
When the difference between the distance indicated by the first distance information and the distance indicated by the second distance information is smaller than a preset threshold, the imaging control unit 110 may judge that the relationship between the distance indicated by the first distance information and the distance indicated by the second distance information satisfies the preset condition.
When the distance indicated by the first distance information is contained within the distance range indicated by the second distance information, the imaging control unit 110 may judge that the relationship between the distance indicated by the first distance information and the distance indicated by the second distance information satisfies the preset condition.
When the difference between the distance indicated by the first distance information and the distance indicated by the second distance information is smaller than a preset threshold, and the distance indicated by the first distance information is contained within the distance range indicated by the second distance information, the imaging control unit 110 may judge that the relationship between the distance indicated by the first distance information and the distance indicated by the second distance information satisfies the preset condition.
The imaging control unit 110 divides the image captured by the imaging device 100 into a plurality of imaging areas and calculates the contrast value of each of the plurality of imaging areas. The contrast value may be a value representing the spread of the distribution of the pixel values contained in an imaging area. For example, with Lmax as the maximum luminance and Lmin as the minimum luminance within the imaging area, the imaging control unit 110 may calculate the contrast value as (Lmax-Lmin)/(Lmax+Lmin).
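The contrast formula above (the Michelson contrast of one imaging area) can be sketched directly. Scaling the result to 0-100, so that it is comparable with the example threshold of 60 used later, is an assumption for the sketch; the document does not state the scale of its contrast values.

```python
def region_contrast(luminances) -> float:
    """Michelson contrast (Lmax - Lmin) / (Lmax + Lmin) of the luminance
    values of one imaging area, scaled to the range 0-100 (assumed scale)."""
    lmax, lmin = max(luminances), min(luminances)
    if lmax + lmin == 0:
        # an all-black region has no contrast; avoid dividing by zero
        return 0.0
    return 100.0 * (lmax - lmin) / (lmax + lmin)
```

For example, a region with luminances spanning 10 to 30 yields a contrast of 50, while a uniform region yields 0.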
The imaging control unit 110 determines the distance to the subject associated with each ranging area, as measured by the TOF sensor 160. Based on a preset correspondence between the imaging areas in the image and the ranging areas, the imaging control unit 110 can determine the ranging area corresponding to each imaging area.
For example, as shown in FIG. 3, the imaging control unit 110 calculates the contrast value of each of the imaging areas (1) to (9) in the image captured by the imaging device 100. For example, as shown in FIG. 4, the imaging control unit 110 determines the distances to the subjects, measured by the TOF sensor 160, associated with the ranging areas (1) to (9) corresponding to the imaging areas (1) to (9).
For example, the imaging control unit 110 sets the contrast threshold to 60. The imaging control unit 110 then identifies imaging area (1) and imaging area (5) as the imaging areas whose contrast values exceed the threshold.
For example, the imaging control unit 110 determines, based on the current focus lens position, that the distance to the in-focus subject is 1 m. Then, when the difference between the subject distance based on the current focus lens position and the subject distance based on the ranging result of the TOF sensor 160 is less than or equal to 1 m, the imaging control unit 110 judges that the preset condition is satisfied. The imaging control unit 110 identifies ranging area (5) as a ranging area satisfying the preset condition.
The imaging control unit 110 identifies imaging area (5), which corresponds to ranging area (5), as an imaging area containing an in-focus subject. The imaging control unit 110 judges imaging area (1), whose contrast value is greater than or equal to the threshold, to be an imaging area containing noise, and excludes it from the imaging areas containing an in-focus subject. The imaging control unit 110 then emphasizes the edge portion of the subject included in imaging area (5) and displays the image captured by the imaging device 100 on the display unit 180.
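The selection logic of this example can be sketched as follows. The contrast and distance values are invented to mirror the scenario above, in which areas (1) and (5) pass the contrast test but only area (5) also passes the distance test; the function name and 1-based indexing are choices made for the sketch.

```python
def peaking_areas(contrasts, tof_distances, lens_distance,
                  contrast_threshold=60.0, distance_threshold=1.0):
    """Return the 1-based indices of the imaging areas to emphasize:
    contrast at or above the threshold AND TOF-measured distance within
    distance_threshold of the in-focus distance derived from the focus
    lens position."""
    selected = []
    for idx, (c, d) in enumerate(zip(contrasts, tof_distances), start=1):
        if c >= contrast_threshold and abs(d - lens_distance) <= distance_threshold:
            selected.append(idx)
    return selected

# Areas (1)-(9): area (1) is high-contrast but far away (noise), area (5)
# is both high-contrast and at a distance matching the lens position.
contrasts = [80, 20, 15, 30, 90, 25, 10, 18, 22]
tof_distances = [5.0, 5.2, 4.8, 1.3, 1.0, 1.4, 6.0, 5.5, 5.1]
print(peaking_areas(contrasts, tof_distances, lens_distance=1.0))  # → [5]
```

Only area (5) survives both tests, matching the outcome described in the text.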
FIG. 5 is a flowchart showing one example of the procedure by which the imaging control unit 110 performs focus peaking.
Based on the ranging result of the TOF sensor 160, the imaging control unit 110 determines the distance to the subject associated with each ranging area corresponding to each imaging area in the image (S100). The imaging control unit 110 determines the current position of the focus lens. Based on the determined current position of the focus lens, the imaging control unit 110 determines the distance to a subject that is in focus (S102).
The imaging control unit 110 judges whether there is an imaging area whose distance has a relationship, with the in-focus subject distance based on the current focus lens position, that satisfies the preset condition (S104). That is, the imaging control unit 110 judges whether there is an imaging area for which the difference between the in-focus subject distance based on the current focus lens position and the subject distance based on the ranging result of the TOF sensor 160 is less than or equal to a preset threshold (for example, 1 m).
If a corresponding imaging area exists, the imaging control unit 110 determines the contrast value of that imaging area (S106). Next, the imaging control unit 110 judges whether the determined contrast value is greater than or equal to a preset threshold (S108). If the determined contrast value is greater than or equal to the preset threshold, the imaging control unit 110 displays the subject present in the corresponding imaging area on the display unit 180 with emphasis (S110). The imaging control unit 110 may emphasize the edge portion of the subject present in the corresponding imaging area and display the image captured by the imaging device 100 on the display unit 180.
On the other hand, when no imaging area has a distance whose relationship with the in-focus subject distance based on the current focus lens position satisfies the preset condition, or when the determined contrast value is smaller than the preset threshold, the imaging control unit 110 displays the image captured by the imaging device 100 on the display unit 180 without emphasizing the edge portions of the subjects present in the imaging areas (S112).
As described above, the imaging device 100 according to the present embodiment uses the ranging result of the TOF sensor 160, in addition to the contrast value, to identify the areas in the image that contain an in-focus subject. This prevents an imaging area that contains an out-of-focus subject but has a high contrast value due to noise from having its subject emphasized. Focus peaking can thus be achieved with higher accuracy, regardless of the shooting environment.
The above-described imaging device 100 may also be mounted on a mobile body. The imaging device 100 may also be mounted on an unmanned aerial vehicle (UAV) as shown in FIG. 6. The UAV 1000 may include a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and the imaging device 100. The gimbal 50 and the imaging device 100 are one example of an imaging system. The UAV 1000 is one example of a mobile body propelled by a propulsion unit. The concept of a mobile body includes, in addition to UAVs, flying objects such as airplanes moving in the air, vehicles moving on the ground, ships moving on the water, and the like.
The UAV body 20 includes a plurality of rotors. The plurality of rotors is one example of a propulsion unit. The UAV body 20 makes the UAV 1000 fly by controlling the rotation of the plurality of rotors. The UAV body 20 uses, for example, four rotors to make the UAV 1000 fly. The number of rotors is not limited to four. The UAV 1000 may also be a fixed-wing aircraft without rotors.
The imaging device 100 is an imaging camera that captures a subject included in a desired imaging range. The gimbal 50 rotatably supports the imaging device 100. The gimbal 50 is one example of a support mechanism. For example, the gimbal 50 rotatably supports the imaging device 100 around the pitch axis using an actuator. The gimbal 50 further rotatably supports the imaging device 100 around each of the roll axis and the yaw axis using actuators. The gimbal 50 can change the posture of the imaging device 100 by rotating the imaging device 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of imaging devices 60 are sensing cameras that capture the surroundings of the UAV 1000 in order to control its flight. Two imaging devices 60 may be installed on the nose, that is, the front, of the UAV 1000, and another two imaging devices 60 may be installed on its bottom surface. The two imaging devices 60 on the front side may be paired to function as a so-called stereo camera. The two imaging devices 60 on the bottom side may also be paired to function as a stereo camera. Three-dimensional spatial data around the UAV 1000 can be generated from the images captured by the plurality of imaging devices 60. The number of imaging devices 60 included in the UAV 1000 is not limited to four; the UAV 1000 only needs to include at least one imaging device 60. The UAV 1000 may also include at least one imaging device 60 on each of its nose, tail, sides, bottom surface, and top surface. The angle of view settable on the imaging devices 60 may be larger than the angle of view settable on the imaging device 100. The imaging devices 60 may also include a single-focus lens or a fisheye lens.
The remote operation device 600 communicates with the UAV 1000 to operate the UAV 1000 remotely. The remote operation device 600 may communicate with the UAV 1000 wirelessly. The remote operation device 600 transmits to the UAV 1000 instruction information indicating various commands related to the movement of the UAV 1000, such as ascending, descending, accelerating, decelerating, moving forward, moving backward, and rotating. The instruction information includes, for example, instruction information for raising the altitude of the UAV 1000. The instruction information may indicate the altitude at which the UAV 1000 should be located. The UAV 1000 moves so as to be located at the altitude indicated by the instruction information received from the remote operation device 600. The instruction information may include an ascend command that makes the UAV 1000 ascend. The UAV 1000 ascends while it is receiving the ascend command. When the altitude of the UAV 1000 has reached its upper limit, the UAV 1000 may be restricted from ascending even if it accepts the ascend command.
FIG. 7 shows one example of a computer 1200 that may embody, in whole or in part, multiple aspects of the present invention. A program installed on the computer 1200 enables the computer 1200 to function as operations associated with the device according to the embodiment of the present invention, or as one or more "sections" of that device, or enables the computer 1200 to execute those operations or those one or more "sections". The program enables the computer 1200 to execute the process according to the embodiment of the present invention or the stages of that process. Such a program may be executed by the CPU 1212 to cause the computer 1200 to execute the specified operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
The computer 1200 of the present embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other via a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates according to the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program and the like executed by the computer 1200 at startup, and/or programs that depend on the hardware of the computer 1200. The programs are provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network. The programs are installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and are executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. A device or method may be constituted by realizing the operation or processing of information through the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and instruct the communication interface 1222 to perform communication processing according to the processing described in the communication program. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and transmits the read data to the network, or writes reception data received from the network into a reception buffer provided in the recording medium.
In addition, the CPU 1212 may cause the RAM 1214 to read all or a necessary part of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information, such as various types of programs, data, tables, and databases, may be stored in the recording medium and subjected to information processing. On the data read from the RAM 1214, the CPU 1212 may perform various types of processing described throughout this disclosure and specified by the program's instruction sequence, including various types of operations, information processing, conditional judgment, conditional branching, unconditional branching, and information retrieval/replacement, and write the results back to the RAM 1214. The CPU 1212 may also retrieve information in files, databases, and the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve from the plurality of entries an entry matching the condition specifying the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby acquire the attribute value of the second attribute associated with the first attribute that satisfies the preset condition.
The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. In addition, a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, thereby providing the programs to the computer 1200 via the network.
The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It is apparent to persons skilled in the art that various modifications or improvements can be made to the above embodiments. It is apparent from the claims that forms with such modifications or improvements can also be included within the technical scope of the present invention.
It should be noted that the execution order of each process, such as the actions, procedures, steps, and stages in the devices, systems, programs, and methods shown in the claims, the description, and the drawings, may be implemented in any order, as long as "before", "prior to", and the like are not explicitly indicated and as long as the output of a preceding process is not used in a subsequent process. Even where the operation flows in the claims, the description, and the drawings are described using "first", "next", and the like for convenience, this does not mean that they must be implemented in this order.
[Description of Reference Numerals]
100 imaging device
102 imaging unit
110 imaging control unit
120 image sensor
150 lens control unit
152 lens driving unit
154 lens
160 TOF sensor
162 light-emitting unit
163 light-emitting element
164 light-receiving unit
165 light-receiving element
166 light-emission control unit
167 light-reception control unit
168 memory
170 memory
180 display unit
182 operation unit
200 lens unit
1000 UAV
20 UAV body
50 gimbal
60 imaging device
600 remote operation device
1200 computer
1210 host controller
1212 CPU
1214 RAM
1220 input/output controller
1222 communication interface
1230 ROM

Claims (9)

  1. A control device that controls an imaging device including a ranging sensor and a focus lens, characterized by comprising a circuit, the circuit being configured to:
    acquire a contrast value of a first imaging area within an image captured by the imaging device;
    acquire first distance information from the ranging sensor, the first distance information indicating a distance to a subject associated with a first ranging area corresponding to the first imaging area;
    acquire second distance information indicating, based on a position of the focus lens, at least one of a distance or a distance range to a subject that is in focus;
    when the contrast value of the first imaging area is greater than or equal to a preset threshold and the relationship between the distance indicated by the first distance information and at least one of the distance or the distance range indicated by the second distance information satisfies a preset condition, superimpose an indicator showing that the subject present in the first imaging area is in focus on the first imaging area of the image and display it together with the image on a display unit.
  2. The control device according to claim 1, characterized in that the circuit is configured to: judge that the relationship between the distance indicated by the first distance information and the distance indicated by the second distance information satisfies the preset condition when the difference between the distance indicated by the first distance information and the distance indicated by the second distance information is smaller than a preset threshold.
  3. The control device according to claim 1, characterized in that the circuit is configured to:
    judge that the relationship between the distance indicated by the first distance information and the distance indicated by the second distance information satisfies the preset condition when the distance indicated by the first distance information is contained within the distance range indicated by the second distance information.
  4. The control device according to claim 1, characterized in that the circuit is configured to: judge that the relationship between the distance indicated by the first distance information and the distance indicated by the second distance information satisfies the preset condition when the difference between the distance indicated by the first distance information and the distance indicated by the second distance information is smaller than a preset threshold and the distance indicated by the first distance information is contained within the distance range indicated by the second distance information.
  5. The control device according to claim 1, characterized in that the circuit is configured to: superimpose the indicator on an edge portion of the subject present in the first imaging area of the image and display it together with the image on the display unit.
  6. The control device according to claim 1, characterized in that the ranging sensor is a TOF sensor.
  7. An imaging device, characterized by comprising the control device according to any one of claims 1 to 6;
    the ranging sensor; and
    the focus lens.
  8. A control method for controlling an imaging device including a ranging sensor and a focus lens, characterized by comprising the following stages:
    acquiring a contrast value of a first imaging area within an image captured by the imaging device;
    acquiring first distance information from the ranging sensor, the first distance information indicating a distance to a subject associated with a first ranging area corresponding to the first imaging area;
    acquiring second distance information indicating, based on a position of the focus lens, at least one of a distance or a distance range to a subject that is in focus;
    when the contrast value of the first imaging area is greater than or equal to a preset threshold and the relationship between the distance indicated by the first distance information and at least one of the distance or the distance range indicated by the second distance information satisfies a preset condition, superimposing an indicator showing that the subject present in the first imaging area is in focus on the first imaging area of the image and displaying it together with the image on a display unit.
  9. A program, characterized in that it is used for causing a computer to function as the control device according to any one of claims 1 to 6.
PCT/CN2021/097830 2020-06-30 2021-06-02 Control device, imaging device, control method, and program WO2022001561A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202180006044.3A 2020-06-30 2021-06-02 Control device, imaging device, control method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020112988A JP6961889B1 (ja) 2020-06-30 2020-06-30 制御装置、撮像装置、制御方法、及びプログラム
JP2020-112988 2020-06-30

Publications (1)

Publication Number Publication Date
WO2022001561A1 true WO2022001561A1 (zh) 2022-01-06

Family

ID=78409816

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/097830 WO2022001561A1 (zh) 2020-06-30 2021-06-02 控制装置、摄像装置、控制方法以及程序

Country Status (3)

Country Link
JP (1) JP6961889B1 (zh)
CN (1) CN114586336A (zh)
WO (1) WO2022001561A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030081137A1 (en) * 2001-10-26 2003-05-01 Fuji Photo Film Co., Ltd. Device and method for autofocus adjustment
JP2004120582A (ja) * 2002-09-27 2004-04-15 Olympus Corp カメラ
CN101126833A (zh) * 2006-08-16 2008-02-20 佳能株式会社 自动聚焦设备和摄像设备
CN101931752A (zh) * 2009-06-19 2010-12-29 卡西欧计算机株式会社 摄像装置、以及对焦方法
CN104869304A (zh) * 2014-02-21 2015-08-26 三星电子株式会社 显示对焦的方法和应用该方法的电子设备
CN108632529A (zh) * 2017-03-24 2018-10-09 三星电子株式会社 为焦点提供图形指示符的电子设备及操作电子设备的方法

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007195097A (ja) * 2006-01-23 2007-08-02 Konica Minolta Photo Imaging Inc 撮像装置、画像処理方法、及び画像処理プログラム
CN103430073B (zh) * 2011-03-31 2014-12-31 富士胶片株式会社 摄像装置及摄像装置的控制方法
JP6462183B2 (ja) * 2016-03-30 2019-01-30 富士フイルム株式会社 撮像装置及びフォーカス制御方法
JP6727453B2 (ja) * 2017-09-28 2020-07-22 富士フイルム株式会社 撮像装置、撮像装置の制御方法、及び撮像装置の制御プログラム
JP6561370B1 (ja) * 2018-06-19 2019-08-21 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd 決定装置、撮像装置、決定方法、及びプログラム

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030081137A1 (en) * 2001-10-26 2003-05-01 Fuji Photo Film Co., Ltd. Device and method for autofocus adjustment
JP2004120582A (ja) * 2002-09-27 2004-04-15 Olympus Corp カメラ
CN101126833A (zh) * 2006-08-16 2008-02-20 佳能株式会社 自动聚焦设备和摄像设备
CN101931752A (zh) * 2009-06-19 2010-12-29 卡西欧计算机株式会社 摄像装置、以及对焦方法
CN104869304A (zh) * 2014-02-21 2015-08-26 三星电子株式会社 显示对焦的方法和应用该方法的电子设备
CN108632529A (zh) * 2017-03-24 2018-10-09 三星电子株式会社 为焦点提供图形指示符的电子设备及操作电子设备的方法

Also Published As

Publication number Publication date
CN114586336A (zh) 2022-06-03
JP2022011685A (ja) 2022-01-17
JP6961889B1 (ja) 2021-11-05

Similar Documents

Publication Publication Date Title
WO2020011230A1 (zh) Control device, movable body, control method, and program
WO2019238044A1 (zh) Determination device, movable body, determination method, and program
JP6630939B2 (ja) Control device, imaging device, movable body, control method, and program
US11066182B2 (en) Control apparatus, camera apparatus, flying object, control method and program
CN112335227A (zh) Control device, imaging system, control method, and program
US20210092282A1 (en) Control device and control method
WO2022001561A1 (zh) Control device, imaging device, control method, and program
JP2021085893A (ja) Control device, imaging device, control method, and program
CN112292712A (zh) Device, imaging device, movable body, method, and program
WO2021031833A1 (zh) Control device, imaging system, control method, and program
US20220046177A1 (en) Control device, camera device, movable object, control method, and program
US20220070362A1 (en) Control apparatuses, photographing apparatuses, movable objects, control methods, and programs
WO2021031840A1 (zh) Device, imaging device, movable body, method, and program
JP2019220834A (ja) Unmanned aerial vehicle, control method, and program
WO2019223614A1 (zh) Control device, imaging device, movable body, control method, and program
WO2020108284A1 (zh) Determination device, movable body, determination method, and program
WO2020020042A1 (zh) Control device, movable body, control method, and program
CN112313941A (zh) Control device, imaging device, control method, and program
WO2020011198A1 (zh) Control device, movable body, control method, and program
JP7043707B2 (ja) Scene recognition device, imaging device, scene recognition method, and program
WO2021052216A1 (zh) Control device, imaging device, control method, and program
JP6805448B2 (ja) Control device, imaging system, movable body, control method, and program
WO2021249245A1 (zh) Device, imaging device, imaging system, and movable body
JP6746856B2 (ja) Control device, imaging system, movable body, control method, and program
JP6818987B1 (ja) Image processing device, imaging device, movable body, image processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21831952

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21831952

Country of ref document: EP

Kind code of ref document: A1