US20210105403A1 - Method for processing image, image processing apparatus, multi-camera photographing apparatus, and aerial vehicle - Google Patents


Info

Publication number
US20210105403A1
US20210105403A1 (application US17/122,672)
Authority
US
United States
Prior art keywords
image
lens module
target object
aerial vehicle
module
Prior art date
Legal status
Abandoned
Application number
US17/122,672
Inventor
Yang Liu
Current Assignee
Autel Robotics Co Ltd
Original Assignee
Autel Robotics Co Ltd
Priority date
Filing date
Publication date
Application filed by Autel Robotics Co Ltd filed Critical Autel Robotics Co Ltd
Priority to US17/122,672
Publication of US20210105403A1
Assigned to AUTEL ROBOTICS CO., LTD. (assignor: LIU, YANG)
Status: Abandoned

Classifications

    • H04N 5/23229
    • H04N 5/2624: Studio circuits for obtaining an image which is composed of whole input images, e.g. split screen
    • H04N 23/80: Camera processing pipelines; components thereof
    • B64C 39/024: Aircraft characterised by special use, of the remote controlled vehicle type, i.e. RPV
    • B64D 47/08: Arrangements of cameras
    • B64U 20/87: Mounting of imaging devices, e.g. mounting of gimbals
    • H04N 23/45: Generating image signals from two or more image sensors being of different type or operating in different modes
    • H04N 23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/685: Vibration or motion blur correction performed by mechanical compensation
    • H04N 23/69: Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • H04N 23/90: Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H04N 23/951: Computational photography systems using two or more images to influence resolution, frame rate or aspect ratio
    • H04N 5/2258; H04N 5/23232; H04N 5/23296
    • H04N 5/272: Means for inserting a foreground image in a background image, i.e. inlay, outlay
    • H04N 7/185: Closed-circuit television systems for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • B64C 2201/127
    • B64U 2101/30: UAVs specially adapted for imaging, photography or videography

Definitions

  • Implementations of the present application relate to the field of image processing technologies, and in particular, to a method for processing an image, an image processing apparatus, a multi-camera photographing apparatus, and an aerial vehicle.
  • Aerial vehicles in the prior art are generally equipped with normal lenses for aerial photography.
  • The focus of a normal lens is usually set at infinity, and the resolution of the image sensor is limited. Consequently, some small targets in a ground scene cannot be photographed clearly.
  • In addition, lenses used for aerial photography generally have a relatively narrow perspective, so the range of the photographed scene is relatively limited.
  • Some aerial vehicles are equipped with zoom lenses for aerial photography, whose perspective can be adjusted between 10° and 90°. To clearly photograph a small target in the ground scene, the lens needs to be zoomed to a long focal length; to photograph a relatively wide scene, the lens needs to be switched to a wide angle.
  • However, a zoom lens is relatively heavy, which increases the load of the aerial vehicle and reduces its flight performance.
  • Moreover, the image sensor of a zoom lens is fixed and cannot be changed or diversified.
  • A main technical problem to be resolved in implementations of the present application is to provide a method for processing an image, an image processing apparatus, a multi-camera photographing apparatus, and an aerial vehicle that satisfy the requirements of both a large-perspective scene and a small-perspective scene.
  • an embodiment of the present application provides a method for processing an image, where the image is photographed by a multi-camera photographing apparatus, the multi-camera photographing apparatus including a first lens module and a second lens module, and the method includes:
  • obtaining a first image photographed by the first lens module, the first image including a target object;
  • obtaining a second image photographed by the second lens module, the second image including the target object, and a quantity of pixels of the target object in the second image being greater than a quantity of pixels of the target object in the first image; and
  • performing image processing on the first image and the second image, and generating an output image.
  • the method further includes:
  • the performing image processing on the first image and the second image, and generating the output image includes:
  • a focal length of the first lens module and a focal length of the second lens module are both fixed focal lengths.
  • the first lens module includes a normal lens module and/or a wide-angle lens module and the second lens module includes a long-focus lens module.
  • the first lens module includes the normal lens module and the wide-angle lens module, the normal lens module having an equivalent focal length of 35 mm, the wide-angle lens module having an equivalent focal length of 18 mm to 24 mm, and an equivalent focal length of the long-focus lens module being 3 to 20 times the equivalent focal length of the wide-angle lens module.
  • an embodiment of the present application provides a multi-camera photographing apparatus, where the multi-camera photographing apparatus includes a body, and a first lens module and a second lens module that are disposed on the body, the first lens module and the second lens module being arranged in an array on the body.
  • the first lens module includes at least two lens modules.
  • the first lens module and the second lens module are arranged in an isosceles triangle.
  • the first lens module and the second lens module are linearly arranged.
  • a focal length of the first lens module and a focal length of the second lens module are both fixed focal lengths.
  • the first lens module includes a normal lens module and/or a wide-angle lens module and the second lens module includes a long-focus lens module.
  • the first lens module includes the normal lens module and the wide-angle lens module, the normal lens module having an equivalent focal length of 35 mm, the wide-angle lens module having an equivalent focal length of 18 mm to 24 mm, and an equivalent focal length of the long-focus lens module being 3 to 20 times the equivalent focal length of the wide-angle lens module.
  • an embodiment of the present application provides an image processing apparatus for processing an image photographed by a multi-camera photographing apparatus, the multi-camera photographing apparatus including a first lens module and a second lens module, and the apparatus includes:
  • a first obtaining module configured to obtain a first image photographed by the first lens module, the first image including a target object
  • a second obtaining module configured to obtain a second image photographed by the second lens module, the second image including the target object, and a quantity of pixels of the target object included in the second image being greater than a quantity of pixels of the target object in the first image
  • an image generation module configured to perform image processing on the first image and the second image and to generate an output image.
  • the apparatus further includes a location obtaining module and an adjustment module, the location obtaining module being configured to obtain location information of the target object, and the adjustment module being configured to adjust an optical axis of the second lens module based on the location information of the target object, so that the optical axis of the second lens module faces the target object.
  • the image generation module is configured to replace an area in the first image that includes the target object with the second image and to output the first image obtained after the replacement as the output image.
  • the image generation module is configured to overlay the second image on the first image and to output the first image and the second image that are overlaid as the output image.
  • the apparatus further includes:
  • a correction module configured to perform optical axis correction on optical axes of the first lens module and the second lens module.
  • the apparatus further includes:
  • an analysis module configured to perform registration analysis on the first lens module and the second lens module and to generate a corresponding registration parameter.
  • the image generation module is further configured to perform image processing on the first image and the second image based on the registration parameter and to generate the output image.
  • a focal length of the first lens module and a focal length of the second lens module are both fixed focal lengths.
  • the first lens module includes a normal lens module and/or a wide-angle lens module and the second lens module includes a long-focus lens module.
  • the first lens module includes the normal lens module and the wide-angle lens module, the normal lens module having an equivalent focal length of 35 mm, the wide-angle lens module having an equivalent focal length of 18 mm to 24 mm, and an equivalent focal length of the long-focus lens module being 3 to 20 times the equivalent focal length of the wide-angle lens module.
  • an embodiment of the present application provides an aerial vehicle, including a vehicle body, arms connected to the vehicle body, power apparatuses disposed on the arms, a multi-camera photographing apparatus connected to the vehicle body and an image processing chip, the multi-camera photographing apparatus including a first lens module and a second lens module, and the image processing chip being configured to:
  • obtain a first image photographed by the first lens module, the first image including a target object; and obtain a second image photographed by the second lens module, the second image including the target object, a quantity of pixels of the target object in the second image being greater than a quantity of pixels of the target object in the first image;
  • the image processing chip is specifically configured to:
  • the aerial vehicle further includes a controller and a gimbal that are connected to the vehicle body, the controller being configured to control the gimbal to perform optical axis correction on optical axes of the first lens module and the second lens module.
  • the controller is further configured to obtain location information of the target object and to adjust an optical axis of the second lens module based on the location information of the target object, so that the optical axis of the second lens module faces the target object.
  • the controller is further configured to perform registration analysis on the first lens module and the second lens module and to generate a corresponding registration parameter.
  • the aerial vehicle further includes a controller, configured to perform registration analysis on the first lens module and the second lens module and to generate a corresponding registration parameter.
  • the image processing chip is further configured to perform image processing on the first image and the second image based on the registration parameter and to generate the output image.
  • the multi-camera photographing apparatus includes a body, and a first lens module and a second lens module that are disposed on the body, the first lens module and the second lens module being arranged in an array on the body.
  • the first lens module includes at least two lens modules.
  • the first lens module and the second lens module are arranged in an isosceles triangle.
  • the first lens module and the second lens module are linearly arranged.
  • a focal length of the first lens module and a focal length of the second lens module are both fixed focal lengths.
  • the first lens module includes a normal lens module and/or a wide-angle lens module and the second lens module includes a long-focus lens module.
  • the first lens module includes the normal lens module and the wide-angle lens module, the normal lens module having an equivalent focal length of 35 mm, the wide-angle lens module having an equivalent focal length of 18 mm to 24 mm, and an equivalent focal length of the long-focus lens module being 3 to 20 times the equivalent focal length of the wide-angle lens module.
  • an embodiment of the present application provides an aerial vehicle, including: a body, a gimbal disposed on the body and a multi-camera photographing apparatus disposed on the gimbal, the multi-camera photographing apparatus including a first lens module and a second lens module, and the aerial vehicle further including:
  • at least one processor, and a memory communicatively connected to the at least one processor, where the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the foregoing method for processing an image.
  • the embodiments of the present application provide a method for processing an image, an image processing apparatus, a multi-camera photographing apparatus and an aerial vehicle.
  • a first image photographed by a first lens module is obtained, the first image including a target object; a second image photographed by a second lens module is then obtained; and image processing is performed on the first image and the second image to generate an output image, so that both the range of a large-perspective scene and the precision of the target object in a small-perspective scene can be obtained.
  • FIG. 1 is a schematic structural diagram of an aerial vehicle according to an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of an embodiment of a multi-camera photographing apparatus in the aerial vehicle shown in FIG. 1.
  • FIG. 3 is a schematic structural diagram of another embodiment of a multi-camera photographing apparatus in the aerial vehicle shown in FIG. 1.
  • FIG. 4 is a schematic diagram of a first image and a second image.
  • FIG. 5 is a schematic diagram of an output image obtained through image compositing according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an output image obtained through image overlaying according to an embodiment of the present application.
  • FIG. 7 is a flowchart of a method for processing an image according to an embodiment of the present application.
  • FIG. 8 is a flowchart of a method for processing an image according to another embodiment of the present application.
  • FIG. 9 is a functional block diagram of an image processing apparatus according to an embodiment of the present application.
  • FIG. 10 is a functional block diagram of an image processing apparatus according to another embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of hardware of an aerial vehicle for performing a method for processing an image according to an embodiment of the present application.
  • an aerial vehicle 10 includes a housing 11, arms 12 connected to the housing 11, a power apparatus 13 disposed at one end of each arm 12, a gimbal 15 connected to the housing 11, a multi-camera photographing apparatus 14 connected to the gimbal 15, and a controller 16 and an image processing chip 17 that are disposed in the housing 11.
  • the aerial vehicle is a four-rotor aerial vehicle.
  • the aerial vehicle 10 may alternatively be other movable objects such as a manned aerial vehicle, a model airplane, an unmanned airship, a fixed-wing unmanned aerial vehicle, or an unmanned hot air balloon.
  • the power apparatus 13 includes a motor 132 disposed at one end of the arm 12 and a propeller 131 connected to a rotating shaft of the motor 132.
  • the rotating shaft of the motor 132 rotates to drive the propeller 131 to rotate, so as to provide a lift force for the aerial vehicle 10.
  • the gimbal 15 is configured to reduce or even eliminate vibration transferred from the power apparatus 13 to the multi-camera photographing apparatus 14, to ensure that the multi-camera photographing apparatus 14 can photograph a stable and clear image or video.
  • in this embodiment, the multi-camera photographing apparatus 14 is mounted on the aerial vehicle 10 by using the gimbal 15.
  • the multi-camera photographing apparatus 14 includes a body 141, and a first lens module 145 and a second lens module 144 that are disposed on the body 141.
  • the first lens module 145 and the second lens module 144 are both fixed-focus lens modules. Because both are fixed-focus lenses, lighter materials can be used during design and the mechanical apparatus of a zoom lens is eliminated.
  • the first lens module 145 includes two lens modules: a normal lens module 142 and a wide-angle lens module 143.
  • the second lens module 144 is a long-focus lens module.
  • the normal lens module 142 has an equivalent focal length of 35 mm.
  • the wide-angle lens module 143 has an equivalent focal length of 18 mm to 24 mm.
  • an equivalent focal length of the long-focus lens module 144 is 3 to 20 times the equivalent focal length of the wide-angle lens module 143.
  • image sensors having corresponding resolutions may be flexibly configured for the wide-angle lens module 143 and the long-focus lens module 144.
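The perspectives implied by these equivalent focal lengths follow from the pinhole relation FOV = 2·arctan(d / 2f), with d ≈ 43.27 mm the full-frame sensor diagonal. A minimal sketch (the 180 mm value is simply an illustrative 10× multiple of the 18 mm wide angle, not a figure from the application):

```python
import math

FULL_FRAME_DIAGONAL_MM = 43.27  # diagonal of a 36 mm x 24 mm sensor

def diagonal_fov_deg(equiv_focal_mm: float) -> float:
    """Diagonal field of view for a 35 mm equivalent focal length."""
    return math.degrees(2 * math.atan(FULL_FRAME_DIAGONAL_MM / (2 * equiv_focal_mm)))

# Wide-angle (18 mm), normal (35 mm) and a hypothetical 10x long-focus lens (180 mm):
for f in (18, 35, 180):
    print(f"{f} mm -> {diagonal_fov_deg(f):.1f} deg")
```

This shows why the long-focus module resolves a small target: its perspective shrinks to a small fraction of the wide-angle one, so the same sensor covers a much narrower scene.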
  • the first lens module 145 and the second lens module 144 are arranged in an array.
  • the normal lens module 142, the wide-angle lens module 143 and the long-focus lens module 144 are arranged in an isosceles triangle.
  • the first lens module 145′ and the second lens module 144′ may alternatively be linearly arranged.
  • the normal lens module 142′, the wide-angle lens module 143′ and the long-focus lens module 144′ are linearly arranged at equal intervals.
  • the long-focus lens module 144′ and the wide-angle lens module 143′ are arranged on two sides of the normal lens module 142′.
  • before the multi-camera photographing apparatus 14 is started, the controller 16 first controls the gimbal 15 to correct the optical axes of the first lens module 145 and the second lens module 144 and records optical axis correction parameters of the first lens module 145 and the second lens module 144. It may be understood that the optical axis correction of the first lens module 145 and the second lens module 144 may alternatively be completed at delivery by using an existing lens-module optical axis correction technology.
  • in addition, the controller 16 needs to perform registration analysis on the first lens module 145 and the second lens module 144 and generate a corresponding registration parameter.
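The registration parameter can be as simple as the pixel offset between the two aligned views. A NumPy sketch using phase correlation, offered as a stand-in since the application does not specify the registration algorithm:

```python
import numpy as np

def register_translation(ref, moving):
    """Estimate the (dy, dx) shift that aligns `moving` with `ref` by
    phase correlation, i.e. np.roll(moving, (dy, dx), axis=(0, 1)) == ref.
    A simple stand-in for the registration analysis in the text."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(moving)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12          # normalized cross-power spectrum
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    if dy > h // 2:                          # wrap to signed shifts
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx
```

In practice the two lens modules also differ in scale and rotation, so a full registration would estimate a homography rather than a pure translation; the translation case keeps the sketch self-contained.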
  • in the following, an example in which the first lens module 145 includes the normal lens module 142 and the wide-angle lens module 143 is used.
  • first, the wide-angle lens module 143 photographs image data.
  • the controller 16 obtains location information of the target object based on the image data obtained by the wide-angle lens module 143, and adjusts the optical axis of the second lens module 144, so that the optical axis of the second lens module 144 faces the target object of interest.
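Mapping the target's pixel location to a pan/tilt adjustment of the long-focus optical axis can be sketched with a pinhole-camera model. The helper name and the field-of-view arguments below are illustrative assumptions, not values from the application:

```python
import math

def axis_adjustment_deg(target_px, image_size, hfov_deg, vfov_deg):
    """Pan/tilt angles (degrees) that point the long-focus optical axis
    at a target located at pixel `target_px` in the wide-angle frame.
    Pinhole-camera sketch; assumes the two modules are co-located."""
    w, h = image_size
    # Focal lengths in pixels, derived from the horizontal/vertical FOV.
    fx = (w / 2) / math.tan(math.radians(hfov_deg) / 2)
    fy = (h / 2) / math.tan(math.radians(vfov_deg) / 2)
    dx = target_px[0] - w / 2   # offset from the image center
    dy = target_px[1] - h / 2
    pan = math.degrees(math.atan2(dx, fx))
    tilt = math.degrees(math.atan2(dy, fy))
    return pan, tilt
```

A target at the image center needs no adjustment; a target at the right edge of a 90° horizontal field of view needs a 45° pan.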
  • the image processing chip 17 obtains a first image photographed by the first lens module 145, the first image including the target object.
  • the target object may include, but is not limited to, objects such as a person, an animal, a plant, an automobile and a building.
  • the first image may be a single frame of image that is captured by the first lens module.
  • the aerial vehicle may further include an image recognition chip, configured to obtain a feature parameter of the target object, where the feature parameter is used for tracing the target object and may be used for representing an overall contour of the target object.
  • the multi-camera photographing apparatus is mounted on the aerial vehicle.
  • the aerial vehicle can target an automobile by obtaining a feature parameter such as a license number and/or an automobile body contour and/or an automobile color of the automobile, and further trace the automobile.
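A "feature parameter" such as automobile body color could, for instance, be a normalized color histogram compared by histogram intersection. This descriptor is purely illustrative; the application does not specify one:

```python
import numpy as np

def color_feature(bgr_roi, bins=8):
    """Normalized per-channel color histogram over a target region,
    usable as a simple feature parameter for re-identifying a target
    such as an automobile by its body color (illustrative sketch)."""
    hist = np.concatenate([
        np.histogram(bgr_roi[..., c], bins=bins, range=(0, 256))[0]
        for c in range(3)
    ]).astype(float)
    return hist / hist.sum()

def same_target(feat_a, feat_b, threshold=0.5):
    # Histogram intersection; above the threshold => likely the same object.
    return float(np.minimum(feat_a, feat_b).sum()) > threshold
```

A real tracker would combine several such parameters (contour, license plate text) rather than rely on color alone.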
  • the image processing chip 17 obtains a second image photographed by the second lens module 144, the second image also including the target object, and a quantity of pixels of the target object included in the second image being greater than a quantity of pixels of the target object in the first image.
  • the second lens module 144 is a long-focus lens module.
  • the image processing chip 17 performs image processing on the first image and the second image based on the registration parameter obtained by the controller 16 and generates an output image.
  • processing manners for the first image and the second image may be image compositing.
  • the image processing chip 17 replaces an area 211 in the first image 210 that includes the target object with the second image 220, and as shown in FIG. 5, determines the first image 210 including the second image 220 as the output image. That is, the corresponding area in the large-perspective image is replaced with the clear small-perspective image that is photographed, so that a clear image of the target object can be seen in the large-perspective image.
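The compositing manner amounts to writing the registered, resampled telephoto image into the target's bounding box in the wide-angle frame. A minimal NumPy sketch, assuming registration has already been done; nearest-neighbour resampling stands in for a real resize:

```python
import numpy as np

def composite(first_img, second_img, roi):
    """Replace the area of the wide-angle first image that contains the
    target (bounding box roi = (x, y, w, h)) with the sharper telephoto
    second image, producing a FIG. 5-style output."""
    x, y, w, h = roi
    sh, sw = second_img.shape[:2]
    # Nearest-neighbour resample of the telephoto image to the ROI size.
    rows = np.arange(h) * sh // h
    cols = np.arange(w) * sw // w
    patch = second_img[rows[:, None], cols[None, :]]
    out = first_img.copy()
    out[y:y + h, x:x + w] = patch
    return out
```

A production pipeline would also blend the patch edges so the seam between the two perspectives is not visible.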
  • combination manners for the first image and the second image may alternatively be image overlaying.
  • the image processing chip 17 overlays the second image 230 on the first image 240, and determines the first image 240 and the second image 230 that are overlaid as the output image.
  • the second image 230 may be overlaid at a position such as the top-right, top-left, bottom-right or bottom-left corner of the first image 240, and the second image 230 may be zoomed in or out according to a user operation.
  • for example, the second image 230 is zoomed in through a double-touch operation to cover the first image 240.
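The overlay manner is a picture-in-picture paste at a chosen corner. A minimal NumPy sketch; the corner names and margin are illustrative choices, not specified in the application:

```python
import numpy as np

def overlay_pip(first_img, second_img, corner="top_right", margin=10):
    """Overlay the telephoto second image on the wide-angle first image
    as a picture-in-picture inset at one of the four corners,
    producing a FIG. 6-style output."""
    H, W = first_img.shape[:2]
    h, w = second_img.shape[:2]
    x = margin if "left" in corner else W - w - margin
    y = margin if "top" in corner else H - h - margin
    out = first_img.copy()
    out[y:y + h, x:x + w] = second_img
    return out
```

Responding to a zoom gesture would simply resize the inset (up to full screen) and re-run the paste.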
  • in this way, the range of a large-perspective scene and the precision of a small-perspective scene are both obtained, producing an output image having relatively high precision.
  • an embodiment of the present application provides a method for processing an image, where the image is photographed by a multi-camera photographing apparatus, the multi-camera photographing apparatus including a first lens module and a second lens module, and the method may include the following steps.
  • the optical axis correction of the first lens module and the second lens module may be completed by using an optical axis correction algorithm of a controller located in an aerial vehicle, or may be completed at delivery by using an existing lens-module optical axis correction technology.
  • the controller 16 needs to perform registration analysis on the first lens module 145 and the second lens module 144 and generate the corresponding registration parameter.
  • the registration parameter may be used as a basis of subsequent image processing.
  • Obtaining location information of a target object belongs to the prior art, and details are not described herein again.
  • the target object may include, but is not limited to, objects such as a person, an animal, a plant, an automobile and a building.
  • the first image may be a single frame of image that is captured by the first lens module.
  • the method may further include: obtaining a feature parameter of the target object, where the feature parameter is used for tracing the target object and may be used for representing an overall contour of the target object.
  • the multi-camera photographing apparatus is mounted on the aerial vehicle.
  • the aerial vehicle can target an automobile by obtaining a feature parameter such as a license number and/or an automobile body contour and/or an automobile color of the automobile, and further trace the automobile.
  • a person can be traced by obtaining a feature parameter such as a facial contour and/or a body contour and/or a clothes color of the person.
  • the feature parameter of the target object may be obtained by photographing the target object by the first lens module, or may be obtained from a device such as a server or another intelligent terminal through wireless transmission. This is not limited in this embodiment of the present application.
  • the location information of the target object may be obtained by using a currently known technology, and details are not described herein again.
  • S105: Adjust an optical axis of the second lens module based on the location information of the target object, so that the optical axis faces a direction of the target object.
  • S106: Obtain a second image photographed by the second lens module, the second image including the target object, and a quantity of pixels of the target object in the second image being greater than a quantity of pixels of the target object in the first image.
  • the target object included in the second image and the target object included in the first image are the same target object, and the second image contains more pixels of the target object. Therefore, the target object in the second image is clearer.
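Why the second image contains more pixels of the target: with the same sensor resolution and pixel pitch, the target's linear size on the sensor scales with focal length, so its pixel count scales with the square of the focal-length ratio. A small sketch (the 180 mm / 18 mm pair is an illustrative 10× combination, assuming the target stays fully in frame):

```python
def target_pixel_gain(f_tele_mm: float, f_wide_mm: float) -> float:
    """Approximate factor by which the pixel count on a target grows
    when imaged through a longer lens on an identical sensor:
    linear size scales with focal length, area with its square."""
    return (f_tele_mm / f_wide_mm) ** 2

# A 10x long-focus lens puts ~100x more pixels on the same target:
print(target_pixel_gain(180, 18))
```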
  • S107: Perform image processing on the first image and the second image, and generate an output image.
  • combination manners for the first image and the second image may be image compositing. Specifically, as shown in FIG. 4, an area 211 in the first image 210 that includes the target object is replaced with the second image 220, and as shown in FIG. 5, the first image 210 including the second image 220 is determined as the output image. That is, the corresponding area in the large-perspective image is replaced with the clear small-perspective image that is photographed, so that a clear image of the target object can be seen in the large-perspective image.
  • combination manners for the first image and the second image may alternatively be image overlaying.
  • the second image 230 is overlaid on the first image 240 , and the first image 240 and the second image 230 that are overlaid are determined as the output image.
  • the second image 230 may be overlaid at positions such as a top right corner, a top left corner, a bottom right corner, or a bottom left corner of the first image 240 , and the second image 230 may be zoomed in or zoomed out according to a user operation.
  • the second image 230 is zoomed in through a double-touch operation, so that the second image 230 is displayed in full screen, and the first image 240 is zoomed out and moved to the position originally occupied by the second image 230 .
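The overlay manner amounts to a picture-in-picture layout. A minimal sketch, in which the corner names and the margin value are illustrative assumptions:

```python
import numpy as np

def overlay_pip(first, second, corner="top_right", margin=10):
    """Overlay the second image on the first as a picture-in-picture
    thumbnail; corner names and margin are illustrative choices."""
    out = first.copy()
    H, W = first.shape[:2]
    h, w = second.shape[:2]
    if corner == "top_right":
        y, x = margin, W - w - margin
    elif corner == "top_left":
        y, x = margin, margin
    elif corner == "bottom_right":
        y, x = H - h - margin, W - w - margin
    else:  # bottom_left
        y, x = H - h - margin, margin
    out[y:y + h, x:x + w] = second
    return out

wide = np.zeros((480, 640, 3), dtype=np.uint8)       # first image 240
thumb = np.full((120, 160, 3), 255, dtype=np.uint8)  # second image 230
pip = overlay_pip(wide, thumb)
```

Responding to the user's zoom operations would then simply swap which image is rendered full screen and which is rendered as the thumbnail.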
  • an embodiment of the present application provides a method for processing an image, where the image is photographed by a multi-camera photographing apparatus, the multi-camera photographing apparatus including a first lens module and a second lens module, and the method may include the following steps.
  • Step S 11 Obtain a first image photographed by the first lens module, the first image including a target object.
  • the target object may include, but is not limited to, objects such as a person, an animal, a plant, an automobile and a building.
  • the first image may be a single frame of image that is captured by the first lens module.
  • the method may further include: obtaining a feature parameter of the target object, where the feature parameter is used for tracing the target object and may be used for representing an overall contour of the target object.
  • the multi-camera photographing apparatus is mounted on an aerial vehicle.
  • the aerial vehicle can target an automobile by obtaining a feature parameter such as a license number and/or an automobile body contour and/or an automobile color of the automobile, and further trace the automobile.
  • a person can be traced by obtaining a feature parameter such as a facial contour and/or a body contour and/or a clothes color of the person.
  • the feature parameter of the target object may be obtained by photographing the target object by the first lens module, or may be obtained from a device such as a server or another intelligent terminal through wireless transmission. This is not limited in this embodiment of the present application.
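The application leaves the representation of a feature parameter open. As one hypothetical stand-in for a color-based feature (e.g. an automobile color or a clothes color), a normalized per-channel histogram can be compared between frames:

```python
import numpy as np

def color_feature(patch, bins=8):
    """A normalized per-channel color histogram as one possible feature
    parameter; the application does not fix a representation."""
    hist = [np.histogram(patch[..., c], bins=bins, range=(0, 256))[0]
            for c in range(3)]
    feat = np.concatenate(hist).astype(float)
    return feat / feat.sum()   # size-independent, so patch sizes may differ

red_car = np.zeros((32, 32, 3), dtype=np.uint8)
red_car[..., 0] = 200
blue_car = np.zeros((32, 32, 3), dtype=np.uint8)
blue_car[..., 2] = 200

# An L1 histogram distance separates differently colored targets.
dist = float(np.abs(color_feature(red_car) - color_feature(blue_car)).sum())
```

A tracker would recompute the feature on candidate areas of each new frame and follow the candidate whose distance to the stored feature parameter is smallest.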
  • Step S 12 Obtain a second image photographed by the second lens module, the second image including the target object, and a quantity of pixels of the target object included in the second image being greater than a quantity of pixels of the target object in the first image.
  • the second lens module photographs the second image, the second image including the target object.
  • a focal length of the first lens module and a focal length of the second lens module are both fixed focal lengths.
  • The image sensor paired with a zoom lens is fixed and cannot be changed, whereas image sensors of appropriate resolutions can be flexibly configured for the fixed-focus first lens module and second lens module.
  • a zoom lens is generally heavier than a fixed-focus lens, and the first lens module and the second lens module are both fixed-focus lenses. Lighter materials can therefore be used in their design and the mechanical apparatus of a zoom lens is eliminated, so that the load on the aerial vehicle can be reduced.
  • the focal length of the first lens module is different from the focal length of the second lens module.
  • the focal length of the first lens module is less than the focal length of the second lens module.
  • the first lens module may include a normal lens module and/or a wide-angle lens module and the second lens module may include a long-focus lens module. The first image photographed by the first lens module belongs to a large-perspective scene and the second image photographed by the second lens module belongs to a small-perspective scene.
  • When the first lens module includes the normal lens module and the wide-angle lens module, the normal lens module, the wide-angle lens module and the long-focus lens module may be arranged in a triangle or may be linearly arranged.
  • the normal lens module may have an equivalent focal length of 35 mm
  • the wide-angle lens module may have an equivalent focal length of 18 mm to 24 mm
  • an equivalent focal length of the long-focus lens module may be 3 to 20 times the equivalent focal length of the wide-angle lens module.
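These equivalent focal lengths imply the perspective sizes discussed above. Assuming the usual 35 mm full-frame reference (36 mm sensor width), the horizontal field of view follows from simple trigonometry:

```python
import math

def hfov_deg(equiv_focal_mm, sensor_width_mm=36.0):
    """Horizontal field of view implied by a 35 mm-equivalent focal
    length (full-frame sensor width of 36 mm)."""
    return 2 * math.degrees(math.atan(sensor_width_mm / (2 * equiv_focal_mm)))

wide_fov = hfov_deg(18)       # ~90 degrees: large-perspective scene
normal_fov = hfov_deg(35)     # ~54 degrees
tele_fov = hfov_deg(18 * 10)  # ~11 degrees: small-perspective scene,
                              # long-focus at 10x the wide-angle focal length
```

This is why the long-focus second lens module concentrates its pixels on a much narrower scene, yielding more pixels on the target object.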
  • Step S 13 Perform image processing on the first image and the second image, and generate an output image.
  • combination manners for the first image and the second image may be image compositing. Specifically, as shown in FIG. 4 , an area 211 in the first image 210 that includes the target object is replaced with the second image 220 , and as shown in FIG. 5 , the first image 210 including the second image 220 is determined as the output image. That is, the corresponding area of the large-perspective image is replaced with the clearer small-perspective photograph, so that a clear image of the target object can be seen within the large-perspective image.
  • combination manners for the first image and the second image may alternatively be image overlaying.
  • the second image 230 is overlaid on the first image 240 , and the first image 240 and the second image 230 that are overlaid are determined as the output image.
  • the second image 230 may be overlaid at positions such as a top right corner, a top left corner, a bottom right corner, or a bottom left corner of the first image 240 , and the second image 230 may be zoomed in or zoomed out according to a user operation.
  • the second image 230 is zoomed in through a double-touch operation, so that the second image 230 is displayed in full screen, and the first image 240 is zoomed out and moved to the position originally occupied by the second image 230 .
  • an embodiment of the present application provides an image processing apparatus 40 , which may be configured to perform a method for processing an image disclosed in the embodiments of the present application.
  • the image is photographed by a multi-camera photographing apparatus, the multi-camera photographing apparatus including a first lens module and a second lens module.
  • the multi-camera photographing apparatus may be mounted in the image processing apparatus 40 , or may be mounted in another device and transmit, to the image processing apparatus 40 , an image photographed by the device.
  • the apparatus 40 may include a first obtaining module 41 , a second obtaining module 45 , and an image generation module 47 .
  • the first obtaining module 41 is configured to obtain a first image photographed by the first lens module, the first image including a target object.
  • the target object may include, but is not limited to, objects such as a person, an animal, a plant, an automobile and a building.
  • the first image may be a single frame of image that is captured by the first lens module.
  • the apparatus may further include a feature parameter obtaining module, configured to obtain a feature parameter of the target object before the first obtaining module 41 obtains the first image photographed by the first lens module, where the feature parameter is used for tracing the target object and may be used for representing an overall contour of the target object.
  • the multi-camera photographing apparatus is mounted on an aerial vehicle.
  • the aerial vehicle can target an automobile by obtaining a feature parameter such as a license number and/or an automobile body contour and/or an automobile color of the automobile, and further trace the automobile.
  • the second obtaining module 45 is configured to obtain a second image photographed by the second lens module, the second image including the target object, and a quantity of pixels of the target object included in the second image being greater than a quantity of pixels of the target object in the first image.
  • the second lens module photographs the second image, the second image including the target object.
  • a focal length of the first lens module and a focal length of the second lens module are both fixed focal lengths.
  • the focal length of the first lens module is less than the focal length of the second lens module.
  • the first lens module may include a normal lens module and/or a wide-angle lens module and the second lens module may include a long-focus lens module. The first image photographed by the first lens module belongs to a large-perspective scene and the second image photographed by the second lens module belongs to a small-perspective scene.
  • When the first lens module includes the normal lens module and the wide-angle lens module, the normal lens module, the wide-angle lens module and the long-focus lens module may be arranged in a triangle or may be linearly arranged.
  • the normal lens module may have an equivalent focal length of 35 mm
  • the wide-angle lens module may have an equivalent focal length of 18 mm to 24 mm
  • an equivalent focal length of the long-focus lens module may be 3 to 20 times the equivalent focal length of the wide-angle lens module.
  • the image generation module 47 is configured to determine an output image based on the first image and the second image.
  • combination manners for the first image and the second image may be image compositing. Specifically, as shown in FIG. 4 , an area 211 in the first image 210 that includes the target object is replaced with the second image 220 , and as shown in FIG. 5 , the first image 210 including the second image 220 is determined as the output image.
  • combination manners for the first image and the second image may alternatively be image overlaying.
  • the second image 230 is overlaid on the first image 240 , and the first image 240 and the second image 230 that are overlaid are determined as the output image.
  • the second image 230 may be overlaid at positions such as a top right corner, a top left corner, a bottom right corner, or a bottom left corner of the first image 240 , and the second image 230 may be zoomed in or zoomed out according to a user operation.
  • the second image 230 is zoomed in through a double-touch operation, to cover the first image 240 .
  • the first obtaining module 41 and the second obtaining module 45 may be flight-control chips or vision chips in the aerial vehicle.
  • the image generation module 47 may be an independent image processing chip or an image processing unit integrated in a flight-control chip or a vision chip.
  • This embodiment of the present application provides an image processing apparatus.
  • the first obtaining module obtains a first image photographed by the first lens module, the first image including a target object; then the second obtaining module obtains a second image photographed by the second lens module; and the image generation module generates an output image based on the first image and the second image, so that both the range of a large-perspective scene and the precision of a target object in a small-perspective scene are obtained.
  • an embodiment of the present application provides an image processing apparatus 50 , which may be configured to perform a method for processing an image disclosed in the embodiments of the present application.
  • the image is photographed by a multi-camera photographing apparatus, the multi-camera photographing apparatus including a first lens module and a second lens module.
  • the multi-camera photographing apparatus may be mounted in the image processing apparatus 50 , or may be mounted in another device and transmit, to the image processing apparatus 50 , an image photographed by the device.
  • the apparatus 50 may include a first obtaining module 51 , a location obtaining module 52 , an adjustment module 53 , a second obtaining module 55 , a correction module 56 , an image generation module 57 and an analysis module 54 .
  • the first obtaining module 51 is configured to obtain a first image photographed by the first lens module, the first image including a target object.
  • the location obtaining module 52 is configured to obtain location information of the target object.
  • the adjustment module 53 is configured to adjust an optical axis of the second lens module based on the location information of the target object, so that the optical axis of the second lens module faces the target object.
  • the second obtaining module 55 is configured to obtain a second image photographed by the second lens module, the second image including the target object, and a quantity of pixels of the target object included in the second image being greater than a quantity of pixels of the target object in the first image.
  • the correction module 56 is configured to correct optical axes of the first lens module and the second lens module or to correct the first image and the second image based on an optical-axis offset of the multi-camera photographing apparatus.
  • the first image and the second image may be corrected based on the optical-axis offset of the multi-camera photographing apparatus. For example, distortion of the first image and the second image is corrected.
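As one hedged illustration of offset-based correction, a calibrated principal-point offset can be cancelled by a pure pixel translation (real correction would also model lens distortion, which this sketch omits; the offset value is a hypothetical calibration result):

```python
import numpy as np

def compensate_offset(image, offset_px):
    """Shift an image to cancel a calibrated optical-axis offset of
    (dy, dx) pixels; vacated borders are zero-filled."""
    dy, dx = offset_px
    H, W = image.shape[:2]
    out = np.zeros_like(image)
    out[max(0, -dy):min(H, H - dy), max(0, -dx):min(W, W - dx)] = \
        image[max(0, dy):min(H, H + dy), max(0, dx):min(W, W + dx)]
    return out

img = np.zeros((10, 10), dtype=np.uint8)
img[5, 5] = 255                            # a single bright pixel
shifted = compensate_offset(img, (2, 3))   # offset: 2 px down, 3 px right
```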
  • the analysis module 54 is configured to perform registration analysis on the first lens module and the second lens module and to generate a corresponding registration parameter.
  • the image generation module 57 is configured to replace an area in the first image that includes the target object with the second image based on the registration parameter, and to generate an output image of the first image including the second image.
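The registration parameter could, for instance, be an affine transform estimated from matched points between the two lens modules' images; the application does not prescribe a model. A least-squares sketch (the point matches below are illustrative):

```python
import numpy as np

def estimate_affine(src_pts, dst_pts):
    """Least-squares 2x3 affine registration parameter mapping points in
    the second image onto the first image's pixel grid (>= 3 matches)."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    A = np.hstack([src, np.ones((len(src), 1))])  # rows [x, y, 1]
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)   # solve A @ M ~= dst
    return M.T

# Illustrative matches: the second image covers the first image's area
# starting at (100, 40), magnified 4x.
src = [(0, 0), (400, 0), (0, 400), (400, 400)]
dst = [(100, 40), (200, 40), (100, 140), (200, 140)]
M = estimate_affine(src, dst)   # [[0.25, 0, 100], [0, 0.25, 40]]
```

With such a parameter, the image generation module knows exactly which area of the first image the second image should replace.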
  • the image generation module 57 may further be configured to overlay the second image on the first image and to determine the first image and the second image that are overlaid as the output image.
  • the first obtaining module obtains a first image photographed by the first lens module, the first image including a target object.
  • the location obtaining module obtains location information of the target object.
  • the adjustment module adjusts an optical axis of the second lens module based on the location information of the target object, so that the optical axis of the second lens module faces the target object.
  • the second obtaining module obtains a second image photographed by the second lens module.
  • the correction module corrects the first image and the second image based on an optical-axis offset of the multi-camera photographing apparatus.
  • the image generation module replaces an area in the corrected first image that includes the target object with the corrected second image, and determines the first image including the corrected second image as an output image.
  • both the range of a large-perspective scene and the precision of a target object in a small-perspective scene can thus be obtained, to produce an output image having a relatively desirable combination effect.
  • the first obtaining module 51 , the location obtaining module 52 , the correction module 56 , the analysis module 54 and the second obtaining module 55 may be flight-control chips or vision chips in an aerial vehicle.
  • the adjustment module 53 may be a flight-control chip or a control chip controlling rotation of a gimbal.
  • the image generation module 57 may be an independent image processing chip or an image processing unit integrated in a flight-control chip or a vision chip.
  • FIG. 11 is a schematic structural diagram of hardware of an aerial vehicle 60 for performing a method for processing an image according to an embodiment of the present application.
  • the aerial vehicle 60 may include: a body 61 , a gimbal 62 disposed on the body 61 and a multi-camera photographing apparatus 63 disposed on the gimbal 62 .
  • the multi-camera photographing apparatus 63 includes a first lens module 631 and a second lens module 632 .
  • the aerial vehicle 60 may further include at least one processor 64 and a memory 65 in communication connection with the at least one processor 64 .
  • the memory 65 stores an instruction capable of being executed by the at least one processor 64 , and the instruction is executed by the at least one processor 64 , to enable the at least one processor 64 to perform the foregoing method for processing an image.
  • one processor 64 is used as an example in FIG. 11 .
  • the processor 64 and the memory 65 may be connected by using a bus or in other manners. An example in which the processor 64 and the memory 65 are connected by using a bus is used in FIG. 11 .
  • the memory 65 is used as a non-volatile computer-readable storage medium, and can be configured to store a non-volatile software program, a non-volatile computer-executable program and a module, for example, a program instruction/module (for example, the first obtaining module 41 , the second obtaining module 45 and the image generation module 47 that are shown in FIG. 9 ; and the first obtaining module 51 , the adjustment module 53 , the second obtaining module 55 , the correction module 56 and the image generation module 57 that are shown in FIG. 10 ) correspondingly configured to perform the method for processing an image in the embodiments of the present application.
  • the processor 64 runs the non-volatile software program, the instruction and the module that are stored in the memory 65 , to perform various functional applications and data processing of the aerial vehicle 60 , that is, to implement the method for processing an image in the foregoing method embodiment.
  • the memory 65 may include a program storage area and a data storage area.
  • the program storage area may store an operating system and an application program required by at least one function.
  • the data storage area may store data created according to use of the aerial vehicle 60 and the like.
  • the memory 65 may include a high speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory or other non-volatile solid-state storage devices.
  • the memory 65 may optionally include a memory remotely arranged relative to the processor 64 , and the remote memory may be connected to the aerial vehicle 60 through a network. Instances of the network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communications network and a combination thereof.
  • the one or more modules are stored in the memory 65 , and when executed by the one or more processors 64 , perform the method for processing an image in any of the foregoing method embodiments, for example, perform steps S 11 to S 13 of the method in FIG. 8 , and implement the functions of the first obtaining module 41 , the second obtaining module 45 and the image generation module 47 that are shown in FIG. 9 and the functions of the first obtaining module 51 , the adjustment module 53 , the second obtaining module 55 , the correction module 56 and the image generation module 57 that are shown in FIG. 10 .

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Mechanical Engineering (AREA)
  • Remote Sensing (AREA)
  • Studio Devices (AREA)

Abstract

The present application discloses a method for processing an image, an image processing apparatus, a multi-camera photographing apparatus and an aerial vehicle. A first image photographed by a first lens module is obtained, the first image including a target object; then a second image photographed by a second lens module is obtained; and image processing is performed on the first image and the second image to generate an output image, so that both the range of a large-perspective scene and the precision of the target object in a small-perspective scene are obtained.

Description

    CROSS-REFERENCE RELATED APPLICATIONS
  • The present application is a continuation of U.S. application Ser. No. 16/388,292, filed on Apr. 18, 2019, which is a continuation of International Application No. PCT/CN2017/106143, filed on Oct. 13, 2017, which claims priority to Chinese Patent Application No. 2016109146811, filed on Oct. 20, 2016, all of which are hereby incorporated herein by reference in their entireties.
  • TECHNICAL FIELD
  • Implementations of the present application relate to the field of image processing technologies, and in particular, to a method for processing an image, an image processing apparatus, a multi-camera photographing apparatus, and an aerial vehicle.
  • RELATED ART
  • Aerial vehicles in the prior art are generally equipped with normal lenses for aerial photography. The focus of a normal lens is usually set at infinity, and the resolution of the image sensor is very limited. Consequently, some small targets in a ground scene cannot be clearly photographed. In addition, lenses used for aerial photography generally have a relatively narrow perspective, and therefore the photographed scene range is relatively limited. In the prior art, some aerial vehicles are equipped with zoom lenses for aerial photography, with a perspective adjustable between 10° and 90°. To clearly photograph a small target in the ground scene, the lens needs to be zoomed to a long focal length; to photograph a relatively wide scene, the lens needs to be switched to a wide angle. A zoom lens is relatively heavy, increasing the load on the aerial vehicle and reducing its flight performance. In addition, the image sensor of a zoom lens is fixed and cannot be changed.
  • In the prior art, adjusting the scene perspective by zooming the lens cannot satisfy the requirements of both a large-perspective scene and a small-perspective scene.
  • SUMMARY
  • A main technical problem to be resolved in implementations of the present application is to provide a method for processing an image, an image processing apparatus, a multi-camera photographing apparatus, and an aerial vehicle, to satisfy the requirements of both a large-perspective scene and a small-perspective scene.
  • According to a first aspect, an embodiment of the present application provides a method for processing an image, where the image is photographed by a multi-camera photographing apparatus, the multi-camera photographing apparatus including a first lens module and a second lens module, and the method includes:
  • obtaining a first image photographed by the first lens module, the first image including a target object;
  • obtaining a second image photographed by the second lens module, the second image including the target object, and a quantity of pixels of the target object included in the second image being greater than a quantity of pixels of the target object in the first image; and
  • performing image processing on the first image and the second image, and generating an output image.
  • In an embodiment of the present application, the method further includes:
  • obtaining location information of the target object; and
  • adjusting an optical axis of the second lens module based on the location information of the target object, so that the optical axis of the second lens module faces the target object.
  • In an embodiment of the present application, the performing image processing on the first image and the second image, and generating the output image includes:
  • replacing an area in the first image that includes the target object with the second image, and determining the first image obtained after the replacement as the output image.
  • In an embodiment of the present application, the performing image processing on the first image and the second image, and generating the output image includes:
  • overlaying the second image on the first image, and determining the first image and the second image that are overlaid as the output image.
  • In an embodiment of the present application, the method further includes:
  • performing optical axis correction on optical axes of the first lens module and the second lens module.
  • In an embodiment of the present application, the method further includes:
  • performing registration analysis on the first lens module and the second lens module, and generating a corresponding registration parameter.
  • In an embodiment of the present application, the performing image processing on the first image and the second image, and generating the output image includes:
  • performing image processing on the first image and the second image based on the registration parameter, and outputting the image.
  • In an embodiment of the present application, a focal length of the first lens module and a focal length of the second lens module are both fixed focal lengths.
  • In an embodiment of the present application, the first lens module includes a normal lens module and/or a wide-angle lens module and the second lens module includes a long-focus lens module.
  • In an embodiment of the present application, the first lens module includes the normal lens module and the wide-angle lens module, the normal lens module having an equivalent focal length of 35 mm, the wide-angle lens module having an equivalent focal length of 18 mm to 24 mm, and an equivalent focal length of the long-focus lens module being 3 to 20 times the equivalent focal length of the wide-angle lens module.
  • According to a second aspect, an embodiment of the present application provides a multi-camera photographing apparatus, where the multi-camera photographing apparatus includes a body, and a first lens module and a second lens module that are disposed on the body, the first lens module and the second lens module being arranged in an array on the body.
  • In an embodiment of the present application, the first lens module includes at least two lens modules.
  • In an embodiment of the present application, the first lens module and the second lens module are arranged in an isosceles triangle.
  • In an embodiment of the present application, the first lens module and the second lens module are linearly arranged.
  • In an embodiment of the present application, a focal length of the first lens module and a focal length of the second lens module are both fixed focal lengths.
  • In an embodiment of the present application, the first lens module includes a normal lens module and/or a wide-angle lens module and the second lens module includes a long-focus lens module.
  • In an embodiment of the present application, the first lens module includes the normal lens module and the wide-angle lens module, the normal lens module having an equivalent focal length of 35 mm, the wide-angle lens module having an equivalent focal length of 18 mm to 24 mm, and an equivalent focal length of the long-focus lens module being 3 to 20 times the equivalent focal length of the wide-angle lens module.
  • According to a third aspect, an embodiment of the present application provides an image processing apparatus, where the image is photographed by a multi-camera photographing apparatus, the multi-camera photographing apparatus including a first lens module and a second lens module, and the apparatus includes:
  • a first obtaining module, configured to obtain a first image photographed by the first lens module, the first image including a target object;
  • a second obtaining module, configured to obtain a second image photographed by the second lens module, the second image including the target object, and a quantity of pixels of the target object included in the second image being greater than a quantity of pixels of the target object in the first image; and
  • an image generation module, configured to perform image processing on the first image and the second image and to generate an output image.
  • In an embodiment of the present application, the apparatus further includes a location obtaining module and an adjustment module, the location obtaining module being configured to obtain location information of the target object, and the adjustment module being configured to adjust an optical axis of the second lens module based on the location information of the target object, so that the optical axis of the second lens module faces the target object.
  • In an embodiment of the present application, the image generation module is configured to replace an area in the first image that includes the target object with the second image and to output the first image obtained after the replacement as the output image.
  • In an embodiment of the present application, the image generation module is configured to overlay the second image on the first image and to output the first image and the second image that are overlaid as the output image.
  • In an embodiment of the present application, the apparatus further includes:
  • a correction module, configured to perform optical axis correction on optical axes of the first lens module and the second lens module.
  • In an embodiment of the present application, the apparatus further includes:
  • an analysis module, configured to perform registration analysis on the first lens module and the second lens module and to generate a corresponding registration parameter.
  • In an embodiment of the present application, the image generation module is further configured to perform image processing on the first image and the second image based on the registration parameter and to generate the output image.
  • In an embodiment of the present application, a focal length of the first lens module and a focal length of the second lens module are both fixed focal lengths.
  • In an embodiment of the present application, the first lens module includes a normal lens module and/or a wide-angle lens module and the second lens module includes a long-focus lens module.
  • In an embodiment of the present application, the first lens module includes the normal lens module and the wide-angle lens module, the normal lens module having an equivalent focal length of 35 mm, the wide-angle lens module having an equivalent focal length of 18 mm to 24 mm, and an equivalent focal length of the long-focus lens module being 3 to 20 times the equivalent focal length of the wide-angle lens module.
  • According to a fourth aspect, an embodiment of the present application provides an aerial vehicle, including a vehicle body, arms connected to the vehicle body, power apparatuses disposed on the arms, a multi-camera photographing apparatus connected to the vehicle body and an image processing chip, the multi-camera photographing apparatus including a first lens module and a second lens module, and the image processing chip being configured to:
  • obtain a first image photographed by the first lens module, the first image including a target object;
  • obtain a second image photographed by the second lens module, the second image including the target object, and a quantity of pixels of the target object included in the second image being greater than a quantity of pixels of the target object in the first image; and
  • perform image processing on the first image and the second image, and generate an output image.
  • In an embodiment of the present application, the image processing chip is specifically configured to:
  • replace an area in the first image that includes the target object with the second image, and determine the first image obtained after the replacement as the output image.
  • In an embodiment of the present application, the image processing chip is specifically configured to:
  • overlay the second image on the first image, and determine the first image and the second image that are overlaid as the output image.
  • In an embodiment of the present application, the aerial vehicle further includes a controller and a gimbal that are connected to the vehicle body, the controller being configured to control the gimbal to perform optical axis correction on optical axes of the first lens module and the second lens module.
  • In an embodiment of the present application, the controller is further configured to obtain location information of the target object and to adjust an optical axis of the second lens module based on the location information of the target object, so that the optical axis of the second lens module faces the target object.
  • In an embodiment of the present application, the controller is further configured to perform registration analysis on the first lens module and the second lens module and to generate a corresponding registration parameter.
  • In an embodiment of the present application, the aerial vehicle further includes a controller, configured to perform registration analysis on the first lens module and the second lens module and to generate a corresponding registration parameter.
  • In an embodiment of the present application, the image processing chip is further configured to perform image processing on the first image and the second image based on the registration parameter and to generate the output image.
  • In an embodiment of the present application, the multi-camera photographing apparatus includes a body, and a first lens module and a second lens module that are disposed on the body, the first lens module and the second lens module being arranged in an array on the body.
  • In an embodiment of the present application, the first lens module includes at least two lens modules.
  • In an embodiment of the present application, the first lens module and the second lens module are arranged in an isosceles triangle.
  • In an embodiment of the present application, the first lens module and the second lens module are linearly arranged.
  • In an embodiment of the present application, a focal length of the first lens module and a focal length of the second lens module are both fixed focal lengths.
  • In an embodiment of the present application, the first lens module includes a normal lens module and/or a wide-angle lens module and the second lens module includes a long-focus lens module.
  • In an embodiment of the present application, the first lens module includes the normal lens module and the wide-angle lens module, the normal lens module having an equivalent focal length of 35 mm, the wide-angle lens module having an equivalent focal length of 18 mm to 24 mm, and an equivalent focal length of the long-focus lens module being 3 to 20 times the equivalent focal length of the wide-angle lens module.
  • According to a fifth aspect, an embodiment of the present application provides an aerial vehicle, including: a body, a gimbal disposed on the body and a multi-camera photographing apparatus disposed on the gimbal, the multi-camera photographing apparatus including a first lens module and a second lens module, and the aerial vehicle further including:
  • at least one processor; and
  • a memory in communication connection with the at least one processor, where
  • the memory stores an instruction capable of being executed by the at least one processor, and the instruction is executed by the at least one processor, to enable the at least one processor to perform the foregoing method for processing an image.
  • The embodiments of the present application provide a method for processing an image, an image processing apparatus, a multi-camera photographing apparatus and an aerial vehicle. A first image photographed by a first lens module is obtained, the first image including a target object; a second image photographed by a second lens module is then obtained; and image processing is performed on the first image and the second image to generate an output image, so that the output image combines both the wide coverage of a large-perspective scene and the precision of the target object in a small-perspective scene.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • To more clearly describe technical solutions of embodiments of the present application, accompanying drawings required in the embodiments of the present application are briefly described below. Obviously, the accompanying drawings described below are merely some embodiments of the present application. Persons of ordinary skill in the art may derive other accompanying drawings based on these accompanying drawings without creative efforts.
  • FIG. 1 is a schematic structural diagram of an aerial vehicle according to an embodiment of the present application;
  • FIG. 2 is a schematic structural diagram of an embodiment of a multi-camera photographing apparatus in the aerial vehicle shown in FIG. 1;
  • FIG. 3 is a schematic structural diagram of another embodiment of a multi-camera photographing apparatus in the aerial vehicle shown in FIG. 1;
  • FIG. 4 is a schematic diagram of a first image and a second image;
  • FIG. 5 is a schematic diagram of an output image obtained through image compositing according to an embodiment of the present application;
  • FIG. 6 is a schematic diagram of an output image obtained through image overlaying according to an embodiment of the present application;
  • FIG. 7 is a flowchart of a method for processing an image according to an embodiment of the present application;
  • FIG. 8 is a flowchart of a method for processing an image according to another embodiment of the present application;
  • FIG. 9 is a functional block diagram of an image processing apparatus according to an embodiment of the present application;
  • FIG. 10 is a functional block diagram of an image processing apparatus according to another embodiment of the present application; and
  • FIG. 11 is a schematic structural diagram of hardware of an aerial vehicle for performing a method for processing an image according to an embodiment of the present application.
  • DETAILED DESCRIPTION
  • To make the objectives, technical solutions, and advantages of the present application clearer and more comprehensible, the following further describes the present application in detail with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely intended to explain the present application rather than limit the present application.
  • In addition, technical features in implementations of the present application described below may be combined with each other, provided that no conflict exists therebetween.
  • The embodiments of the present application are specifically described below with reference to specific accompanying drawings.
  • As shown in FIG. 1, an aerial vehicle 10 includes a housing 11, arms 12 connected to the housing 11, a power apparatus 13 disposed at one end of each arm 12, a gimbal 15 connected to the housing 11, a multi-camera photographing apparatus 14 connected to the gimbal 15, and a controller 16 and an image processing chip 17 that are disposed in the housing 11.
  • In this embodiment, there are four arms 12, that is, the aerial vehicle is a four-rotor aerial vehicle. In other possible embodiments, there may be three arms 12, six arms 12, eight arms 12, ten arms 12, or the like. The aerial vehicle 10 may alternatively be other movable objects such as a manned aerial vehicle, a model airplane, an unmanned airship, a fixed-wing unmanned aerial vehicle, or an unmanned hot air balloon.
  • The power apparatus 13 includes a motor 132 disposed at one end of the arm 12 and a propeller 131 connected to a rotating shaft of the motor 132. The rotating shaft of the motor 132 rotates to drive the propeller 131 to rotate, so as to provide a lift force for the aerial vehicle 10.
  • The gimbal 15 is configured to reduce and even eliminate vibration transferred from the power apparatus 13 to the multi-camera photographing apparatus 14, to ensure that the multi-camera photographing apparatus 14 can photograph a stable and clear image or video. The multi-camera photographing apparatus 14 is carried on the aerial vehicle 10 by using the gimbal 15 described in this embodiment.
  • As shown in FIG. 2, the multi-camera photographing apparatus 14 includes a body 141, and a first lens module 145 and a second lens module 144 that are disposed on the body 141. The first lens module 145 and the second lens module 144 are both fixed-focus lens modules. Because the first lens module 145 and the second lens module 144 are both fixed-focus lenses, a lighter material can be used during design and a mechanical apparatus of a zoom lens is eliminated.
  • In an embodiment of the present application, the first lens module 145 includes two lens modules, and the two lens modules are respectively a normal lens module 142 and a wide-angle lens module 143. The second lens module 144 is a long-focus lens module. In an embodiment of the present application, the normal lens module 142 has an equivalent focal length of 35 mm, the wide-angle lens module 143 has an equivalent focal length of 18 mm to 24 mm, and an equivalent focal length of the long-focus lens module 144 is 3 to 20 times the equivalent focal length of the wide-angle lens module 143. In this embodiment of the present application, image sensors having corresponding resolutions may be flexibly configured for the wide-angle lens module 143 and the long-focus lens module 144.
  • In an embodiment of the present application, on the premise that the first lens module 145 includes at least two lens modules, the first lens module 145 and the second lens module 144 are arranged in an array. Specifically, as shown in FIG. 2, the normal lens module 142, the wide-angle lens module 143 and the long-focus lens module 144 are arranged in an isosceles triangle. In other possible embodiments, the first lens module 145′ and the second lens module 144′ may alternatively be linearly arranged. Specifically, as shown in FIG. 3, the normal lens module 142′, the wide-angle lens module 143′ and the long-focus lens module 144′ are linearly arranged at equal intervals. The long-focus lens module 144′ and the wide-angle lens module 143′ are arranged on two sides of the normal lens module 142′.
  • Before the multi-camera photographing apparatus 14 is started, the controller 16 first controls the gimbal 15 to correct optical axes of the first lens module 145 and the second lens module 144 and to record optical axis correction parameters of the first lens module 145 and the second lens module 144. It may be understood that the optical axis correction of the first lens module 145 and the second lens module 144 may alternatively be completed at delivery by using an existing lens-module optical axis correction technology.
  • Considering differences between cameras, when the multi-camera photographing apparatus 14 is started, the controller 16 needs to perform registration analysis on the first lens module 145 and the second lens module 144 and generate a corresponding registration parameter.
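The registration analysis described above can be sketched in code. The example below is illustrative only: the function name, the affine model and the matched point pairs are assumptions, not the specific algorithm of this application. It estimates a 2x3 registration parameter from matched pixel coordinates of the two lens modules by least squares:

```python
import numpy as np

def estimate_affine(src_pts, dst_pts):
    """Estimate a 2x3 affine registration parameter by least squares.

    src_pts, dst_pts: (N, 2) arrays of matched pixel coordinates seen by
    the two lens modules (N >= 3 non-collinear points).
    """
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    # Design matrix [x, y, 1]; solve A @ X ~= dst in the least-squares sense.
    A = np.hstack([src, np.ones((len(src), 1))])
    X, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return X.T  # 2x3 matrix mapping (x, y, 1) -> (x', y')

# Matched points related by a pure translation of (+5, -3):
src = [(0, 0), (10, 0), (0, 10), (10, 10)]
dst = [(5, -3), (15, -3), (5, 7), (15, 7)]
M = estimate_affine(src, dst)
```

Once estimated, such a parameter can be stored and used to map coordinates of the second image into the first image during subsequent image processing.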
  • An example in which the first lens module 145 includes the normal lens module 142 and the wide-angle lens module 143 is used. When a target object of interest appears in a scene, the wide-angle lens module 143 photographs image data, and the controller 16 obtains location information of the target object based on the image data obtained by the wide-angle lens module 143, and adjusts the optical axis of the second lens module 144, so that the optical axis of the second lens module 144 faces the target object of interest.
  • In this case, the image processing chip 17 obtains a first image photographed by the first lens module 145, the first image including the target object. In this embodiment of the present application, the target object may include, but is not limited to, objects such as a person, an animal, a plant, an automobile and a building. The first image may be a single frame of image that is captured by the first lens module. The aerial vehicle may further include an image recognition chip, configured to obtain a feature parameter of the target object, where the feature parameter is used for tracing the target object and may be used for representing an overall contour of the target object. For example, the multi-camera photographing apparatus is mounted on the aerial vehicle. The aerial vehicle can target an automobile by obtaining a feature parameter such as a license number and/or an automobile body contour and/or an automobile color of the automobile, and further trace the automobile.
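As a minimal sketch of how such a feature parameter might be used for tracing, the example below represents a target (for example, an automobile of a given color) by a normalized color histogram and re-identifies it by histogram intersection. The bin count, threshold and function names are illustrative assumptions, not the recognition method of this application:

```python
import numpy as np

def color_histogram(patch, bins=8):
    """Per-channel color histogram used as a simple feature parameter."""
    patch = np.asarray(patch)
    hist = [np.histogram(patch[..., c], bins=bins, range=(0, 256))[0]
            for c in range(patch.shape[-1])]
    hist = np.concatenate(hist).astype(float)
    return hist / hist.sum()  # normalize so patch size does not matter

def matches(feature, candidate, threshold=0.9):
    """Histogram intersection: 1.0 means identical color distribution."""
    return np.minimum(feature, candidate).sum() >= threshold

# A red "automobile" patch is recognised again; a blue patch is rejected:
red = np.zeros((16, 16, 3), dtype=np.uint8); red[..., 0] = 200
blue = np.zeros((16, 16, 3), dtype=np.uint8); blue[..., 2] = 200
feat = color_histogram(red)
```

In practice the feature parameter could equally be a contour descriptor or a learned embedding; the point is only that it is compact and comparable across frames.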
  • Next, the image processing chip 17 obtains a second image photographed by the second lens module 144, the second image also including the target object, and a quantity of pixels of the target object included in the second image being greater than a quantity of pixels of the target object in the first image. In an embodiment of the present application, the second lens module 144 is a long-focus lens module.
  • Finally, the image processing chip 17 performs image processing on the first image and the second image based on the registration parameter obtained by the controller 16 and generates an output image.
  • In an optional implementation, a processing manner for the first image and the second image may be image compositing. Specifically, as shown in FIG. 4, the image processing chip 17 replaces an area 211 in the first image 210 that includes the target object with the second image 220, and as shown in FIG. 5, determines the first image 210 including the second image 220 as the output image. That is, the corresponding area in the large-perspective image is replaced with the clear small-perspective image that is photographed, so that a clear image of the target object can be seen in the large-perspective image.
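The compositing mode can be sketched as a region replacement. The snippet below assumes the second image has already been warped (using the registration parameter) to the scale and position of area 211; the function name and the placement argument are illustrative:

```python
import numpy as np

def composite(first, second, top_left):
    """Replace the target-object area in the wide image with the
    long-focus image (image compositing)."""
    out = first.copy()      # leave the original first image untouched
    y, x = top_left
    h, w = second.shape[:2]
    out[y:y + h, x:x + w] = second  # paste the sharper crop in place
    return out

wide = np.zeros((100, 100, 3), dtype=np.uint8)    # first image (dark)
tele = np.full((20, 20, 3), 255, dtype=np.uint8)  # second image (bright)
out = composite(wide, tele, top_left=(40, 40))
```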
  • In an optional implementation, a combination manner for the first image and the second image may alternatively be image overlaying. Specifically, as shown in FIG. 6, the image processing chip 17 overlays the second image 230 on the first image 240, and determines the first image 240 and the second image 230 that are overlaid as the output image. The second image 230 may be overlaid at positions such as a top right corner, a top left corner, a bottom right corner, or a bottom left corner of the first image 240, and the second image 230 may be zoomed in or zoomed out according to a user operation. For example, the second image 230 is zoomed in through a double-touch operation, to cover the first image 240.
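The overlay mode can likewise be sketched as a picture-in-picture paste at a chosen corner. The corner names and the margin parameter below are illustrative assumptions:

```python
import numpy as np

def overlay(first, second, corner="top_right", margin=4):
    """Overlay the second image on a corner of the first image
    (picture-in-picture image overlaying)."""
    out = first.copy()
    H, W = first.shape[:2]
    h, w = second.shape[:2]
    y, x = {
        "top_left": (margin, margin),
        "top_right": (margin, W - w - margin),
        "bottom_left": (H - h - margin, margin),
        "bottom_right": (H - h - margin, W - w - margin),
    }[corner]
    out[y:y + h, x:x + w] = second
    return out

wide = np.zeros((100, 100, 3), dtype=np.uint8)
tele = np.full((20, 20, 3), 255, dtype=np.uint8)
out = overlay(wide, tele, corner="top_right")
```

A zoom gesture would then amount to re-running the paste with a resized second image, up to covering the first image entirely.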
  • According to the aerial vehicle in the present application, when a target of interest appears in a scene, both the wide coverage of a large-perspective scene and the precision of a small-perspective scene can be combined, to obtain an output image having relatively high precision.
  • As shown in FIG. 7, an embodiment of the present application provides a method for processing an image, where the image is photographed by a multi-camera photographing apparatus, the multi-camera photographing apparatus including a first lens module and a second lens module, and the method may include the following steps.
  • S101: Perform optical axis correction on the first lens module and the second lens module.
  • The optical axis correction of the first lens module and the second lens module may be completed by using an optical axis correction algorithm of a controller located in an aerial vehicle, or may be completed at delivery by using an existing lens-module optical axis correction technology.
  • S102: Perform registration analysis on the first lens module and the second lens module, and generate a corresponding registration parameter.
  • Considering differences between cameras, when the multi-camera photographing apparatus 14 is started, the controller 16 needs to perform registration analysis on the first lens module 145 and the second lens module 144 and generate the corresponding registration parameter. The registration parameter may be used as a basis of subsequent image processing.
  • S103: Obtain a first image photographed by the first lens module, the first image including a target object.
  • In this embodiment of the present application, the target object may include, but is not limited to, objects such as a person, an animal, a plant, an automobile and a building. The first image may be a single frame of image that is captured by the first lens module. In an embodiment of the present application, before step S103, the method may further include: obtaining a feature parameter of the target object, where the feature parameter is used for tracing the target object and may be used for representing an overall contour of the target object. For example, the multi-camera photographing apparatus is mounted on the aerial vehicle. The aerial vehicle can target an automobile by obtaining a feature parameter such as a license number and/or an automobile body contour and/or an automobile color of the automobile, and further trace the automobile. For another example, a person can be traced by obtaining a feature parameter such as a facial contour and/or a body contour and/or a clothes color of the person. The feature parameter of the target object may be obtained by photographing the target object by the first lens module, or may be obtained from a device such as a server or another intelligent terminal through wireless transmission. This is not limited in this embodiment of the present application.
  • S104: Obtain location information of the target object.
  • The location information of the target object may be obtained by using a currently known technology, and details are not described herein again.
  • S105: Adjust an optical axis of the second lens module based on the location information of the target object, so that the optical axis faces the target object.
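Step S105 can be sketched with the pinhole camera model: the target's pixel offset from the image center of the first lens module is converted into the yaw and pitch by which the optical axis of the second lens module should be rotated. The function name and the focal length in pixels are illustrative assumptions, not the control method of this application:

```python
import math

def axis_adjustment(target_px, image_size, focal_px):
    """Yaw/pitch (degrees) that point the optical axis at the target.

    target_px:  (x, y) pixel location of the target in the first image
    image_size: (width, height) of the first image
    focal_px:   focal length of the first lens module, in pixels
    """
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    dx, dy = target_px[0] - cx, target_px[1] - cy
    yaw = math.degrees(math.atan2(dx, focal_px))    # right is positive
    pitch = math.degrees(math.atan2(dy, focal_px))  # down is positive
    return yaw, pitch

# A target already on the optical axis needs no adjustment:
assert axis_adjustment((960, 540), (1920, 1080), 1200) == (0.0, 0.0)
# A target offset horizontally by one focal length subtends 45 degrees:
yaw, pitch = axis_adjustment((2160, 540), (1920, 1080), 1200)
```

The resulting angles would be sent to the gimbal (or to the second lens module's own actuator) as relative corrections.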
  • S106: Obtain a second image photographed by the second lens module, the second image including the target object, and a quantity of pixels of the target object included in the second image being greater than a quantity of pixels of the target object in the first image.
  • The target object included in the second image and the target object included in the first image are the same target object, and there are more pixels of the target object in the second image. Therefore, the target object in the second image is clearer.
  • S107: Perform image processing on the first image and the second image, and generate an output image.
  • In an optional implementation, a combination manner for the first image and the second image may be image compositing. Specifically, as shown in FIG. 4, an area 211 in the first image 210 that includes the target object is replaced with the second image 220, and as shown in FIG. 5, the first image 210 including the second image 220 is determined as the output image. That is, the corresponding area in the large-perspective image is replaced with the clear small-perspective image that is photographed, so that a clear image of the target object can be seen in the large-perspective image.
  • In an optional implementation, a combination manner for the first image and the second image may alternatively be image overlaying. Specifically, as shown in FIG. 6, the second image 230 is overlaid on the first image 240, and the first image 240 and the second image 230 that are overlaid are determined as the output image. The second image 230 may be overlaid at positions such as a top right corner, a top left corner, a bottom right corner, or a bottom left corner of the first image 240, and the second image 230 may be zoomed in or zoomed out according to a user operation. For example, the second image 230 is zoomed in through a double-touch operation, so that the second image 230 is displayed in full screen, and the first image 240 is zoomed out and may be moved to the position at which the second image 230 was originally displayed.
  • As shown in FIG. 8, an embodiment of the present application provides a method for processing an image, where the image is photographed by a multi-camera photographing apparatus, the multi-camera photographing apparatus including a first lens module and a second lens module, and the method may include the following steps.
  • Step S11: Obtain a first image photographed by the first lens module, the first image including a target object.
  • In this embodiment of the present application, the target object may include, but is not limited to, objects such as a person, an animal, a plant, an automobile and a building. The first image may be a single frame of image that is captured by the first lens module. In an embodiment of the present application, before step S11, the method may further include: obtaining a feature parameter of the target object, where the feature parameter is used for tracing the target object and may be used for representing an overall contour of the target object. For example, the multi-camera photographing apparatus is mounted on an aerial vehicle. The aerial vehicle can target an automobile by obtaining a feature parameter such as a license number and/or an automobile body contour and/or an automobile color of the automobile, and further trace the automobile. For another example, a person can be traced by obtaining a feature parameter such as a facial contour and/or a body contour and/or a clothes color of the person. The feature parameter of the target object may be obtained by photographing the target object by the first lens module, or may be obtained from a device such as a server or another intelligent terminal through wireless transmission. This is not limited in this embodiment of the present application.
  • Step S12: Obtain a second image photographed by the second lens module, the second image including the target object, and a quantity of pixels of the target object included in the second image being greater than a quantity of pixels of the target object in the first image.
  • In this embodiment of the present application, after an optical axis of the second lens module faces the target object, the second lens module photographs the second image, the second image including the target object.
  • In an optional implementation, a focal length of the first lens module and a focal length of the second lens module are both fixed focal lengths. The image sensor paired with a zoom lens is fixed and cannot be changed, while image sensors having corresponding resolutions may be flexibly configured for the fixed-focus first lens module and second lens module. In addition, a zoom lens is generally heavier than a fixed-focus lens. Because the first lens module and the second lens module are both fixed-focus lenses, a lighter material can be used during design and the mechanical apparatus of a zoom lens is eliminated, so that the load of the aerial vehicle can be reduced.
  • In this embodiment of the present application, the focal length of the first lens module is different from the focal length of the second lens module. Preferably, the focal length of the first lens module is less than the focal length of the second lens module. The first lens module may include a normal lens module and/or a wide-angle lens module and the second lens module may include a long-focus lens module. The first image photographed by the first lens module belongs to a large-perspective scene and the second image photographed by the second lens module belongs to a small-perspective scene.
  • When the first lens module includes the normal lens module and the wide-angle lens module, the normal lens module, the wide-angle lens module and the long-focus lens module may be arranged in a triangle or may be linearly arranged. Optionally, the normal lens module may have an equivalent focal length of 35 mm, the wide-angle lens module may have an equivalent focal length of 18 mm to 24 mm, and an equivalent focal length of the long-focus lens module may be 3 to 20 times the equivalent focal length of the wide-angle lens module.
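For reference, an equivalent focal length can be converted into the diagonal field of view it implies on a full 35 mm frame (36 mm x 24 mm), which makes the wide-angle/normal/long-focus relationship above concrete. The 3x multiplier used below is one point in the stated 3-to-20 range, chosen for illustration:

```python
import math

FULL_FRAME_DIAGONAL_MM = math.hypot(36.0, 24.0)  # diagonal of a 35 mm frame

def diagonal_fov_deg(equiv_focal_mm):
    """Diagonal field of view implied by an equivalent focal length."""
    return math.degrees(
        2 * math.atan(FULL_FRAME_DIAGONAL_MM / (2 * equiv_focal_mm)))

wide = diagonal_fov_deg(24)      # wide-angle module, upper end of 18-24 mm
normal = diagonal_fov_deg(35)    # normal module, 35 mm
tele = diagonal_fov_deg(24 * 3)  # long-focus module at 3x the wide-angle
```

The numbers confirm the intuition in the text: the wide-angle module covers a large-perspective scene (roughly 84 degrees diagonally at 24 mm), while the long-focus module concentrates its pixels on a small-perspective scene around the target.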
  • Step S13: Perform image processing on the first image and the second image, and generate an output image.
  • In an optional implementation, a combination manner for the first image and the second image may be image compositing. Specifically, as shown in FIG. 4, an area 211 in the first image 210 that includes the target object is replaced with the second image 220, and as shown in FIG. 5, the first image 210 including the second image 220 is determined as the output image. That is, the corresponding area in the large-perspective image is replaced with the clear small-perspective image that is photographed, so that a clear image of the target object can be seen in the large-perspective image.
  • In an optional implementation, a combination manner for the first image and the second image may alternatively be image overlaying. Specifically, as shown in FIG. 6, the second image 230 is overlaid on the first image 240, and the first image 240 and the second image 230 that are overlaid are determined as the output image. The second image 230 may be overlaid at positions such as a top right corner, a top left corner, a bottom right corner, or a bottom left corner of the first image 240, and the second image 230 may be zoomed in or zoomed out according to a user operation. For example, the second image 230 is zoomed in through a double-touch operation, so that the second image 230 is displayed in full screen, and the first image 240 is zoomed out and may be moved to the position at which the second image 230 was originally displayed.
  • As shown in FIG. 9, an embodiment of the present application provides an image processing apparatus 40, which may be configured to perform a method for processing an image disclosed in the embodiments of the present application. The image is photographed by a multi-camera photographing apparatus, the multi-camera photographing apparatus including a first lens module and a second lens module. The multi-camera photographing apparatus may be mounted in the image processing apparatus 40, or may be mounted in another device and transmit, to the image processing apparatus 40, an image photographed by the device. The apparatus 40 may include a first obtaining module 41, a second obtaining module 45, and an image generation module 47.
  • The first obtaining module 41 is configured to obtain a first image photographed by the first lens module, the first image including a target object.
  • In this embodiment of the present application, the target object may include, but is not limited to, objects such as a person, an animal, a plant, an automobile and a building. The first image may be a single frame of image that is captured by the first lens module. The apparatus may further include a feature parameter obtaining module, configured to obtain a feature parameter of the target object before the first obtaining module 41 obtains the first image photographed by the first lens module, where the feature parameter is used for tracing the target object and may be used for representing an overall contour of the target object. For example, the multi-camera photographing apparatus is mounted on an aerial vehicle. The aerial vehicle can target an automobile by obtaining a feature parameter such as a license number and/or an automobile body contour and/or an automobile color of the automobile, and further trace the automobile.
  • The second obtaining module 45 is configured to obtain a second image photographed by the second lens module, the second image including the target object, and a quantity of pixels of the target object included in the second image being greater than a quantity of pixels of the target object in the first image.
  • In this embodiment of the present application, after an optical axis of the second lens module faces the target object, the second lens module photographs the second image, the second image including the target object.
  • In an optional implementation, a focal length of the first lens module and a focal length of the second lens module are both fixed focal lengths. Preferably, the focal length of the first lens module is less than the focal length of the second lens module. The first lens module may include a normal lens module and/or a wide-angle lens module and the second lens module may include a long-focus lens module. The first image photographed by the first lens module belongs to a large-perspective scene and the second image photographed by the second lens module belongs to a small-perspective scene.
  • When the first lens module includes the normal lens module and the wide-angle lens module, the normal lens module, the wide-angle lens module and the long-focus lens module may be arranged in a triangle or may be linearly arranged. Optionally, the normal lens module may have an equivalent focal length of 35 mm, the wide-angle lens module may have an equivalent focal length of 18 mm to 24 mm, and an equivalent focal length of the long-focus lens module may be 3 to 20 times the equivalent focal length of the wide-angle lens module.
  • The image generation module 47 is configured to determine an output image based on the first image and the second image.
  • In an optional implementation, a combination manner for the first image and the second image may be image compositing. Specifically, as shown in FIG. 4, an area 211 in the first image 210 that includes the target object is replaced with the second image 220, and as shown in FIG. 5, the first image 210 including the second image 220 is determined as the output image.
  • In an optional implementation, a combination manner for the first image and the second image may alternatively be image overlaying. Specifically, as shown in FIG. 6, the second image 230 is overlaid on the first image 240, and the first image 240 and the second image 230 that are overlaid are determined as the output image. The second image 230 may be overlaid at positions such as a top right corner, a top left corner, a bottom right corner, or a bottom left corner of the first image 240, and the second image 230 may be zoomed in or zoomed out according to a user operation. For example, the second image 230 is zoomed in through a double-touch operation, to cover the first image 240.
  • The first obtaining module 41 and the second obtaining module 45 may be flight-control chips or vision chips in the aerial vehicle. The image generation module 47 may be an independent image processing chip or an image processing unit integrated in a flight-control chip or a vision chip.
  • This embodiment of the present application provides an image processing apparatus. The first obtaining module obtains a first image photographed by the first lens module, the first image including a target object; the second obtaining module then obtains a second image photographed by the second lens module; and the image generation module generates an output image based on the first image and the second image, so that the output image combines both the wide coverage of a large-perspective scene and the precision of a target object in a small-perspective scene.
  • As shown in FIG. 10, an embodiment of the present application provides an image processing apparatus 50, which may be configured to perform a method for processing an image disclosed in the embodiments of the present application. The image is photographed by a multi-camera photographing apparatus, the multi-camera photographing apparatus including a first lens module and a second lens module. The multi-camera photographing apparatus may be mounted in the image processing apparatus 50, or may be mounted in another device and transmit, to the image processing apparatus 50, an image photographed by the device. The apparatus 50 may include a first obtaining module 51, a location obtaining module 52, an adjustment module 53, a second obtaining module 55, a correction module 56, an image generation module 57 and an analysis module 54.
  • The first obtaining module 51 is configured to obtain a first image photographed by the first lens module, the first image including a target object.
  • The location obtaining module 52 is configured to obtain location information of the target object.
  • The adjustment module 53 is configured to adjust an optical axis of the second lens module based on the location information of the target object, so that the optical axis of the second lens module faces the target object.
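The adjustment performed by a module like this can be illustrated with a simple pinhole-camera calculation. This is a hypothetical sketch, not the disclosed control logic: given the target's pixel location in the first (wide-angle) image, plus an assumed principal point and focal length in pixels, it returns the yaw and pitch by which a gimbal might rotate the second lens module's optical axis toward the target.

```python
import math

def aim_offsets(px, py, cx, cy, focal_px):
    """Yaw and pitch (radians) needed to point the second lens module's
    optical axis at a target seen at pixel (px, py) in the first image.

    (cx, cy) is the principal point of the first lens module and
    focal_px its focal length in pixels; pinhole model: angle =
    atan(pixel offset / focal length).
    """
    yaw = math.atan2(px - cx, focal_px)    # horizontal rotation
    pitch = math.atan2(py - cy, focal_px)  # vertical rotation
    return yaw, pitch
```

A target already at the image center yields zero corrections; a target one focal length right of center yields a 45-degree yaw.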
  • The second obtaining module 55 is configured to obtain a second image photographed by the second lens module, the second image including the target object, and a quantity of pixels of the target object included in the second image being greater than a quantity of pixels of the target object in the first image.
  • In this embodiment of the present application, for explanations and descriptions of the first obtaining module 51 and the second obtaining module 55, refer to the foregoing explanations and descriptions of the first obtaining module 41 and the second obtaining module 45, and details are not described herein again.
  • The correction module 56 is configured to correct optical axes of the first lens module and the second lens module or to correct the first image and the second image based on an optical-axis offset of the multi-camera photographing apparatus.
  • In this embodiment of the present application, the first image and the second image may be corrected based on the optical-axis offset of the multi-camera photographing apparatus. For example, distortion of the first image and the second image is corrected.
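A minimal illustration of offset-based correction follows, assuming the optical-axis offset has already been measured as a pixel translation (dx, dy). This is a sketch under that assumption only; real distortion correction would also involve lens intrinsics and a distortion model, which are outside this example.

```python
def shift_image(img, dx, dy, fill=0):
    """Compensate a known optical-axis offset by translating the image
    dx pixels right and dy pixels down.

    'img' is a nested list of pixel values; pixels shifted in from
    outside the frame take the value 'fill'.
    """
    H, W = len(img), len(img[0])
    out = [[fill] * W for _ in range(H)]
    for r in range(H):
        for c in range(W):
            rr, cc = r + dy, c + dx
            if 0 <= rr < H and 0 <= cc < W:  # drop pixels pushed off-frame
                out[rr][cc] = img[r][c]
    return out
```

Applying the inverse translation to one of the two images brings their optical centers into agreement before the images are combined.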
  • The analysis module 54 is configured to perform registration analysis on the first lens module and the second lens module and to generate a corresponding registration parameter.
  • The image generation module 57 is configured to replace an area in the first image that includes the target object with the second image based on the registration parameter, and to generate, as the output image, the first image including the second image.
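The replacement step can be sketched as follows, with the registration parameter reduced to a hypothetical (row, column, integer scale) triple that maps second-image pixels into first-image coordinates. An actual registration parameter would typically be a full homography estimated between the two lens modules; this is only an illustration of the idea.

```python
def paste_registered(first, second, reg):
    """Replace the region of 'first' that 'reg' maps 'second' onto.

    'reg' = (row0, col0, scale): the top-left anchor in the first image
    plus an integer down-scale factor, since the telephoto second image
    covers the target with more pixels than the wide-angle first image.
    """
    r0, c0, s = reg
    out = [row[:] for row in first]
    h, w = len(second), len(second[0])
    for r in range(0, h, s):          # sample every s-th telephoto pixel
        for c in range(0, w, s):
            out[r0 + r // s][c0 + c // s] = second[r][c]
    return out
```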
  • In this embodiment of the present application, for explanations and descriptions of the image generation module 57, refer to the foregoing explanations and descriptions of the case in which the image generation module 47 combines the first image and the second image by image overlaying.
  • In other embodiments of the present application, the image generation module 57 may further be configured to overlay the second image on the first image and to determine the first image and the second image that are overlaid as the output image.
  • This embodiment of the present application provides an image processing apparatus. The first obtaining module obtains a first image photographed by the first lens module, the first image including a target object. The location obtaining module obtains location information of the target object. The adjustment module adjusts an optical axis of the second lens module based on the location information of the target object, so that the optical axis of the second lens module faces the target object. The second obtaining module obtains a second image photographed by the second lens module. The correction module corrects the first image and the second image based on an optical-axis offset of the multi-camera photographing apparatus. The image generation module replaces an area in the corrected first image that includes the target object with the corrected second image, and determines the first image including the corrected second image as an output image. In this way, the coverage of a large-perspective scene and the precision of the target object in a small-perspective scene are both combined, to obtain an output image having a relatively desirable combination effect.
  • The first obtaining module 51, the location obtaining module 52, the correction module 56, the analysis module 54 and the second obtaining module 55 may be flight-control chips or vision chips in an aerial vehicle. The adjustment module 53 may be a flight-control chip or a control chip controlling rotation of a gimbal. The image generation module 57 may be an independent image processing chip or an image processing unit integrated in a flight-control chip or a vision chip.
  • FIG. 11 is a schematic structural diagram of hardware of an aerial vehicle 60 for performing a method for processing an image according to an embodiment of the present application. The aerial vehicle 60 may include: a body 61, a gimbal 62 disposed on the body 61 and a multi-camera photographing apparatus 63 disposed on the gimbal 62. The multi-camera photographing apparatus 63 includes a first lens module 631 and a second lens module 632. The aerial vehicle 60 may further include at least one processor 64 and a memory 65 in communication connection with the at least one processor 64. The memory 65 stores an instruction executable by the at least one processor 64, and the instruction is executed by the at least one processor 64 to enable the at least one processor 64 to perform the foregoing method for processing an image. One processor 64 is used as an example in FIG. 11. The processor 64 and the memory 65 may be connected by a bus or in other manners; a bus connection is used as an example in FIG. 11.
  • The memory 65 is used as a non-volatile computer-readable storage medium, and can be configured to store a non-volatile software program, a non-volatile computer-executable program and a module, for example, a program instruction/module (for example, the first obtaining module 41, the adjustment module 43, the second obtaining module 45 and the image generation module 47 that are shown in FIG. 9; and the first obtaining module 51, the adjustment module 53, the second obtaining module 55, the correction module 56 and the image generation module 57 that are shown in FIG. 10) correspondingly configured to perform the method for processing an image in the embodiments of the present application. The processor 64 runs the non-volatile software program, the instruction and the module that are stored in the memory 65, to perform various functional applications and data processing of the aerial vehicle 60, that is, to implement the method for processing an image in the foregoing method embodiments.
  • The memory 65 may include a program storage area and a data storage area. The program storage area may store an operating system and an application program required by at least one function. The data storage area may store data created according to use of the aerial vehicle 60 and the like. In addition, the memory 65 may include a high speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory or other non-volatile solid-state storage devices. In some embodiments, the memory 65 may optionally include a memory remotely arranged relative to the processor 64, and the remote memory may be connected to the aerial vehicle 60 through a network. Instances of the network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communications network and a combination thereof.
  • The one or more modules are stored in the memory 65, and when executed by the one or more processors 64, perform the method for processing an image in any of the foregoing method embodiments, for example, performing step S11, step S13, step S15 and step S17 of the method in FIG. 8, and implementing the functions of the first obtaining module 41, the adjustment module 43, the second obtaining module 45 and the image generation module 47 shown in FIG. 9, and of the first obtaining module 51, the adjustment module 53, the second obtaining module 55, the correction module 56 and the image generation module 57 shown in FIG. 10.
  • Finally, it should be noted that the foregoing embodiments are merely intended to describe the technical solutions of the present application rather than limit them. Within the concept of the present application, technical features of the foregoing embodiments or of different embodiments may be combined, steps may be performed in any sequence, and many other variations of the foregoing descriptions in different aspects of the present application exist; for brevity, these variations are not described in detail. Although the present application is described in detail with reference to the foregoing embodiments, persons of ordinary skill in the art should understand that they may still make modifications to the technical solutions described in the foregoing embodiments, or make equivalent replacements of some technical features thereof, without departing from the scope of the technical solutions of the embodiments of the present application.

Claims (16)

What is claimed is:
1. A method for processing an image, wherein the image is photographed by a multi-camera photographing apparatus, the multi-camera photographing apparatus comprising a first lens module and a second lens module, and the method comprises:
obtaining a first image photographed by the first lens module, the first image comprising a target object;
obtaining a second image photographed by the second lens module, the second image comprising the target object, and a quantity of pixels of the target object comprised in the second image being greater than a quantity of pixels of the target object in the first image; and
performing image processing on the first image and the second image, and generating an output image;
wherein the first lens module comprises a wide-angle lens module, the second lens module comprises a long-focus lens module, and the wide-angle lens module and the long-focus lens module are arranged in a line.
2. The method according to claim 1, wherein the method further comprises:
obtaining location information of the target object; and
adjusting an optical axis of the second lens module based on the location information of the target object, so that the optical axis of the second lens module faces the target object.
3. The method according to claim 1, wherein the performing image processing on the first image and the second image, and generating the output image comprises:
replacing an area in the first image that comprises the target object with the second image, and determining the first image obtained after the replacement as the output image.
4. The method according to claim 1, wherein the performing image processing on the first image and the second image, and generating the output image comprises:
overlaying the second image on the first image, and determining the first image and the second image that are overlaid as the output image.
5. The method according to claim 1, wherein the method further comprises:
performing optical axis correction on optical axes of the first lens module and the second lens module.
6. The method according to claim 1, wherein the method further comprises:
performing registration analysis on the first lens module and the second lens module, and generating a corresponding registration parameter.
7. The method according to claim 6, wherein the performing image processing on the first image and the second image, and generating the output image comprises:
performing image processing on the first image and the second image based on the registration parameter, and generating the output image.
8. An aerial vehicle, comprising a vehicle body, arms connected to the vehicle body, power apparatuses disposed on the arms, a multi-camera photographing apparatus connected to the vehicle body and an image processing chip, the multi-camera photographing apparatus comprising a first lens module and a second lens module, and the image processing chip being configured to:
obtain a first image photographed by the first lens module, the first image comprising a target object;
obtain a second image photographed by the second lens module, the second image comprising the target object, and a quantity of pixels of the target object comprised in the second image being greater than a quantity of pixels of the target object in the first image; and
perform image processing on the first image and the second image, and generate an output image;
wherein the first lens module comprises a wide-angle lens module, the second lens module comprises a long-focus lens module, and the wide-angle lens module and the long-focus lens module are arranged in a line.
9. The aerial vehicle according to claim 8, wherein the image processing chip is specifically configured to:
replace an area in the first image that comprises the target object with the second image, and determine the first image obtained after the replacement as the output image.
10. The aerial vehicle according to claim 8, wherein the image processing chip is specifically configured to:
overlay the second image on the first image, and determine the first image and the second image that are overlaid as the output image.
11. The aerial vehicle according to claim 8, wherein the aerial vehicle further comprises a controller and a gimbal that are connected to the vehicle body, the controller being configured to control the gimbal to perform optical axis correction on optical axes of the first lens module and the second lens module.
12. The aerial vehicle according to claim 11, wherein the controller is further configured to obtain location information of the target object and to adjust an optical axis of the second lens module based on the location information of the target object, so that the optical axis of the second lens module faces the target object.
13. The aerial vehicle according to claim 11, wherein the controller is further configured to perform registration analysis on the first lens module and the second lens module and to generate a corresponding registration parameter.
14. The aerial vehicle according to claim 8, wherein the aerial vehicle further comprises a controller, configured to perform registration analysis on the first lens module and the second lens module and to generate a corresponding registration parameter.
15. The aerial vehicle according to claim 13, wherein the image processing chip is further configured to perform image processing on the first image and the second image based on the registration parameter and to generate the output image.
16. The aerial vehicle according to claim 8, wherein the multi-camera photographing apparatus comprises a body, and the first lens module and the second lens module that are disposed on the body, the first lens module and the second lens module being arranged in an array on the body.
US17/122,672 2016-10-20 2020-12-15 Method for processing image, image processing apparatus, multi-camera photographing apparatus, and aerial vehicle Abandoned US20210105403A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/122,672 US20210105403A1 (en) 2016-10-20 2020-12-15 Method for processing image, image processing apparatus, multi-camera photographing apparatus, and aerial vehicle

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CN201610914681.1A CN106506941A (en) 2016-10-20 2016-10-20 The method and device of image procossing, aircraft
CN201610914681.1 2016-10-20
PCT/CN2017/106143 WO2018072657A1 (en) 2016-10-20 2017-10-13 Image processing method, image processing device, multi-camera photographing device, and aerial vehicle
US16/388,292 US10904430B2 (en) 2016-10-20 2019-04-18 Method for processing image, image processing apparatus, multi-camera photographing apparatus, and aerial vehicle
US17/122,672 US20210105403A1 (en) 2016-10-20 2020-12-15 Method for processing image, image processing apparatus, multi-camera photographing apparatus, and aerial vehicle

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/388,292 Continuation US10904430B2 (en) 2016-10-20 2019-04-18 Method for processing image, image processing apparatus, multi-camera photographing apparatus, and aerial vehicle

Publications (1)

Publication Number Publication Date
US20210105403A1 true US20210105403A1 (en) 2021-04-08

Family

ID=58319274

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/388,292 Active US10904430B2 (en) 2016-10-20 2019-04-18 Method for processing image, image processing apparatus, multi-camera photographing apparatus, and aerial vehicle
US17/122,672 Abandoned US20210105403A1 (en) 2016-10-20 2020-12-15 Method for processing image, image processing apparatus, multi-camera photographing apparatus, and aerial vehicle

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/388,292 Active US10904430B2 (en) 2016-10-20 2019-04-18 Method for processing image, image processing apparatus, multi-camera photographing apparatus, and aerial vehicle

Country Status (3)

Country Link
US (2) US10904430B2 (en)
CN (1) CN106506941A (en)
WO (1) WO2018072657A1 (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106506941A (en) 2016-10-20 2017-03-15 深圳市道通智能航空技术有限公司 The method and device of image procossing, aircraft
CN106791377B (en) * 2016-11-29 2019-09-27 Oppo广东移动通信有限公司 Control method, control device and electronic device
CN109215055A (en) 2017-06-30 2019-01-15 杭州海康威视数字技术股份有限公司 A kind of target's feature-extraction method, apparatus and application system
CN109218572A (en) * 2017-07-05 2019-01-15 北京臻迪科技股份有限公司 A kind of image-pickup method, system and unmanned plane
CN107577246B (en) * 2017-09-29 2020-08-07 深圳市富斯科技有限公司 Image shooting method and system, electronic holder and aircraft
CN108622428B (en) * 2018-05-08 2021-08-03 重庆邮电大学 Multi-camera unmanned aerial vehicle
CN108377342B (en) * 2018-05-22 2021-04-20 Oppo广东移动通信有限公司 Double-camera shooting method and device, storage medium and terminal
CN108900763B (en) * 2018-05-30 2022-03-22 Oppo(重庆)智能科技有限公司 Shooting device, electronic equipment and image acquisition method
US11425307B2 (en) 2018-07-19 2022-08-23 Hangzhou Hikvision Digital Technology Co., Ltd. Image capture device in which the focal length of the image capture device can be expanded without increasing the size of lenses
CN111917941B (en) * 2019-05-08 2021-10-01 杭州海康威视数字技术股份有限公司 Camera picture processing method and camera
CN111614951B (en) * 2019-02-25 2022-05-13 浙江宇视科技有限公司 Optical axis calibration equipment and method for integrated pan-tilt camera
CN109905604B (en) * 2019-03-29 2021-09-21 深圳市道通智能航空技术股份有限公司 Focusing method and device, shooting equipment and aircraft
CN111345029B (en) 2019-05-30 2022-07-08 深圳市大疆创新科技有限公司 Target tracking method and device, movable platform and storage medium
CN110493514B (en) * 2019-08-07 2021-07-16 Oppo广东移动通信有限公司 Image processing method, storage medium, and electronic device
CN110519540A (en) * 2019-08-29 2019-11-29 深圳市道通智能航空技术有限公司 A kind of image processing method, device, equipment and storage medium
CN114143472A (en) * 2019-09-02 2022-03-04 深圳市道通智能航空技术股份有限公司 Image exposure method and device, shooting equipment and unmanned aerial vehicle
CN112702564B (en) * 2019-10-23 2023-04-18 成都鼎桥通信技术有限公司 Image monitoring method and device
JP6880380B2 (en) * 2019-11-01 2021-06-02 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Image processing equipment, image processing methods, and programs
CN111818304B (en) * 2020-07-08 2023-04-07 杭州萤石软件有限公司 Image fusion method and device
CN112173149A (en) * 2020-10-30 2021-01-05 南方电网数字电网研究院有限公司 Stability augmentation cradle head with edge computing capability, unmanned aerial vehicle and target identification method
JP2022172978A (en) * 2021-05-07 2022-11-17 シャープ株式会社 Imaging device

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6738073B2 (en) * 1999-05-12 2004-05-18 Imove, Inc. Camera system with both a wide angle view and a high resolution view
CN1587910A (en) * 2004-09-21 2005-03-02 中国科学院上海技术物理研究所 Manless machine carried light small multiple spectrum imaging instrument for monitoring
US20140240469A1 (en) 2013-02-28 2014-08-28 Motorola Mobility Llc Electronic Device with Multiview Image Capture and Depth Sensing
US9538096B2 (en) * 2014-01-27 2017-01-03 Raytheon Company Imaging system and methods with variable lateral magnification
CN104301669A (en) * 2014-09-12 2015-01-21 重庆大学 Suspicious target detection tracking and recognition method based on dual-camera cooperation
CN104867155A (en) * 2015-06-02 2015-08-26 阔地教育科技有限公司 Target tracking method and device on the basis of joint calibration
US9952485B1 (en) * 2015-06-10 2018-04-24 Innotech Security, Inc. Video surveillance camera having a separable and removable gimbal
CN105407283B (en) * 2015-11-20 2018-12-18 成都因纳伟盛科技股份有限公司 A kind of multiple target initiative recognition tracing and monitoring method
CN105893510A (en) * 2016-03-30 2016-08-24 北京格灵深瞳信息技术有限公司 Video structurization system and target search method thereof
CN105890590A (en) * 2016-04-12 2016-08-24 西北工业大学 UAV (unmanned aerial vehicle) remote optical landing guidance system based on infrared laser lamps and multi-camera array
US10074183B1 (en) * 2016-06-03 2018-09-11 Amazon Technologies, Inc. Image alignment correction for imaging processing during operation of an unmanned aerial vehicle
CN107770433B (en) * 2016-08-15 2020-08-04 广州立景创新科技有限公司 Image acquisition device and image smooth scaling method thereof
CN106506941A (en) * 2016-10-20 2017-03-15 深圳市道通智能航空技术有限公司 The method and device of image procossing, aircraft

Also Published As

Publication number Publication date
CN106506941A (en) 2017-03-15
WO2018072657A1 (en) 2018-04-26
US20190253620A1 (en) 2019-08-15
US10904430B2 (en) 2021-01-26


Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

AS Assignment

Owner name: AUTEL ROBOTICS CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LIU, YANG;REEL/FRAME:057641/0263

Effective date: 20160103

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION