CN112087571A - Image acquisition method and device, electronic equipment and computer readable storage medium - Google Patents


Info

Publication number
CN112087571A
CN112087571A (application number CN201910516211.3A)
Authority
CN
China
Prior art keywords
wide-angle
image
camera
focus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910516211.3A
Other languages
Chinese (zh)
Inventor
Chen Wei (陈伟)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910516211.3A priority Critical patent/CN112087571A/en
Publication of CN112087571A publication Critical patent/CN112087571A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026 Details of the structure or mounting of specific components
    • H04M1/0264 Details of the structure or mounting of specific components for a camera module assembly
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265 Mixing

Abstract

The application relates to an image acquisition method and apparatus, an electronic device, and a computer-readable storage medium. The method is applied to an electronic device that includes a wide-angle camera and at least two telephoto cameras, where an overlapping field of view region exists between each telephoto camera and the wide-angle camera. The method includes: controlling the wide-angle camera to capture a wide-angle image; controlling each telephoto camera to capture one telephoto image at a preset focus, the preset focus being located at infinity, so as to obtain at least two telephoto images; and fusing the wide-angle image with the at least two telephoto images to obtain a target image. The image acquisition method keeps the sharpness of the fused target image consistent and can improve the quality of the target image.

Description

Image acquisition method and device, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of imaging technology, and in particular, to an image acquisition method and apparatus, an electronic device, and a computer-readable storage medium.
Background
With the development of imaging technology, expectations for the image acquisition capability of electronic devices keep rising. To improve imaging quality, more and more device manufacturers equip electronic devices with multiple cameras, so that the device can capture images with several cameras and then stitch and combine them.
Disclosure of Invention
Embodiments of the present application provide an image acquisition method and apparatus, an electronic device, and a computer-readable storage medium, which can improve image quality.
An image acquisition method is applied to an electronic device, where the electronic device includes a wide-angle camera and at least two telephoto cameras, and an overlapping field of view region exists between each telephoto camera and the wide-angle camera; the method includes:
controlling the wide-angle camera to capture a wide-angle image;
controlling each telephoto camera to capture one telephoto image at a preset focus, so as to obtain at least two telephoto images, where the preset focus is located at infinity; and
fusing the wide-angle image with the at least two telephoto images to obtain a target image.
An image acquisition apparatus includes:
a first acquisition module, configured to control a wide-angle camera to capture a wide-angle image;
a second acquisition module, configured to control each of at least two telephoto cameras to capture one telephoto image at a preset focus, so as to obtain at least two telephoto images, where the preset focus is located at infinity and an overlapping field of view region exists between each telephoto camera and the wide-angle camera; and
a processing module, configured to fuse the wide-angle image with the at least two telephoto images to obtain a target image.
An electronic device includes a wide-angle camera, at least two telephoto cameras, a memory, and a processor, where an overlapping field of view region exists between each telephoto camera and the wide-angle camera; the memory stores a computer program that, when executed by the processor, causes the processor to perform the steps of:
controlling the wide-angle camera to capture a wide-angle image;
controlling each telephoto camera to capture one telephoto image at a preset focus, so as to obtain at least two telephoto images, where the preset focus is located at infinity; and
fusing the wide-angle image with the at least two telephoto images to obtain a target image.
A computer-readable storage medium stores a computer program that, when executed by a processor, implements the steps of:
controlling the wide-angle camera to capture a wide-angle image;
controlling each telephoto camera to capture one telephoto image at a preset focus, so as to obtain at least two telephoto images, where the preset focus is located at infinity; and
fusing the wide-angle image with the at least two telephoto images to obtain a target image.
With the above image acquisition method and apparatus, electronic device, and computer-readable storage medium, the wide-angle image used for fusion into the target image is captured by the wide-angle camera, and each telephoto image is captured by a telephoto camera whose preset focus is at infinity. The subjects in the at least two telephoto images therefore keep a consistent sharpness, which avoids the problem that differing focus positions among cameras make the fused image unevenly sharp and degrade its effect and quality, so the quality of the target image can be improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a diagram of an exemplary environment in which an image capture method may be implemented;
FIG. 2 is a schematic diagram of an image processing circuit in one embodiment;
FIG. 3 is a flow diagram of a method of image acquisition in one embodiment;
FIG. 4 is a schematic diagram of a wide-angle image and 4 telephoto images in one embodiment;
FIG. 5 is a flow chart of a method of image acquisition in another embodiment;
FIG. 6 is a schematic diagram of 2 wide-angle images and 4 telephoto images in one embodiment;
FIG. 7 is a block diagram of an image acquisition apparatus in one embodiment;
FIG. 8 is a block diagram of an electronic device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that the terms "first," "second," and the like used herein may describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first processor may be referred to as a second processor, and similarly, a second processor may be referred to as a first processor, without departing from the scope of the present application. The first processor and the second processor are both processors, but they are not the same processor.
Fig. 1 is a schematic diagram of an application environment of an image acquisition method in an embodiment. As shown in FIG. 1, the application environment includes an electronic device 100. The electronic device 100 includes a wide-angle camera 210 and at least two telephoto cameras 220; specifically, each telephoto camera 220 is a fixed-focus camera whose focus is at infinity. For image acquisition, the wide-angle camera 210 and the at least two telephoto cameras 220 are arranged on the same side of the electronic device 100 in a certain structure, so that an overlapping field of view region exists between each telephoto camera 220 and the wide-angle camera 210. The electronic device 100 may control the wide-angle camera 210 to capture a wide-angle image, control each telephoto camera 220 to capture one telephoto image at the preset focus to obtain at least two telephoto images, and fuse the wide-angle image with the at least two telephoto images to obtain a target image. The electronic device 100 may be, but is not limited to, a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like.
The electronic device 100 includes therein an Image Processing circuit, which may be implemented by hardware and/or software components, and may include various Processing units defining an ISP (Image Signal Processing) pipeline. FIG. 2 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 2, for convenience of explanation, only aspects of the image processing technology related to the embodiments of the present application are shown.
As shown in fig. 2, the image processing circuit includes a first ISP processor 230, a second ISP processor 240 and control logic 250. Wide angle camera 210 includes one or more first lenses 212 and a first image sensor 214. The first image sensor 214 may include a color filter array (e.g., a Bayer filter), and the first image sensor 214 may acquire light intensity and wavelength information captured with each imaging pixel of the first image sensor 214 and provide a set of image data that may be processed by the first ISP processor 230. Tele camera 220 includes one or more second lenses 222 and a second image sensor 224. The second image sensor 224 may include a color filter array (e.g., a Bayer filter), and the second image sensor 224 may acquire light intensity and wavelength information captured with each imaging pixel of the second image sensor 224 and provide a set of image data that may be processed by the second ISP processor 240.
The wide-angle image captured by the wide-angle camera 210 is transmitted to the first ISP processor 230 for processing. After the first ISP processor 230 processes the wide-angle image, statistical data of the wide-angle image (such as image brightness, image contrast, and image color) may be sent to the control logic 250, and the control logic 250 may determine control parameters of the wide-angle camera 210 from the statistical data, so that the wide-angle camera 210 can perform operations such as auto-focus and auto-exposure according to the control parameters. The wide-angle image may be stored in the image memory 260 after being processed by the first ISP processor 230, and the first ISP processor 230 may also read the image stored in the image memory 260 for processing. In addition, the wide-angle image may be transmitted directly to the display 270 for display after being processed by the first ISP processor 230, or the display 270 may read and display the image in the image memory 260.
The first ISP processor 230 processes the image data pixel by pixel in a plurality of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the first ISP processor 230 may perform one or more image processing operations on the image data and collect statistical information about the image data. The image processing operations may be performed with the same or different bit-depth precision.
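For illustration only, the following Python sketch (assuming NumPy is available) shows one way raw pixel data captured at different bit depths could be normalized to a common range before further processing; the array shapes and the 10-bit example are assumptions, not details from this application:

```python
import numpy as np

def normalize_raw(raw: np.ndarray, bit_depth: int) -> np.ndarray:
    """Scale raw sensor samples (e.g. 8/10/12/14-bit) into [0.0, 1.0] floats."""
    max_value = (1 << bit_depth) - 1          # e.g. 1023 for 10-bit data
    return raw.astype(np.float32) / max_value

# Example: a synthetic 10-bit Bayer frame
bayer_10bit = np.random.randint(0, 1 << 10, size=(480, 640), dtype=np.uint16)
normalized = normalize_raw(bayer_10bit, bit_depth=10)
print(normalized.min(), normalized.max())
```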
The image memory 260 may be a portion of a memory device, a storage device, or a separate dedicated memory within the electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving image data from the interface of the first image sensor 214, the first ISP processor 230 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to the image memory 260 for additional processing before being displayed. The first ISP processor 230 receives the processed data from the image memory 260 and processes the image data in the RGB and YCbCr color spaces. The image data processed by the first ISP processor 230 may be output to the display 270 for viewing by a user and/or further processed by a graphics processing unit (GPU). Further, the output of the first ISP processor 230 may also be transmitted to the image memory 260, and the display 270 may read image data from the image memory 260. In one embodiment, the image memory 260 may be configured to implement one or more frame buffers.
The statistical data determined by the first ISP processor 230 may be sent to the control logic 250. For example, the statistical data may include first image sensor 214 statistics such as auto-exposure, auto-white-balance, auto-focus, flicker detection, black level compensation, and first lens 212 shading correction. The control logic 250 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that determine control parameters of the wide-angle camera 210 and control parameters of the first ISP processor 230 based on the received statistical data. For example, the control parameters of the wide-angle camera 210 may include gain, exposure integration time, anti-shake parameters, flash control parameters, first lens 212 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as first lens 212 shading correction parameters.
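As a hedged illustration of how control logic might turn frame statistics into control parameters, the sketch below derives a simple auto-exposure gain and gray-world white-balance gains; the target brightness, clamping limits, and choice of statistics are assumptions, not the algorithm used by the ISP described here:

```python
import numpy as np

def compute_control_params(frame: np.ndarray, target_luma: float = 0.5):
    """Derive simple auto-exposure and auto-white-balance gains from RGB statistics."""
    luma = frame[..., :3].mean()                          # mean brightness statistic
    exposure_gain = np.clip(target_luma / max(luma, 1e-6), 0.25, 8.0)

    channel_means = frame.reshape(-1, 3).mean(axis=0)     # per-channel means for AWB
    gray = channel_means.mean()
    awb_gains = gray / np.maximum(channel_means, 1e-6)    # gray-world white balance

    return exposure_gain, awb_gains

frame = np.random.rand(480, 640, 3).astype(np.float32) * 0.3
gain, awb = compute_control_params(frame)
print(gain, awb)
```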
Similarly, the telephoto image captured by the telephoto camera 220 is transmitted to the second ISP processor 240 for processing. After the second ISP processor 240 processes the telephoto image, statistical data of the telephoto image (such as image brightness, image contrast, and image color) may be sent to the control logic 250, and the control logic 250 may determine control parameters of the telephoto camera 220 from the statistical data, so that the telephoto camera 220 can perform operations such as auto-focus and auto-exposure according to the control parameters. The telephoto image may be stored in the image memory 260 after being processed by the second ISP processor 240, and the second ISP processor 240 may also read the image stored in the image memory 260 for processing. The telephoto image may also be processed by the second ISP processor 240 and then transmitted directly to the display 270 for display, or the display 270 may read and display the image in the image memory 260. The telephoto camera 220 and the second ISP processor 240 may also implement the processing described for the wide-angle camera 210 and the first ISP processor 230.
In this embodiment, the field angle of the wide-angle camera 210 is larger than that of the telephoto camera 220, and the telephoto camera 220 is a fixed-focus camera whose preset focus is at infinity. For example, the field angle of the wide-angle camera 210 may be 80 degrees, 85 degrees, 90 degrees, 100 degrees, or the like; the field angle of the telephoto camera 220 may be 20 degrees, 25 degrees, 30 degrees, 40 degrees, or the like, which is not limited here. The electronic device 100 may contain at least one wide-angle camera 210 and at least two telephoto cameras 220. For example, the number of wide-angle cameras 210 may be 1, 2, etc., and the number of telephoto cameras 220 may be 2, 3, 4, etc.
Optionally, after the electronic device 100 controls the wide-angle camera 210 to capture a wide-angle image and controls each telephoto camera 220 to capture one telephoto image at the preset focus, the first ISP processor 230 may fuse the wide-angle image with the at least two telephoto images, or the second ISP processor 240 may do so; in some embodiments, the fusion may also be performed by a processor of the electronic device, which is not limited here.
The image acquisition method provided in the embodiments of the present application is described by taking its execution on the electronic device in FIG. 1 as an example, where the electronic device includes a wide-angle camera and at least two telephoto cameras, and an overlapping field of view region exists between each telephoto camera and the wide-angle camera. Specifically, the image acquisition method includes steps 302 to 306.
Step 302: control the wide-angle camera to capture a wide-angle image.
The electronic device controls the wide-angle camera to capture a wide-angle image. Specifically, the electronic device may control the wide-angle camera to capture the wide-angle image upon receiving an image acquisition instruction.
Step 304: control each telephoto camera to capture one telephoto image at a preset focus, so as to obtain at least two telephoto images, where the preset focus is located at infinity.
Specifically, the telephoto camera is a fixed-focus camera whose focus is located at infinity. Before the electronic device leaves the factory, the focus of the telephoto camera may be set to infinity in advance, that is, the focus position is fixed at infinity; the telephoto camera in the electronic device therefore has only this one focus position and no zoom function. The electronic device controls each telephoto camera to capture one telephoto image at the preset focus located at infinity, and the resulting at least two telephoto images have consistent imaging sharpness; that is, subjects at the same distance from the cameras appear equally sharp in the at least two telephoto images.
Each telephoto camera and the wide-angle camera have overlapping fields of view. A field of view region refers to the range of a scene that the camera can capture. For example, suppose a scene is 4 × 6 m and a target object in the scene is 2 × 3 m; if the wide-angle camera can capture a wide-angle image of the whole scene while one of the telephoto cameras can only capture a telephoto image of the target object, and the target object lies within the scene, then the wide-angle camera and that telephoto camera have an overlapping field of view region. In particular, because the field angle of the wide-angle camera is greater than that of the telephoto camera, the field of view of the wide-angle camera can encompass that of the telephoto camera.
Optionally, the electronic device may control the wide-angle camera to capture a wide-angle image when receiving a camera start instruction and display the wide-angle image as a preview image on its display screen; when the electronic device receives an image acquisition instruction, it controls the wide-angle camera to capture the wide-angle image and, at the same time, controls each telephoto camera to capture one telephoto image at the preset focus located at infinity, obtaining at least two telephoto images.
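A hedged sketch of the capture flow just described: preview with the wide-angle camera, then capture one wide-angle frame and one infinity-focused frame per telephoto camera when an image acquisition instruction arrives. The Camera class and its capture() method are hypothetical placeholders, not an API from this application:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Camera:
    """Hypothetical camera handle; capture() returns an image placeholder."""
    name: str

    def capture(self, focus: str = "auto") -> str:
        return f"{self.name}_frame(focus={focus})"

def acquire_images(wide_cam: Camera, tele_cams: List[Camera]):
    """On an acquisition instruction: one wide-angle frame plus one telephoto
    frame per telephoto camera, each telephoto camera focused at infinity."""
    wide_image = wide_cam.capture(focus="auto")
    tele_images = [cam.capture(focus="infinity") for cam in tele_cams]
    return wide_image, tele_images

wide = Camera("wide")
teles = [Camera(f"tele{i}") for i in range(4)]
print(acquire_images(wide, teles))
```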
Step 306: fuse the wide-angle image with the at least two telephoto images to obtain a target image.
Fusion processing is an operation that generates a final image from a plurality of images according to certain rules. Specifically, the electronic device may fuse the wide-angle image with the at least two telephoto images using one or more fusion methods such as linear weighting, nonlinear weighting, principal component analysis, pyramid transform, or wavelet transform, to obtain the target image. Optionally, after the electronic device obtains the target image through fusion, the target image may be displayed on the display screen of the electronic device.
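As one concrete example of the linear-weighting fusion mentioned above, the NumPy sketch below blends a telephoto patch into the region of the wide-angle image it overlaps, using a fixed weight; the 0.7 weight, image sizes, and perfect pre-alignment are simplifying assumptions (a real pipeline would register the images first):

```python
import numpy as np

def linear_weighted_fusion(base: np.ndarray, detail: np.ndarray,
                           top: int, left: int, w_detail: float = 0.7) -> np.ndarray:
    """Blend a sharper detail patch into the base image at (top, left)."""
    fused = base.astype(np.float32)
    h, w = detail.shape[:2]
    region = fused[top:top + h, left:left + w]
    fused[top:top + h, left:left + w] = (1.0 - w_detail) * region + w_detail * detail
    return np.clip(fused, 0, 255).astype(np.uint8)

wide = np.full((800, 1200, 3), 120, dtype=np.uint8)    # stand-in wide-angle image
tele = np.full((400, 600, 3), 200, dtype=np.uint8)     # stand-in telephoto image
target = linear_weighted_fusion(wide, tele, top=0, left=0)
print(target.shape, target[0, 0])                      # (800, 1200, 3) [176 176 176]
```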
In the embodiments provided in the present application, the wide-angle camera is controlled to capture a wide-angle image, each telephoto camera is controlled to capture one telephoto image at a preset focus located at infinity, and the resulting at least two telephoto images are fused with the wide-angle image to obtain the target image. Because the wide-angle image used for fusion is captured by the wide-angle camera and each telephoto image is captured by a telephoto camera whose preset focus is at infinity, the subjects in the at least two telephoto images keep a consistent sharpness. This avoids the problem that differing focus positions among cameras make the fused image unevenly sharp and degrade its effect and quality, so the quality of the target image can be improved.
In one embodiment, there is no overlapping field of view area between at least two tele cameras.
Each telephoto camera may be configured to capture a different field of view region, so that the field of view region of each telephoto camera corresponds to a partial field of view region of the wide-angle camera. For example, when the electronic device includes two telephoto cameras, their field of view regions may correspond to the upper and lower field of view regions of the wide-angle camera, respectively, or to its left and right field of view regions; when the field of view region of the wide-angle camera includes three areas, namely a building area, a tree area, and a road area, one telephoto camera can capture a telephoto image of the building area, another can capture a telephoto image of the tree area, and the other can capture a telephoto image of the road area.
During assembly and configuration of the electronic device, the field of view of each telephoto camera may be adjusted so that no overlapping field of view exists between the telephoto cameras. When the at least two telephoto cameras have no overlapping field of view regions, the at least two telephoto images obtained by the electronic device contain richer image detail. Taking a 16-megapixel telephoto camera as an example, if the electronic device includes 4 telephoto cameras and the 4 telephoto images they capture have no overlapping regions, the image obtained by fusing the 4 telephoto images is 64 megapixels; if overlapping field of view regions exist between the telephoto cameras, overlapping regions exist between the captured telephoto images, and the fused image has fewer than 64 megapixels. With no overlapping field of view region between the at least two telephoto cameras, the pixel count of the fused image can be increased.
In one embodiment, the electronic device includes 4 telephoto cameras, and the field of view region of each of the 4 telephoto cameras extends from one corner of the field of view region of the wide-angle camera toward the middle of that field of view region.
As shown in FIG. 4, taking an electronic device that includes 1 wide-angle camera and 4 telephoto cameras as an example, the electronic device may capture one wide-angle image 402 through the wide-angle camera and capture images through the 4 telephoto cameras whose preset focus is at infinity, obtaining 4 corresponding telephoto images 404, 406, 408, and 410 with consistent imaging sharpness. Specifically, the field of view region of each of the 4 telephoto images 404, 406, 408, and 410 extends from one corner of the field of view region of the wide-angle image 402 toward its middle; as shown in the figure, the field of view regions of the telephoto images 404, 406, 408, and 410 lie in the upper-left, upper-right, lower-left, and lower-right areas of the field of view region of the wide-angle image 402, respectively. The overlap region 412 is the overlapping field of view region among the 4 telephoto cameras. Optionally, in some embodiments, there may also be no such overlapping field of view region among the 4 telephoto cameras. The field of view region of the wide-angle camera approximately equals the total field of view region of the 4 telephoto cameras, and the electronic device can fuse the wide-angle image 402 with the 4 telephoto images 404, 406, 408, and 410, using the wide-angle image 402 as a reference, to obtain a target image with a large captured scene, consistent sharpness, and good quality.
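The quadrant layout of FIG. 4 can be sketched as follows: the wide-angle image is naively upscaled to the combined telephoto resolution and each telephoto frame is pasted over the corner it covers. Exact 2x scaling and perfect alignment are assumptions made for brevity; the actual fusion would register and blend the images rather than simply paste them:

```python
import numpy as np

def compose_quadrants(wide: np.ndarray, tele_frames: list) -> np.ndarray:
    """Paste four telephoto frames (TL, TR, BL, BR) onto an upscaled wide image."""
    th, tw = tele_frames[0].shape[:2]
    canvas = np.kron(wide, np.ones((2, 2, 1), dtype=wide.dtype))  # naive 2x upscale
    canvas = canvas[:2 * th, :2 * tw]
    offsets = [(0, 0), (0, tw), (th, 0), (th, tw)]                # quadrant origins
    for frame, (top, left) in zip(tele_frames, offsets):
        canvas[top:top + th, left:left + tw] = frame
    return canvas

wide = np.zeros((300, 400, 3), dtype=np.uint8)
teles = [np.full((300, 400, 3), 50 * (i + 1), dtype=np.uint8) for i in range(4)]
print(compose_quadrants(wide, teles).shape)   # (600, 800, 3)
```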
In one embodiment, the electronic device includes at least two wide-angle cameras, and the image acquisition method provided may include the following steps:
Step 502: control each wide-angle camera to capture one wide-angle image, so as to obtain at least two wide-angle images.
The at least two wide-angle cameras included in the electronic device have the same field angle. For example, the field angle of a wide-angle camera may be 75 degrees, 80 degrees, 85 degrees, 90 degrees, 95 degrees, or the like, which is not limited here. Optionally, in some embodiments, the field angles of the at least two wide-angle cameras may also differ.
At least two of the wide-angle cameras have overlapping field of view regions. Specifically, the field of view regions of the wide-angle cameras can be set according to actual needs, which is not limited here. For example, when two wide-angle cameras are included, they may have both an overlapping field of view region and non-overlapping field of view regions, which enlarges the captured scene; when three wide-angle cameras are included, they may be arranged in sequence so that adjacent wide-angle cameras have overlapping field of view regions while non-adjacent ones do not, or every two of the three wide-angle cameras may have an overlapping field of view region, which is not limited here.
The electronic device can control each of the at least two wide-angle cameras to capture one wide-angle image, and the resulting at least two wide-angle images contain both common subjects and different subjects.
Step 504: control each telephoto camera to capture one telephoto image at a preset focus, so as to obtain at least two telephoto images, where the preset focus is located at infinity.
Step 506: fuse the at least two wide-angle images according to the overlapping regions in the at least two wide-angle images to obtain a reference image.
The electronic device may fuse the at least two wide-angle images according to the regions where they overlap. Compared with each individual wide-angle image, the reference image obtained by fusion has a larger field of view region and contains more subjects.
Step 508: fuse the reference image with the at least two telephoto images to obtain a target image.
Specifically, each telephoto image has a region that overlaps the reference image, and the electronic device may fuse the reference image with the at least two telephoto images according to these overlapping regions. Based on the imaging principle of cameras, the sharpness and imaging quality of the central area of a captured image are usually higher than those of the edge area; in the process of fusing the reference image with the at least two telephoto images, the electronic device may therefore take the image content of the central area of the reference image and the image content of the at least two telephoto images to synthesize the target image.
Optionally, in some embodiments, overlapping regions exist between the at least two telephoto images, and the electronic device may also first stitch the at least two telephoto images and then fuse the stitched image with the reference image to obtain the target image. Specifically, the electronic device may stitch the multiple telephoto images in sequence in a predetermined direction according to the reference image.
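One hedged way to stitch the telephoto frames "in a predetermined direction according to the reference image" is to locate each frame in the reference by template matching on a downscaled copy and paste it at the scaled-up position. OpenCV availability, the 0.5 scale factor, and the absence of blending at the seams are assumptions made for this sketch:

```python
import cv2
import numpy as np

def stitch_by_reference(reference: np.ndarray, tele_frames: list, scale: float = 0.5):
    """Place each telephoto frame on a canvas at the location found by matching
    its downscaled copy against the lower-resolution reference image."""
    canvas = cv2.resize(reference, None, fx=1.0 / scale, fy=1.0 / scale)
    for frame in tele_frames:
        small = cv2.resize(frame, None, fx=scale, fy=scale)
        score = cv2.matchTemplate(reference, small, cv2.TM_CCOEFF_NORMED)
        _, _, _, (x, y) = cv2.minMaxLoc(score)            # best-match top-left corner
        h, w = frame.shape[:2]
        top = min(int(y / scale), canvas.shape[0] - h)    # clamp to stay inside canvas
        left = min(int(x / scale), canvas.shape[1] - w)
        canvas[top:top + h, left:left + w] = frame
    return canvas

ref = np.random.randint(0, 255, (400, 600, 3), dtype=np.uint8)
tiles = [cv2.resize(ref[:, :300], None, fx=2, fy=2),      # pretend left/right telephoto
         cv2.resize(ref[:, 300:], None, fx=2, fy=2)]      # frames at twice the detail
print(stitch_by_reference(ref, tiles, scale=0.5).shape)   # (800, 1200, 3)
```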
By controlling each wide-angle camera to capture one wide-angle image, fusing the resulting at least two wide-angle images into a reference image, controlling each telephoto camera to capture one telephoto image at the preset focus, and fusing the resulting at least two telephoto images with the reference image to obtain the target image, the electronic device can enlarge the field of view region of the target image while ensuring its sharpness.
FIG. 6 is a schematic diagram of 2 wide-angle images and 4 telephoto images captured by the electronic device in one embodiment. As shown in FIG. 6, the electronic device is provided with 2 wide-angle cameras and 4 telephoto cameras whose focus is at infinity. The electronic device can control the two wide-angle cameras each to capture one wide-angle image, obtaining the wide-angle image 602 and the wide-angle image 604. The electronic device can fuse the wide-angle image 602 and the wide-angle image 604 to obtain a reference image 606. The reference image 606 includes the field of view regions of the wide-angle images 602 and 604; that is, the field of view region of the reference image 606 is larger than that of either wide-angle image. The electronic device captures images through the 4 telephoto cameras at the preset focus, obtaining 4 corresponding telephoto images 608, 610, 612, and 614 with consistent imaging sharpness. There may be no overlapping field of view regions among the 4 telephoto images 608, 610, 612, and 614. The electronic device can fuse the 4 telephoto images 608, 610, 612, and 614 on the basis of the reference image 606 to obtain a target image with a larger field of view and better sharpness.
In one embodiment, before fusing the at least two wide-angle images according to their overlapping regions to obtain the reference image, the image acquisition method further includes: obtaining calibration parameters between the at least two wide-angle cameras; and correcting the at least two wide-angle images according to the calibration parameters.
The calibration parameters are obtained by calibrating the at least two wide-angle cameras before the electronic device leaves the factory. Calibration refers to solving the parameters of the camera's geometric imaging model, through which a captured image can be related back to objects in space. For example, when the electronic device includes two wide-angle cameras, such as wide-angle camera A and wide-angle camera B, the calibration parameters obtained by the electronic device are the binocular calibration parameters between wide-angle camera A and wide-angle camera B. The binocular calibration parameters may include a rotation matrix and a translation matrix. The electronic device corrects the at least two wide-angle images according to the calibration parameters so that the positions of the same feature points correspond across the processed wide-angle images.
Optionally, in some embodiments, the calibration parameters may also include a monocular calibration parameter for each wide-angle camera. The electronic device may first process the wide-angle image captured by each wide-angle camera according to that camera's monocular calibration parameter and then correct it according to the binocular calibration parameters.
By obtaining the calibration parameters between the at least two wide-angle cameras and correcting the at least two wide-angle images according to them, the positions of the same feature in the processed wide-angle images correspond to each other, so that when the at least two wide-angle images are combined, the same feature points can be located accurately in the different wide-angle images, which improves the accuracy of the combination.
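A hedged OpenCV sketch of this correction step for two wide-angle cameras: given binocular calibration parameters (a rotation matrix R and a translation vector T) plus per-camera intrinsics, the two frames are rectified so that the same feature points fall on corresponding rows. Every numeric value below is a made-up placeholder, not calibration data from this application:

```python
import cv2
import numpy as np

h, w = 720, 1280
image_size = (w, h)

# Placeholder intrinsics/distortion for wide-angle cameras A and B (assumed values)
K_a = np.array([[900.0, 0, w / 2], [0, 900.0, h / 2], [0, 0, 1]])
K_b = K_a.copy()
dist_a = np.zeros(5)
dist_b = np.zeros(5)

# Binocular calibration parameters: rotation matrix and translation vector
R = np.eye(3)
T = np.array([[-30.0], [0.0], [0.0]])    # e.g. a 30 mm horizontal baseline

R1, R2, P1, P2, Q, roi1, roi2 = cv2.stereoRectify(K_a, dist_a, K_b, dist_b,
                                                  image_size, R, T)
map_ax, map_ay = cv2.initUndistortRectifyMap(K_a, dist_a, R1, P1, image_size, cv2.CV_32FC1)
map_bx, map_by = cv2.initUndistortRectifyMap(K_b, dist_b, R2, P2, image_size, cv2.CV_32FC1)

wide_a = np.random.randint(0, 255, (h, w, 3), dtype=np.uint8)
wide_b = np.random.randint(0, 255, (h, w, 3), dtype=np.uint8)
rect_a = cv2.remap(wide_a, map_ax, map_ay, cv2.INTER_LINEAR)   # corrected frame A
rect_b = cv2.remap(wide_b, map_bx, map_by, cv2.INTER_LINEAR)   # corrected frame B
```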
In an embodiment, before controlling each telephoto camera to capture one telephoto image at the preset focus to obtain at least two telephoto images, the image acquisition method further includes: obtaining the focusing distance used when the wide-angle camera captures the wide-angle image; and when the focusing distance is greater than or equal to a preset distance, controlling each telephoto camera to capture one telephoto image at the preset focus to obtain the at least two telephoto images.
The focusing distance refers to the distance between the focused subject and the camera. The electronic device obtains the focusing distance used when the wide-angle camera captures the wide-angle image. Specifically, the electronic device can capture a preview image through the wide-angle camera and display it on the display screen, receive a focusing area selected by the user, and then obtain the focusing distance of the focused subject corresponding to that area; the electronic device may also determine the focused subject through auto-focus methods such as contrast focusing, laser focusing, or phase focusing when capturing the wide-angle image, so as to obtain the focusing distance of the focused subject.
The preset distance may be set according to actual application requirements, which is not limited here. Specifically, the preset distance is used to determine whether the electronic device is in a close-range or long-range shooting state. For example, the preset distance may be 3 meters, 5 meters, 7 meters, 10 meters, or the like. When the focusing distance is greater than or equal to the preset distance, the electronic device can determine that it is in a long-range shooting state; it can then control each telephoto camera to capture one telephoto image at the preset focus while capturing the wide-angle image through the wide-angle camera, obtain at least two telephoto images, and fuse the wide-angle image with the at least two telephoto images to obtain a target image.
Optionally, in an embodiment, when the focusing distance is smaller than the preset distance and the electronic device includes one wide-angle camera, the wide-angle image is taken as the target image; when the focusing distance is smaller than the preset distance and the electronic device includes at least two wide-angle cameras, the at least two wide-angle images are fused to obtain the target image.
When the focusing distance is smaller than the preset distance, the electronic device can determine that it is in a close-range shooting state, and it can then obtain the target image from the wide-angle camera. Specifically, if the electronic device includes only one wide-angle camera, the wide-angle image captured by that camera can be used as the target image; if the electronic device includes at least two wide-angle cameras, the multiple wide-angle images captured by them can be fused to obtain the target image.
By obtaining the focusing distance used when the wide-angle camera captures the wide-angle image, the electronic device chooses how to generate the target image according to how that distance compares with the preset distance: when the focusing distance is greater than or equal to the preset distance, the images captured by the wide-angle camera and the at least two telephoto cameras are fused to obtain the target image; when it is smaller than the preset distance, the target image is obtained from the wide-angle image captured by the wide-angle camera. This adapts to different shooting states, ensures the quality and effect of the target image in each state, and improves the accuracy of image acquisition.
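The distance-based branching described above can be sketched as follows; the 5 m threshold, the function names, and the trivial fuse callback are assumptions used only to illustrate the control flow:

```python
PRESET_DISTANCE_M = 5.0   # assumed threshold between close-range and long-range scenes

def build_target_image(focus_distance_m, wide_images, tele_images, fuse):
    """Choose the generation path for the target image from the focusing distance."""
    if focus_distance_m >= PRESET_DISTANCE_M:
        # Long-range scene: fuse the wide-angle image(s) with the telephoto images
        reference = wide_images[0] if len(wide_images) == 1 else fuse(wide_images)
        return fuse([reference] + list(tele_images))
    # Close-range scene: rely on the wide-angle camera(s) only
    if len(wide_images) == 1:
        return wide_images[0]
    return fuse(wide_images)

# Toy usage with a trivial "fusion" that just concatenates labels
result = build_target_image(8.0, ["wide"], ["tele1", "tele2"],
                            fuse=lambda imgs: "+".join(imgs))
print(result)   # wide+tele1+tele2
```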
It should be understood that although the steps in the flowcharts of FIGS. 3 and 5 are shown in sequence as indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in FIGS. 3 and 5 may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different times, and their order of execution is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Fig. 7 is a block diagram of an image acquisition apparatus in an embodiment. As shown in FIG. 7, the image acquisition apparatus comprises a first acquisition module 702, a second acquisition module 704, and a processing module 706, wherein:
the first collecting module 702 is configured to control the wide-angle camera to collect a wide-angle image.
The second acquisition module 704 is configured to control each telephoto camera to capture one telephoto image at a preset focus, so as to obtain at least two telephoto images, where the preset focus is located at infinity and an overlapping field of view region exists between each telephoto camera and the wide-angle camera.
The processing module 706 is configured to fuse the wide-angle image with the at least two telephoto images to obtain a target image.
With the image acquisition apparatus provided in the embodiments of the present application, the wide-angle image used for fusion into the target image is captured by the wide-angle camera, and each telephoto image is captured by a telephoto camera whose focus is located at infinity. The subjects in the at least two telephoto images therefore keep a consistent sharpness, which avoids the problem that differing focus positions among cameras make the fused image unevenly sharp and degrade its effect and quality, so the quality of the target image can be improved.
In one embodiment, the first acquisition module 702 may be further configured to control each wide-angle camera to capture one wide-angle image, so as to obtain at least two wide-angle images; the processing module 706 may be further configured to fuse the at least two wide-angle images according to their overlapping regions to obtain a reference image, and to fuse the reference image with the at least two telephoto images to obtain the target image.
In one embodiment, the processing module 706 may be further configured to obtain calibration parameters between the at least two wide-angle cameras and to correct the at least two wide-angle images according to the calibration parameters.
In one embodiment, the second acquisition module 704 may also be configured to obtain the focusing distance used when the wide-angle camera captures the wide-angle image, and, when the focusing distance is greater than or equal to the preset distance, to control each telephoto camera to capture one telephoto image at the preset focus, so as to obtain at least two telephoto images.
In one embodiment, the processing module 706 may be further configured to take the wide-angle image as the target image when the focusing distance is smaller than the preset distance and the number of wide-angle cameras is 1, and to fuse at least two wide-angle images to obtain the target image when the focusing distance is smaller than the preset distance and the number of wide-angle cameras is 2 or more.
In an embodiment, the second acquisition module 704 may be further configured to control the 4 telephoto cameras each to capture one telephoto image, so as to obtain 4 telephoto images, where the field of view region of each of the 4 telephoto images extends from one corner of the field of view region of the wide-angle image toward the middle of that field of view region.
In one embodiment, the second acquisition module 704 may be further configured to control each of the at least two telephoto cameras to capture one telephoto image, so as to obtain at least two telephoto images, where no overlapping field of view region exists between the at least two telephoto images.
The division of the modules in the image acquisition apparatus is only for illustration; in other embodiments, the image acquisition apparatus may be divided into different modules as needed to complete all or part of its functions.
Fig. 8 is a schematic diagram of the internal structure of an electronic device in one embodiment. As shown in FIG. 8, the electronic device includes a processor and a memory connected by a system bus. The processor provides computation and control capabilities and supports the operation of the whole electronic device. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The computer program can be executed by the processor to implement the image acquisition method provided in the foregoing embodiments. The internal memory provides a cached execution environment for the operating system and the computer program in the non-volatile storage medium. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like.
The implementation of each module in the image acquisition apparatus provided in the embodiments of the present application may be in the form of a computer program. The computer program may be run on a terminal or a server. The program modules constituted by the computer program may be stored on the memory of the terminal or the server. Which when executed by a processor, performs the steps of the method described in the embodiments of the present application.
Embodiments of the present application also provide a computer-readable storage medium: one or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the image acquisition method.
A computer program product comprising instructions which, when run on a computer, cause the computer to perform an image acquisition method.
Any reference to memory, storage, a database, or another medium used in the embodiments of the present application may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the present application. It should be noted that those skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An image acquisition method, applied to an electronic device, wherein the electronic device comprises a wide-angle camera and at least two telephoto cameras, and an overlapping field of view region exists between each telephoto camera and the wide-angle camera; the method comprises:
controlling the wide-angle camera to capture a wide-angle image;
controlling each telephoto camera to capture one telephoto image at a preset focus, so as to obtain at least two telephoto images, wherein the preset focus is located at infinity; and
fusing the wide-angle image with the at least two telephoto images to obtain a target image.
2. The method of claim 1, wherein the electronic device comprises at least two of the wide-angle cameras, and an overlapping field of view region exists between the at least two wide-angle cameras;
the controlling the wide-angle camera to capture a wide-angle image comprises:
controlling each wide-angle camera to capture one wide-angle image, so as to obtain at least two wide-angle images;
the fusing the wide-angle image with the at least two telephoto images to obtain a target image comprises:
fusing the at least two wide-angle images according to their overlapping regions to obtain a reference image; and
fusing the reference image with the at least two telephoto images to obtain the target image.
3. The method of claim 2, wherein before fusing the at least two wide-angle images according to their overlapping regions to obtain the reference image, the method further comprises:
obtaining calibration parameters between the at least two wide-angle cameras; and
correcting the at least two wide-angle images according to the calibration parameters.
4. The method of claim 1, wherein before controlling each telephoto camera to capture one telephoto image at a preset focus to obtain at least two telephoto images, the method further comprises:
obtaining the focusing distance used when the wide-angle camera captures the wide-angle image; and
when the focusing distance is greater than or equal to a preset distance, performing the step of controlling each telephoto camera to capture one telephoto image at the preset focus to obtain at least two telephoto images.
5. The method of claim 4, further comprising:
when the focusing distance is smaller than the preset distance and the electronic device comprises one wide-angle camera, taking the wide-angle image as the target image; and
when the focusing distance is smaller than the preset distance and the electronic device comprises at least two wide-angle cameras, fusing the at least two wide-angle images to obtain the target image.
6. The method of claim 1, wherein the electronic device comprises 4 telephoto cameras, and the field of view region of each of the 4 telephoto cameras extends from one corner of the field of view region of the wide-angle camera toward the middle of that field of view region.
7. The method of any one of claims 1 to 6, wherein no overlapping field of view region exists between at least two of the telephoto cameras.
8. An image acquisition apparatus, comprising:
a first acquisition module, configured to control a wide-angle camera to capture a wide-angle image;
a second acquisition module, configured to control each of at least two telephoto cameras to capture one telephoto image at a preset focus, so as to obtain at least two telephoto images, wherein the preset focus is located at infinity and an overlapping field of view region exists between each telephoto camera and the wide-angle camera; and
a processing module, configured to fuse the wide-angle image with the at least two telephoto images to obtain a target image.
9. An electronic device, comprising a wide-angle camera, at least two telephoto cameras, a memory, and a processor, wherein an overlapping field of view region exists between each telephoto camera and the wide-angle camera; the memory stores a computer program that, when executed by the processor, causes the processor to perform the steps of the image acquisition method according to any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN201910516211.3A 2019-06-14 2019-06-14 Image acquisition method and device, electronic equipment and computer readable storage medium Pending CN112087571A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910516211.3A CN112087571A (en) 2019-06-14 2019-06-14 Image acquisition method and device, electronic equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910516211.3A CN112087571A (en) 2019-06-14 2019-06-14 Image acquisition method and device, electronic equipment and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN112087571A 2020-12-15

Family

ID=73733965

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910516211.3A Pending CN112087571A (en) 2019-06-14 2019-06-14 Image acquisition method and device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112087571A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004320175A (en) * 2003-04-11 2004-11-11 Victor Co Of Japan Ltd Monitoring camera system
CN205596207U (en) * 2016-05-17 2016-09-21 徐文波 Camera
CN109379522A (en) * 2018-12-06 2019-02-22 Oppo广东移动通信有限公司 Imaging method, imaging device, electronic device and medium
CN109379528A (en) * 2018-12-20 2019-02-22 Oppo广东移动通信有限公司 Imaging method, imaging device, electronic device and medium
CN109639974A (en) * 2018-12-20 2019-04-16 Oppo广东移动通信有限公司 Control method, control device, electronic device and medium

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112702530A (en) * 2020-12-29 2021-04-23 维沃移动通信(杭州)有限公司 Algorithm control method and electronic equipment
WO2023071948A1 (en) * 2021-11-01 2023-05-04 华为技术有限公司 Image capturing method and related device
CN114500981A (en) * 2022-02-12 2022-05-13 北京蜂巢世纪科技有限公司 Method, device, equipment and medium for tracking venue target
CN114500981B (en) * 2022-02-12 2023-08-11 北京蜂巢世纪科技有限公司 Venue target tracking method, device, equipment and medium
CN114511595A (en) * 2022-04-19 2022-05-17 浙江宇视科技有限公司 Multi-mode cooperation and fusion target tracking method, device, system and medium
CN115499565A (en) * 2022-08-23 2022-12-20 盯盯拍(深圳)技术股份有限公司 Image acquisition method, device and medium based on double lenses and automobile data recorder
CN115499565B (en) * 2022-08-23 2024-02-20 盯盯拍(深圳)技术股份有限公司 Image acquisition method and device based on double lenses, medium and automobile data recorder

Similar Documents

Publication Publication Date Title
CN110166695B (en) Camera anti-shake method and device, electronic equipment and computer readable storage medium
CN107948519B (en) Image processing method, device and equipment
CN110536057B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN109089047B (en) Method and device for controlling focusing, storage medium and electronic equipment
CN112087580B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN110225248B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN110278360B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN110473159B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN112087571A (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN110233970B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN110290323B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN109862269B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN110475067B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN112866549B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN112004029B (en) Exposure processing method, exposure processing device, electronic apparatus, and computer-readable storage medium
CN110177212B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN109951641B (en) Image shooting method and device, electronic equipment and computer readable storage medium
CN111246100B (en) Anti-shake parameter calibration method and device and electronic equipment
CN110035206B (en) Image processing method and device, electronic equipment and computer readable storage medium
KR20190068618A (en) Method and terminal for photographing a terminal
CN113875219B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN109963080B (en) Image acquisition method and device, electronic equipment and computer storage medium
CN112019734B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN110049240B (en) Camera control method and device, electronic equipment and computer readable storage medium
CN109559352B (en) Camera calibration method, device, electronic equipment and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201215