CN111080683B - Image processing method, device, storage medium and electronic equipment - Google Patents

Image processing method, device, storage medium and electronic equipment

Info

Publication number
CN111080683B
CN111080683B CN201911253048.2A
Authority
CN
China
Prior art keywords
image
reference frame
pyramid
affine transformation
layer
Prior art date
Legal status
Active
Application number
CN201911253048.2A
Other languages
Chinese (zh)
Other versions
CN111080683A (en
Inventor
贾玉虎
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911253048.2A priority Critical patent/CN111080683B/en
Publication of CN111080683A publication Critical patent/CN111080683A/en
Application granted granted Critical
Publication of CN111080683B publication Critical patent/CN111080683B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/337Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/40Analysis of texture
    • G06T7/41Analysis of texture based on statistical description of texture
    • G06T7/44Analysis of texture based on statistical description of texture using image operators, e.g. filters, edge density metrics or local histograms
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20016Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform

Abstract

The application discloses an image processing method, an image processing device, a storage medium and electronic equipment. The method comprises the following steps: acquiring a reference frame image and a non-reference frame image; calculating a first image pyramid of the reference frame image and a second image pyramid of the non-reference frame image; calculating LBP feature maps of each layer of images in the first image pyramid and the second image pyramid; calculating an initial affine transformation matrix corresponding to each layer of image according to the LBP feature map of each layer of image in the first image pyramid and the LBP feature map of the image of the corresponding layer in the second image pyramid, so as to obtain a plurality of initial affine transformation matrices; calculating a target affine transformation matrix according to the plurality of initial affine transformation matrices; and registering the non-reference frame image and the reference frame image according to the target affine transformation matrix. The method and the device can improve the accuracy of registering images with repeated textures.

Description

Image processing method, device, storage medium and electronic equipment
Technical Field
The application belongs to the technical field of image processing, and particularly relates to an image processing method, an image processing device, a storage medium and electronic equipment.
Background
As the shooting capability of electronic devices becomes more and more powerful, users often use electronic devices to capture images, for example to take photographs or record videos, so the electronic device is often required to perform various image processing operations. During image processing, the electronic device may acquire multiple frames of images shot in the same scene and perform image registration and image fusion on them to obtain a corresponding output image. However, in the related art, when the photographed scene contains repeated textures, the accuracy of image registration performed on multiple frames captured in that scene is low. That is, in the related art, when an electronic device registers images having repeated textures, the accuracy of image registration is low.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, a storage medium and electronic equipment, which can improve the accuracy of registering images with repeated textures.
In a first aspect, an embodiment of the present application provides an image processing method, including:
acquiring a reference frame image and a non-reference frame image;
calculating a first image pyramid of the reference frame image and a second image pyramid of the non-reference frame image;
calculating LBP feature maps of each layer of images in the first image pyramid and the second image pyramid;
calculating an initial affine transformation matrix corresponding to each layer of image according to the LBP feature map of each layer of image in the first image pyramid and the LBP feature map of the image of the corresponding layer in the second image pyramid, so as to obtain a plurality of initial affine transformation matrices;
calculating a target affine transformation matrix according to the initial affine transformation matrices;
registering the non-reference frame image and the reference frame image according to the target affine transformation matrix.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including:
the acquisition module is used for acquiring a reference frame image and a non-reference frame image;
a first calculation module for calculating a first image pyramid of the reference frame image and a second image pyramid of the non-reference frame image;
the second calculation module is used for calculating LBP characteristic diagrams of each layer of images in the first image pyramid and the second image pyramid;
the third calculation module is used for calculating an initial affine transformation matrix corresponding to each layer of image according to the LBP characteristic map of each layer of image in the first image pyramid and the LBP characteristic map of the image of the corresponding layer in the second image pyramid, so as to obtain a plurality of initial affine transformation matrices;
a fourth calculation module, configured to calculate a target affine transformation matrix according to the plurality of initial affine transformation matrices;
and the registration module is used for registering the non-reference frame image and the reference frame image according to the target affine transformation matrix.
In a third aspect, embodiments of the present application provide a computer-readable storage medium having stored thereon a computer program, which when executed on a computer, causes the computer to execute a flow in the image processing method provided in the embodiments of the present application.
In a fourth aspect, embodiments of the present application further provide an electronic device, including a memory, and a processor, where the processor is configured to execute a flow in the image processing method provided in the embodiments of the present application by calling a computer program stored in the memory.
In the embodiment of the application, the electronic device may calculate the multi-scale images (i.e. the image pyramids) of the reference frame image and the non-reference frame image, and then calculate the multi-scale LBP feature maps corresponding to those multi-scale images. Then, the electronic device may calculate multi-scale affine transformation matrices (i.e. the initial affine transformation matrices) according to the multi-scale LBP feature maps, and calculate a target affine transformation matrix according to the multi-scale affine transformation matrices. The electronic device may then register the reference frame image and the non-reference frame image according to the target affine transformation matrix. Since in this embodiment the target affine transformation matrix is calculated from the LBP feature maps, and the LBP feature maps capture the local texture features of the image, image registration can be performed according to local texture features, which effectively improves the accuracy of registering images with repeated textures.
Drawings
The technical solution of the present application and the advantageous effects thereof will be made apparent from the following detailed description of the specific embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present application.
Fig. 2 is a first schematic diagram of an image pyramid provided in an embodiment of the present application.
Fig. 3 is another flow chart of the image processing method according to the embodiment of the present application.
Fig. 4 is a second schematic view of an image pyramid provided in an embodiment of the present application.
Fig. 5 to 6 are schematic views of a scenario of an image processing method according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of an image processing apparatus provided in an embodiment of the present application.
Fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 9 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Referring to the drawings, wherein like reference numerals refer to like elements throughout, the principles of the present application are illustrated as embodied in a suitable computing environment. The following description is based on the illustrated embodiments of the present application and should not be taken as limiting other embodiments not described in detail herein.
It is understood that the execution subject of the embodiments of the present application may be an electronic device with a camera, such as a smart phone or tablet computer.
Referring to fig. 1, fig. 1 is a schematic flow chart of an image processing method according to an embodiment of the present application, where the flow may include:
101. a reference frame image and a non-reference frame image are acquired.
As the shooting capability of electronic devices becomes more and more powerful, users often use electronic devices to capture images, for example to take photographs or record videos, so the electronic device is often required to perform various image processing operations. During image processing, the electronic device may acquire multiple frames of images shot in the same scene and perform image registration and image fusion on them to obtain a corresponding output image. However, in the related art, when the photographed scene contains repeated textures, the accuracy of image registration performed on multiple frames captured in that scene is low.
In the embodiment of the present application, for example, the electronic device may first acquire two frames of images, namely a reference frame image and a non-reference frame image.
102. A first image pyramid of the reference frame image is calculated, and a second image pyramid of the non-reference frame image is calculated.
For example, after acquiring the reference frame image and the non-reference frame image, the electronic device may calculate a first image pyramid of the reference frame image and a second image pyramid of the non-reference frame image.
The image pyramid is one kind of multi-scale representation of an image and is an effective, conceptually simple structure for interpreting an image at multiple resolutions. The pyramid of an image is a series of images, all derived from the same original image, arranged in a pyramid shape with progressively decreasing resolution. The image pyramid is obtained by step-wise downsampling, which stops when a certain termination condition is reached. The layered images can be likened to a pyramid: the higher the level, the smaller the image and the lower the resolution.
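As an illustration of the downsampling procedure just described, the following Python/OpenCV sketch (not code from the patent) builds an image pyramid by repeated 2x downsampling; the choice of 4 downsamplings (5 layers) mirrors the later examples and is otherwise an assumption.

```python
# Illustrative sketch: build an image pyramid by repeated 2x downsampling.
import cv2

def build_pyramid(image, num_downsamples=4):
    pyramid = [image]               # layer 0: the original image
    for _ in range(num_downsamples):
        image = cv2.pyrDown(image)  # each new layer is half the size of the previous one
        pyramid.append(image)
    return pyramid                  # e.g. [A0, A1, A2, A3, A4]
```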
103. LBP feature maps of each layer of images in the first image pyramid and the second image pyramid are calculated.
For example, after calculating the first image pyramid corresponding to the reference frame image, the electronic device may calculate the LBP feature map of each layer of image in the first image pyramid. And after the second image pyramid corresponding to the non-reference frame image is calculated, the electronic device can calculate the LBP feature map of each layer of image in the second image pyramid.
It should be noted that LBP (Local Binary Pattern) is an operator used to describe the local texture features of an image; it has notable advantages such as rotation invariance and gray-scale invariance. It can be used for texture feature extraction, and the extracted features are the local texture features of the image.
The original LBP operator is defined within a 3×3 pixel window: the gray value of the center pixel of the window is taken as a threshold, and the gray values of the 8 adjacent pixels are compared with it. If the gray value of a surrounding pixel is greater than that of the center pixel, the position of that pixel is marked as 1; otherwise it is marked as 0. In this way, comparing the 8 points in the 3×3 neighborhood produces an 8-bit binary number (usually converted into a decimal number, i.e. an LBP code, with 256 possible values), which is the LBP value of the center pixel of the window and reflects the texture information of the region.
The LBP operator thus yields an LBP "code" at each pixel. After the original LBP operator has been applied to an image (in which the gray value of each pixel is recorded), the resulting original LBP feature is therefore still an "image" (in which the LBP value of each pixel is recorded), i.e. the LBP feature map.
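The following sketch illustrates the original 3×3 LBP operator described above; it is an illustrative Python implementation (not the patent's code), skips border pixels, and uses the "greater than the center pixel" thresholding rule stated in the text.

```python
# Illustrative sketch of the original 3x3 LBP operator.
import numpy as np

def lbp_map(gray):
    h, w = gray.shape
    out = np.zeros((h, w), dtype=np.uint8)
    # clockwise neighbour offsets starting at the top-left pixel
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            center = gray[y, x]
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                if gray[y + dy, x + dx] > center:   # neighbour greater than centre -> bit 1
                    code |= 1 << bit
            out[y, x] = code                        # 8-bit LBP code of the centre pixel
    return out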
For example, suppose the reference frame image is A0 and the non-reference frame image is B0. The reference frame image and the non-reference frame image are each downsampled 4 times to obtain their corresponding image pyramids. For example, as shown in FIG. 2, the first image pyramid corresponding to the reference frame image A0 is, from the first layer, A0, A1, A2, A3, A4, and the second image pyramid corresponding to the non-reference frame image B0 is, from the first layer, B0, B1, B2, B3, B4.
Then, in 103, the electronic device may calculate the LBP feature map of each layer of image in the first image pyramid. That is, the electronic device can calculate the LBP feature maps of images A0, A1, A2, A3 and A4 respectively. Similarly, the electronic device can calculate the LBP feature maps of images B0, B1, B2, B3 and B4 in the second image pyramid.
104. According to the LBP feature map of each layer of image in the first image pyramid and the LBP feature map of the image of the corresponding layer in the second image pyramid, calculating an initial affine transformation matrix corresponding to each layer of image to obtain a plurality of initial affine transformation matrices.
105. A target affine transformation matrix is calculated from the plurality of initial affine transformation matrices.
For example, 104 and 105 may include:
after the LBP feature map of each layer of image in the first image pyramid and the second image pyramid is obtained by calculation, the electronic device may calculate an initial affine transformation matrix corresponding to each layer of image according to the LBP feature map of each layer of image in the first image pyramid and the LBP feature map of the image of the corresponding layer in the second image pyramid, so as to obtain a plurality of initial affine transformation matrices.
For example, from the first layer image A0 in the first image pyramid and the first layer image B0 in the second image pyramid, the electronic device can calculate a corresponding affine transformation matrix (e.g. initial affine transformation matrix J0).
From the second layer image A1 in the first image pyramid and the second layer image B1 in the second image pyramid, the electronic device can calculate a corresponding affine transformation matrix (e.g. initial affine transformation matrix J1).
From the third layer image A2 in the first image pyramid and the third layer image B2 in the second image pyramid, the electronic device can calculate a corresponding affine transformation matrix (e.g. initial affine transformation matrix J2).
From the fourth layer image A3 in the first image pyramid and the fourth layer image B3 in the second image pyramid, the electronic device can calculate a corresponding affine transformation matrix (e.g. initial affine transformation matrix J3).
From the fifth layer image A4 in the first image pyramid and the fifth layer image B4 in the second image pyramid, the electronic device can calculate a corresponding affine transformation matrix (e.g. initial affine transformation matrix J4).
In this way, the electronic device can calculate 5 initial affine transformation matrices in total.
After calculating the plurality of initial affine transformation matrices, the electronic device may calculate a target affine transformation matrix according to them. For example, the electronic device may calculate a target affine transformation matrix Y1 from the initial affine transformation matrices J0, J1, J2, J3 and J4.
106. Registering the non-reference frame image and the reference frame image according to the target affine transformation matrix.
For example, after the target affine transformation matrix Y1 is obtained by calculation, the electronic device can perform image registration on the non-reference frame image B0 and the reference frame image A0 according to the target affine transformation matrix Y1.
It can be understood that in the embodiment of the present application, the electronic device may first calculate the multi-scale images (i.e. the image pyramids) of the reference frame image and the non-reference frame image, and then calculate the multi-scale LBP feature maps corresponding to those multi-scale images. Then, the electronic device may calculate multi-scale affine transformation matrices (i.e. the initial affine transformation matrices) according to the multi-scale LBP feature maps, and calculate a target affine transformation matrix according to the multi-scale affine transformation matrices. The electronic device may then register the reference frame image and the non-reference frame image according to the target affine transformation matrix. Since in this embodiment the target affine transformation matrix is calculated from the LBP feature maps, and the LBP feature maps capture the local texture features of the image, image registration can be performed according to local texture features, which effectively improves the accuracy of registering images with repeated textures.
Referring to fig. 3, fig. 3 is another flow chart of the image processing method provided in the embodiment of the present application, where the flow may include:
201. the electronic device acquires a reference frame image and a non-reference frame image.
For example, the electronic device may first acquire two frames of images, e.g., image C and image D, respectively. The electronic device may then determine a reference frame image and a non-reference frame image from the two frame images.
In one embodiment, the electronic device may determine that the image with higher definition of the two frames is the reference frame image, and then the other image is the non-reference frame image. For example, image C may have a greater sharpness than image D, and the electronic device may determine image C as a reference frame image and image D as a non-reference frame image.
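The patent does not specify how "definition" (sharpness) is measured when picking the reference frame; as one hedged illustration, the sketch below uses the variance of the Laplacian as a sharpness proxy, and the helper names and BGR input assumption are illustrative.

```python
# Illustrative sketch: pick the sharpest frame as the reference frame.
import cv2

def pick_reference(frames):
    def sharpness(img):
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        return cv2.Laplacian(gray, cv2.CV_64F).var()   # variance of Laplacian as a proxy
    ref_idx = max(range(len(frames)), key=lambda i: sharpness(frames[i]))
    reference = frames[ref_idx]
    non_reference = [f for i, f in enumerate(frames) if i != ref_idx]
    return reference, non_reference
```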
202. The electronic device obtains a first gray scale image of a reference frame image and a second gray scale image of a non-reference frame image.
For example, after determining the reference frame image C and the non-reference frame image D, the electronic device may acquire a gray scale image (i.e., a first gray scale image) of the reference frame image C and a gray scale image (i.e., a second gray scale image) of the non-reference frame image D.
203. The electronic device calculates a first image pyramid of the first gray scale map and a second image pyramid of the second gray scale map.
For example, after the first gray level map corresponding to the reference frame image C and the second gray level map corresponding to the non-reference frame image D are obtained, the electronic device may calculate a first image pyramid corresponding to the first gray level map and calculate a second image pyramid corresponding to the second gray level map.
For example, the first gray-scale map is C0 and the second gray-scale map is D0. The electronic device may calculate the image pyramids corresponding to the first gray-scale map and the second gray-scale map according to a preset number of downsamplings and a preset downsampling multiple. For example, if the preset number of downsamplings is 4 and the downsampling multiple is two (i.e. at each downsampling the next image is half the size of the previous one), the electronic device can calculate the first image pyramid corresponding to the first gray-scale map C0 and the second image pyramid corresponding to the second gray-scale map D0. Of course, in other embodiments, the preset number of downsamplings may take other values, such as 3, 5 or 6, and the downsampling multiple may take other values, such as three or four, which is not specifically limited in the embodiments of the present application.
For example, as shown in fig. 4, with 4 downsamplings and a downsampling multiple of two, the first image pyramid corresponding to the first gray-scale map C0 is, from the first layer, C0, C1, C2, C3, C4, and the second image pyramid corresponding to the second gray-scale map D0 is, from the first layer, D0, D1, D2, D3, D4.
204. The electronic device calculates an LBP characteristic map of each layer of images in the first image pyramid and the second image pyramid.
For example, after calculating the first image pyramid and the second image pyramid, the electronic device may calculate an LBP feature map of each layer of image in the first image pyramid, and calculate an LBP feature map of each layer of image in the second image pyramid.
That is, the electronic device can calculate the LBP feature maps of images C0, C1, C2, C3 and C4 respectively. Similarly, the electronic device can calculate the LBP feature maps of images D0, D1, D2, D3 and D4.
In one embodiment, the process of calculating the LBP feature map for each layer of images in the first image pyramid and the second image pyramid by the electronic device in 204 may include:
and the electronic equipment calculates the LBP characteristic map of each layer of image in the first image pyramid and the second image pyramid by using the circular LBP characteristic operator.
For example, the electronic device may utilize a circular LBP feature operator to calculate an LBP feature map for each layer of images in the first image pyramid and the second image pyramid. For example, the circular LBP feature operator may be
Figure BDA0002309560800000071
Or->
Figure BDA0002309560800000072
And (5) an operator.
Of course, in other embodiments, the electronic device can also use the original LBP feature operator to calculate the LBP feature map of each layer of image in the image pyramid. The original LBP operator is defined within a 3×3 pixel window: the gray value of the center pixel of the window is taken as a threshold, and the gray values of the 8 adjacent pixels are compared with it; if the gray value of a surrounding pixel is greater than that of the center pixel, the position of that pixel is marked as 1, otherwise 0. In this way, comparing the 8 points in the 3×3 neighborhood produces an 8-bit binary number (usually converted into a decimal number, i.e. an LBP code, with 256 possible values), which is the LBP value of the center pixel of the window and reflects the texture information of the region.
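As a hedged illustration of the circular LBP operator mentioned above, the following sketch uses scikit-image's local_binary_pattern, which samples P points on a circle of radius R; the particular (P, R) values and the 'uniform' method are illustrative choices, not taken from the patent.

```python
# Illustrative sketch: circular LBP feature maps for every pyramid layer.
from skimage.feature import local_binary_pattern

def circular_lbp_map(gray, points=8, radius=1):
    # P sampling points on a circle of radius R around each pixel
    return local_binary_pattern(gray, P=points, R=radius, method='uniform')

def pyramid_lbp_maps(pyramid):
    # one LBP feature map per pyramid layer
    return [circular_lbp_map(layer) for layer in pyramid]
```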
205. According to the LBP feature map of each layer of image in the first image pyramid and the LBP feature map of the image of the corresponding layer in the second image pyramid, the electronic equipment calculates an initial affine transformation matrix corresponding to each layer of image to obtain a plurality of initial affine transformation matrices.
For example, after calculating the LBP feature map of each layer of image in the first image pyramid and the second image pyramid, the electronic device may calculate an initial affine transformation matrix corresponding to each layer of image according to the LBP feature map of each layer of image in the first image pyramid and the LBP feature map of the image of the corresponding layer in the second image pyramid, so as to obtain a plurality of initial affine transformation matrices.
For example, from the first layer image C0 in the first image pyramid and the first layer image D0 in the second image pyramid, the electronic device can calculate a corresponding affine transformation matrix (e.g. initial affine transformation matrix Z0).
From the second layer image C1 in the first image pyramid and the second layer image D1 in the second image pyramid, the electronic device can calculate a corresponding affine transformation matrix (e.g. initial affine transformation matrix Z1).
From the third layer image C2 in the first image pyramid and the third layer image D2 in the second image pyramid, the electronic device can calculate a corresponding affine transformation matrix (e.g. initial affine transformation matrix Z2).
From the fourth layer image C3 in the first image pyramid and the fourth layer image D3 in the second image pyramid, the electronic device can calculate a corresponding affine transformation matrix (e.g. initial affine transformation matrix Z3).
From the fifth layer image C4 in the first image pyramid and the fifth layer image D4 in the second image pyramid, the electronic device can calculate a corresponding affine transformation matrix (e.g. initial affine transformation matrix Z4).
In this way, the electronic device can calculate 5 initial affine transformation matrices in total.
In one implementation, the electronic device in this embodiment may implement the flow 205 in the following manner:
for two frames of images of each corresponding layer in the first image pyramid and the second image pyramid, the electronic device performs preset processing to obtain a plurality of initial affine transformation matrices, where the electronic device denotes the two frames of images of each corresponding layer in the first image pyramid and the second image pyramid as Pi and Qi respectively, i is a positive integer less than or equal to n, and n is the number of layers of the image pyramid. The preset processing includes:
for images Pi and Qi, the electronic device determines a plurality of image blocks in the LBP feature map corresponding to each image, and calculates the LBP feature vector of each block according to the LBP feature map of the image;
the electronic device matches the LBP feature vector S0 of each block in the image Qi of the second image pyramid with the LBP feature vector R0 of the block at the corresponding position in the image Pi of the first image pyramid and with the LBP feature vectors of the neighborhood of that corresponding position, and determines the feature vector with the highest matching degree with S0 as the matching feature vector of S0;
after the LBP feature vectors of all blocks in the image Qi have been assigned corresponding matching feature vectors, the matching degrees of all matching feature vector pairs are sorted, and the matching feature vector pairs ranked within a preset order are determined as target feature vector pairs;
in the images Pi and Qi, features are extracted at the positions corresponding to the target feature vector pairs, and a corresponding initial affine transformation matrix is calculated based on the extracted features.
For example, for the two frames of images of each corresponding layer in the first image pyramid and the second image pyramid (e.g., the first layer image C0 in the first image pyramid and the first layer image D0 in the second image pyramid are the two frames of images of one corresponding layer; the second layer image C1 in the first image pyramid and the second layer image D1 in the second image pyramid are the two frames of images of another corresponding layer, and so on), the electronic device may perform the preset processing to obtain the initial affine transformation matrix corresponding to that layer. After the preset processing has been performed on every layer of images, a plurality of initial affine transformation matrices are obtained.
For example, in this embodiment, the electronic device may denote the two frames of images of each corresponding layer in the first image pyramid and the second image pyramid as Pi and Qi, where i is a positive integer less than or equal to n and n is the number of layers of the image pyramid. For example, suppose the first image pyramid and the second image pyramid each have 5 layers. Then the first layer image of the first image pyramid is denoted as P1 and the first layer image of the second image pyramid is denoted as Q1; the second layer images are denoted as P2 and Q2; the third layer images as P3 and Q3; the fourth layer images as P4 and Q4; and the fifth layer images as P5 and Q5.
The preset processing may include: for images Pi and Qi, the electronic device determines a plurality of image blocks in the LBP feature map corresponding to each image, and calculates the LBP feature vector of each block according to the LBP feature map of the image; the LBP feature vector S0 of each block in the image Qi of the second image pyramid is matched with the LBP feature vector R0 of the block at the corresponding position in the image Pi of the first image pyramid and with the feature vectors of the neighborhood of that corresponding position, and the feature vector with the highest matching degree with S0 is determined as the matching feature vector of S0; after the LBP feature vectors of all blocks in the image Qi have been assigned corresponding matching feature vectors, the matching degrees of all matching feature vector pairs are sorted, and the matching feature vector pairs ranked within a preset order are determined as target feature vector pairs; in the images Pi and Qi, features are extracted at the positions corresponding to the target feature vector pairs, and a corresponding initial affine transformation matrix is calculated based on the extracted features.
The fifth layer images in the first image pyramid and the second image pyramid are described below as an example. For example, the LBP feature map of the fifth layer image C4 in the first image pyramid is LC4, and the LBP feature map of the fifth layer image D4 in the second image pyramid is LD4, where C4 is denoted by the electronic device as P5 and D4 as Q5.
First, the electronic device may determine a plurality of image blocks in the image LC4 and calculate the LBP feature vector of each block. For example, for the LBP feature map LC4, the electronic device may traverse the entire image with a 16×16 (pixel) sliding window in steps of 2 pixels; the block inside the sliding window at each traversed position may be referred to as a cell, and the histogram of all pixels within the cell (i.e. the block) is calculated, so that a feature vector of the cell (i.e. the block) can be generated. In this way, the electronic device can obtain the LBP feature vector of each block in the LBP feature map LC4 of image P5 (i.e. image C4). In the same way, the electronic device can obtain the LBP feature vector of each block in the LBP feature map LD4 of image Q5 (i.e. image D4).
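The block-wise feature extraction just described (16×16 window, step of 2 pixels, a histogram per cell) can be sketched as follows; the dictionary layout, helper name and 256-bin histogram are illustrative assumptions rather than details fixed by the patent.

```python
# Illustrative sketch: sliding-window block feature vectors over an LBP feature map.
import numpy as np

def block_feature_vectors(lbp_map, win=16, step=2, bins=256):
    h, w = lbp_map.shape
    features = {}   # (y, x) top-left corner of a block -> histogram feature vector
    for y in range(0, h - win + 1, step):
        for x in range(0, w - win + 1, step):
            cell = lbp_map[y:y + win, x:x + win]
            hist, _ = np.histogram(cell, bins=bins, range=(0, bins))
            features[(y, x)] = hist.astype(np.float32)
    return features
```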
Thereafter, the electronic device may match the LBP feature vector S0 of each block in the LBP feature map LD4 of image Q5 (image D4) with the LBP feature vector R0 of the block at the corresponding position in the LBP feature map LC4 of image P5 (image C4) and with the LBP feature vectors R1 to R8 of the 8-neighborhood of that corresponding position, thereby obtaining 9 corresponding matching degrees. The electronic device can determine the vector among R0 to R8 with the highest matching degree with S0 as the matching feature vector of S0. In this way, the LBP feature vector of every block in the LBP feature map LD4 can find a matching LBP feature vector.
In one embodiment, the electronic device may measure the degree of matching between two feature vectors by calculating the Euclidean distance between them: a smaller Euclidean distance indicates a higher degree of matching.
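A minimal sketch of this matching step follows. It reuses the block-feature dictionary from the previous sketch, and it interprets the "8-neighborhood of the corresponding position" as the eight block positions one window step away from the corresponding block; that interpretation, and the helper names, are assumptions rather than wording from the patent.

```python
# Illustrative sketch: find the best-matching reference block by Euclidean distance.
import numpy as np

def best_match(s0, ref_features, pos, step=2):
    y, x = pos
    best_pos, best_dist = None, np.inf
    for dy in (-step, 0, step):           # corresponding block (0,0) plus its 8 neighbours
        for dx in (-step, 0, step):
            cand = ref_features.get((y + dy, x + dx))
            if cand is None:
                continue
            dist = np.linalg.norm(s0 - cand)   # smaller distance = higher matching degree
            if dist < best_dist:
                best_pos, best_dist = (y + dy, x + dx), dist
    return best_pos, best_dist
```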
In other embodiments, the electronic device may also match the LBP feature vector S0 of each block in the LBP feature map LD4 of image Q5 (image D4) with the LBP feature vector R0 of the block at the corresponding position in the LBP feature map LC4 of image P5 (image C4) and with the LBP feature vectors of a neighborhood of a different size around that corresponding position (such as a 15-neighborhood or a 24-neighborhood), thereby obtaining a corresponding number of matching degrees. The electronic device can then determine, among these, the vector with the highest matching degree with S0 as the matching feature vector of S0.
After the LBP feature vectors of all blocks in the LBP feature map LD4 have found their corresponding matching feature vectors in the LBP feature map LC4, the electronic device can sort the matching degrees of all matching feature vector pairs (each pair of mutually matching LBP feature vectors forms one matching feature vector pair) and determine the matching feature vector pairs ranked within the preset order as target feature vector pairs. For example, the electronic device may determine the matching feature vector pairs whose matching degree ranks in the top 10% as the target feature vector pairs.
Thereafter, the electronic device may extract, in images P5 and Q5, the features at the positions corresponding to the target feature vector pairs, and calculate the initial affine transformation matrix corresponding to images P5 and Q5 based on the extracted features. That is, the electronic device may take the position corresponding to each pair of target feature vectors as a key point, extract the features of these key points, and calculate the affine transformation matrix (i.e. the initial affine transformation matrix) corresponding to the fifth layer image based on the extracted key-point features.
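The patent does not name the estimator used to turn the key-point positions into an affine matrix; the sketch below shows one possible choice, OpenCV's RANSAC-based estimateAffine2D, applied to the block positions of the target feature vector pairs (the pair format is an assumption).

```python
# Illustrative sketch: one possible way to estimate a layer's initial affine matrix.
import cv2
import numpy as np

def initial_affine(target_pairs):
    # target_pairs: list of ((y_ref, x_ref), (y_nonref, x_nonref)) block positions
    ref_pts = np.float32([[x, y] for (y, x), _ in target_pairs])
    mov_pts = np.float32([[x, y] for _, (y, x) in target_pairs])
    matrix, inliers = cv2.estimateAffine2D(mov_pts, ref_pts, method=cv2.RANSAC)
    return matrix   # 2x3 initial affine transformation matrix
```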
Similarly, by the above calculation method, the electronic device may calculate an affine transformation matrix corresponding to the first layer image, an affine transformation matrix corresponding to the second layer image, an affine transformation matrix corresponding to the third layer image, and an affine transformation matrix corresponding to the fourth layer image.
206. The electronic device obtains weights corresponding to each initial affine transformation matrix.
207. And according to the weight corresponding to each initial affine transformation matrix, the electronic equipment carries out weighted average calculation on the plurality of initial affine transformation matrixes, and calculates to obtain a target affine transformation matrix.
For example, 206 and 207 may include:
for example, the electronic device calculates 5 initial affine transformation matrices, Z 0 、Z 1 、Z 2 、Z 3 、Z 4 . Then, the electronic device may acquire weights corresponding to the initial affine transformation matrices, and then perform weighted average calculation on the 5 initial affine transformation matrices according to the weights corresponding to the initial affine transformation matrices, thereby calculating to obtain the target affine transformation matrix.
The weights corresponding to the initial affine transformation matrices may be preset. For example, the electronic device may preset the weight corresponding to the first layer image in the image pyramid as W0, the weight of the second layer image as W1, the weight of the third layer image as W2, the weight of the fourth layer image as W3, and the weight of the fifth layer image as W4. The specific values of W0, W1, W2, W3 and W4 may be set as desired; for example, they may be 0.3, 0.25, 0.2, 0.15 and 0.1 in this order, or they may all be 0.2, and so on.
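The weighted-average step can be sketched as follows; the example weights are the 0.3/0.25/0.2/0.15/0.1 values mentioned above, while the normalization step and the function itself are illustrative assumptions rather than details from the patent.

```python
# Illustrative sketch: weighted average of the per-layer initial affine matrices.
import numpy as np

def target_affine(initial_matrices, weights=(0.3, 0.25, 0.2, 0.15, 0.1)):
    weights = np.asarray(weights, dtype=np.float64)
    weights = weights / weights.sum()          # normalise in case weights do not sum to 1
    stacked = np.stack(initial_matrices)       # shape (n_layers, 2, 3)
    return np.tensordot(weights, stacked, axes=1)   # weighted average, shape (2, 3)
```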
208. The electronic device registers the non-reference frame image and the reference frame image according to the target affine transformation matrix.
For example, after calculating the target affine transformation matrix, the electronic device may register the non-reference frame image D and the reference frame image C.
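As an illustration of this registration step, the sketch below warps the non-reference frame with the 2×3 target affine matrix using cv2.warpAffine; the patent does not prescribe a specific warping routine, so this is one possible realisation.

```python
# Illustrative sketch: align the non-reference frame to the reference frame.
import cv2

def register(non_reference, reference, target_matrix):
    h, w = reference.shape[:2]
    aligned = cv2.warpAffine(non_reference, target_matrix, (w, h))
    return aligned   # non-reference frame registered to the reference frame
```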
In the embodiment of the application, the electronic device may calculate the multi-scale images (i.e. the image pyramids) of the reference frame image and the non-reference frame image, and then calculate the multi-scale LBP feature maps corresponding to those multi-scale images. Then, the electronic device may calculate multi-scale affine transformation matrices (i.e. the initial affine transformation matrices) according to the multi-scale LBP feature maps, and calculate a target affine transformation matrix according to the multi-scale affine transformation matrices. The electronic device may then register the reference frame image and the non-reference frame image according to the target affine transformation matrix. Since in this embodiment the target affine transformation matrix is calculated from the LBP feature maps, and the LBP feature maps capture the local texture features of the image, image registration can be performed according to local texture features, which effectively improves the accuracy of registering images with repeated textures.
In another embodiment, in addition to obtaining the first image pyramid and the second image pyramid from the gray level maps of the reference frame image and the non-reference frame image, respectively, the first image pyramid and the second image pyramid may also be obtained from the luminance components of the reference frame image and the non-reference frame image, respectively. That is, after the reference frame image and the non-reference frame image are acquired, the following procedure may be further included:
The electronic device obtains a first luminance component of the reference frame image and a second luminance component of the non-reference frame image.
Then, the electronic device calculates a first image pyramid of the reference frame image and a second image pyramid of the non-reference frame image, which may include: the electronic device calculates a corresponding first image pyramid from the first luminance component and a corresponding second image pyramid from the second luminance component.
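A short sketch of this luminance-component variant follows; extracting the Y channel of a YCrCb conversion is one possible way to obtain the luminance component and is an assumption, since the patent does not fix a colour space, and the pyramid builder referenced in the comments is the illustrative helper from the earlier sketch.

```python
# Illustrative sketch: use the luminance (Y) channel as the pyramid input.
import cv2

def luminance(image_bgr):
    ycrcb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2YCrCb)
    return ycrcb[:, :, 0]   # Y channel = luminance component

# Hypothetical usage with the earlier build_pyramid helper:
# first_pyramid  = build_pyramid(luminance(reference_frame))
# second_pyramid = build_pyramid(luminance(non_reference_frame))
```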
Referring to fig. 5 to fig. 6, fig. 5 to fig. 6 are schematic views of an image processing method according to an embodiment of the present application.
For example, consider a shooting scene with repeated textures, such as when the user being photographed is wearing a plaid shirt or the background of the scene contains a repeating pattern. The user presses the shutter button to shoot, and after receiving the shooting instruction, the electronic device can rapidly capture multiple frames of images of the same scene. For example, the electronic device captures 3 frames of images, namely images E, F and G.
The electronic device may then determine a reference frame image and a non-reference frame image from the 3 frame images. For example, the electronic device may determine the highest-definition image of the images E, F, G as the reference frame image, and the other images as the non-reference frame images. For example, in the present embodiment, the electronic device determines the image E as a reference frame image, and the image F, G is a non-reference frame image.
After determining the reference frame image and the non-reference frame images, the electronic device may calculate the gray-scale maps of images E, F and G respectively; for example, the gray-scale map of image E is E0, that of image F is F0, and that of image G is G0.
Thereafter, the electronic device may calculate the image pyramids of E0, F0 and G0 respectively. For example, the electronic device may calculate an image pyramid for each image with 4 downsamplings, each layer being half the size of the previous one. For example, the layers of the image pyramid corresponding to image E0 are, in order, E0, E1, E2, E3, E4; the layers of the pyramid corresponding to image F0 are F0, F1, F2, F3, F4; and the layers of the pyramid corresponding to image G0 are G0, G1, G2, G3, G4.
Thereafter, the electronic device may calculate the LBP feature map of each layer of image in the image pyramid corresponding to image E0, the LBP feature map of each layer of image in the image pyramid corresponding to image F0, and the LBP feature map of each layer of image in the image pyramid corresponding to image G0.
The electronic device can then, according to the LBP feature map of each layer of image in the image pyramid corresponding to image E0 and the LBP feature map of each layer of image in the image pyramid corresponding to image F0, calculate the affine transformation matrix corresponding to each layer of image in the manner provided in the embodiments of the present application, so as to obtain a plurality of affine transformation matrices. Then, the electronic device may calculate a target affine transformation matrix X from these affine transformation matrices in the manner provided in the embodiments of the present application, and register and align image F with image E according to the affine transformation matrix X, as shown in fig. 5.
Similarly, according to the LBP feature map of each layer of image in the image pyramid corresponding to image E0 and the LBP feature map of each layer of image in the image pyramid corresponding to image G0, the electronic device can calculate the affine transformation matrix corresponding to each layer of image in the manner provided in the embodiments of the present application, so as to obtain a plurality of affine transformation matrices. Then, the electronic device may calculate a target affine transformation matrix V from these affine transformation matrices in the manner provided in the embodiments of the present application, and register and align image G with image E according to the affine transformation matrix V, as shown in fig. 5.
After the images are aligned, the electronic device may perform image fusion on the image E, F, G to obtain the final photograph, as shown in fig. 6.
In some embodiments, images E, F and G can be images with different exposure levels; for example, image F is an overexposed image, image G is an underexposed image, and image E is a normally exposed image. The image fusion of images E, F and G by the electronic device can then be an HDR (high dynamic range) fusion, producing a photograph with a high-dynamic-range effect.
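The patent only states that an HDR fusion is performed on the aligned frames and does not name a fusion algorithm; as one hedged illustration, the sketch below uses OpenCV's Mertens exposure fusion, which is not necessarily the fusion method intended by the patent.

```python
# Illustrative sketch: fuse aligned frames with different exposures.
import cv2
import numpy as np

def fuse_exposures(aligned_frames):
    merge = cv2.createMergeMertens()
    fused = merge.process(aligned_frames)          # float image roughly in [0, 1]
    return np.clip(fused * 255, 0, 255).astype(np.uint8)
```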
Referring to fig. 7, fig. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application. The image processing apparatus 300 may include: an acquisition module 301, a first calculation module 302, a second calculation module 303, a third calculation module 304, a fourth calculation module 305, a registration module 306.
The acquiring module 301 is configured to acquire a reference frame image and a non-reference frame image.
A first calculation module 302 is configured to calculate a first image pyramid of the reference frame image and a second image pyramid of the non-reference frame image.
A second calculation module 303, configured to calculate an LBP feature map of each layer of images in the first image pyramid and the second image pyramid.
The third calculation module 304 is configured to calculate an initial affine transformation matrix corresponding to each layer of image according to the LBP feature map of each layer of image in the first image pyramid and the LBP feature map of the image of the corresponding layer in the second image pyramid, so as to obtain a plurality of initial affine transformation matrices.
A fourth calculation module 305 is configured to calculate a target affine transformation matrix according to the plurality of initial affine transformation matrices.
A registration module 306, configured to register the non-reference frame image and the reference frame image according to the target affine transformation matrix.
In one embodiment, the fourth computing module 305 may be configured to:
acquiring weights corresponding to the initial affine transformation matrixes;
and carrying out weighted average calculation on the plurality of initial affine transformation matrixes according to the weight corresponding to each initial affine transformation matrix, and calculating to obtain a target affine transformation matrix.
In one embodiment, the third computing module 304 may be configured to:
performing preset processing on the two frames of images of each corresponding layer in the first image pyramid and the second image pyramid to obtain a plurality of initial affine transformation matrices, where the two frames of images of each corresponding layer in the first image pyramid and the second image pyramid are denoted as Pi and Qi respectively, i is a positive integer less than or equal to n, and n is the number of layers of the image pyramid, and the preset processing includes: for images Pi and Qi, determining a plurality of image blocks in the LBP feature map corresponding to each image, and calculating the LBP feature vector of each block according to the LBP feature map of the image; matching the LBP feature vector S0 of each block in the image Qi of the second image pyramid with the LBP feature vector R0 of the block at the corresponding position in the image Pi of the first image pyramid and with the feature vectors of the neighborhood of that corresponding position, and determining the feature vector with the highest matching degree with S0 as the matching feature vector of S0; after the LBP feature vectors of all blocks in the image Qi have been assigned corresponding matching feature vectors, sorting the matching degrees of all matching feature vector pairs, and determining the matching feature vector pairs ranked within a preset order as target feature vector pairs; extracting, in the images Pi and Qi, the features at the positions corresponding to the target feature vector pairs, and calculating a corresponding initial affine transformation matrix based on the extracted features.
In one embodiment, the acquisition module 301 may also be configured to: a first gray scale image of the reference frame image and a second gray scale image of the non-reference frame image are acquired.
Then, the first computing module 302 may be configured to: a first image pyramid of the first gray scale map and a second image pyramid of the second gray scale map are calculated.
In one embodiment, the acquisition module 301 may also be configured to: a first luminance component of the reference frame image and a second luminance component of the non-reference frame image are acquired.
Then, the first computing module 302 may be configured to: and calculating a corresponding first image pyramid according to the first brightness component, and calculating a corresponding second image pyramid according to the second brightness component.
In one embodiment, the first computing module 302 may be configured to: and calculating a first image pyramid of the reference frame image and a second image pyramid of the non-reference frame image according to the preset downsampling times and multiples.
In one embodiment, the second computing module 303 may be configured to: and calculating the LBP characteristic map of each layer of image in the first image pyramid and the second image pyramid by using a circular LBP characteristic operator.
The present embodiment provides a computer-readable storage medium having stored thereon a computer program which, when executed on a computer, causes the computer to execute a flow in an image processing method as provided in the present embodiment.
The embodiment of the application also provides electronic equipment, which comprises a memory and a processor, wherein the processor is used for executing the flow in the image processing method provided by the embodiment by calling the computer program stored in the memory.
For example, the electronic device may be a mobile terminal such as a tablet computer or a smart phone. Referring to fig. 8, fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
The electronic device 400 may include a camera module 401, a memory 402, a processor 403, and the like. It will be appreciated by those skilled in the art that the electronic device structure shown in fig. 8 is not limiting of the electronic device and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
The camera module 401 may include a lens and an image sensor, wherein the lens is used to collect an external light source signal and provide the light source signal to the image sensor, and the image sensor senses the light source signal from the lens and converts the light source signal into digitized RAW image data, i.e., RAW image data. RAW is an unprocessed, also uncompressed format, which can be visually referred to as a "digital negative". The camera module 401 may include one camera or two or more cameras.
Memory 402 may be used to store applications and data. The memory 402 stores application programs including executable code. Applications may constitute various functional modules. Processor 403 executes various functional applications and data processing by running application programs stored in memory 402.
The processor 403 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing application programs stored in the memory 402 and calling data stored in the memory 402, thereby performing overall monitoring of the electronic device.
In this embodiment, the processor 403 in the electronic device loads executable codes corresponding to the processes of one or more application programs into the memory 402 according to the following instructions, and the processor 403 executes the application programs stored in the memory 402, so as to execute:
Acquiring a reference frame image and a non-reference frame image;
calculating a first image pyramid of the reference frame image and a second image pyramid of the non-reference frame image;
calculating LBP feature maps of each layer of images in the first image pyramid and the second image pyramid;
calculating an initial affine transformation matrix corresponding to each layer of image according to the LBP feature map of each layer of image in the first image pyramid and the LBP feature map of the image of the corresponding layer in the second image pyramid, so as to obtain a plurality of initial affine transformation matrices;
calculating a target affine transformation matrix according to the initial affine transformation matrices;
registering the non-reference frame image and the reference frame image according to the target affine transformation matrix.
Referring to fig. 9, the electronic device 400 may include a camera module 401, a memory 402, a processor 403, an input unit 404, an output unit 405, a speaker 406, and the like.
The camera module 401 may include a lens and an image sensor, wherein the lens is used to collect an external light source signal and provide the light source signal to the image sensor, and the image sensor senses the light source signal from the lens and converts the light source signal into digitized RAW image data, i.e., RAW image data. RAW is an unprocessed, also uncompressed format, which can be visually referred to as a "digital negative". The camera module 401 may include one camera or two or more cameras.
Memory 402 may be used to store applications and data. The memory 402 stores application programs including executable code. Applications may constitute various functional modules. Processor 403 executes various functional applications and data processing by running application programs stored in memory 402.
The processor 403 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing application programs stored in the memory 402 and calling data stored in the memory 402, thereby performing overall monitoring of the electronic device.
The input unit 404 may be used to receive input numbers, character information, or user characteristic information (such as a fingerprint), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The output unit 405 may be used to display information input by a user or information provided to a user and various graphical user interfaces of an electronic device, which may be composed of graphics, text, icons, video, and any combination thereof. The output unit may include a display panel.
In this embodiment, the processor 403 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 402 according to the following instructions, and the processor 403 runs the application programs stored in the memory 402 so as to execute:
acquiring a reference frame image and a non-reference frame image;
calculating a first image pyramid of the reference frame image and a second image pyramid of the non-reference frame image;
calculating LBP feature maps of each layer of images in the first image pyramid and the second image pyramid;
calculating an initial affine transformation matrix corresponding to each layer of image according to the LBP feature map of each layer of image in the first image pyramid and the LBP feature map of the image of the corresponding layer in the second image pyramid, so as to obtain a plurality of initial affine transformation matrices;
calculating a target affine transformation matrix according to the plurality of initial affine transformation matrices;
registering the non-reference frame image and the reference frame image according to the target affine transformation matrix.
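As an illustration only, and not the claimed implementation itself, the following Python sketch (assuming OpenCV and NumPy) strings the above operations together: two pyramids are built from grayscale versions of the frames, one initial affine matrix is estimated per layer, the matrices are fused by a weighted average, and the non-reference frame is warped onto the reference frame. The names build_pyramid, register and estimate_layer_affine are illustrative; estimate_layer_affine stands in for the LBP-based per-layer estimation described in the embodiments, and the equal weights are a placeholder.

import cv2
import numpy as np

def build_pyramid(gray, levels=3, scale=0.5):
    # Downsample (levels - 1) times by `scale` to form an image pyramid.
    pyramid = [gray]
    for _ in range(levels - 1):
        h, w = pyramid[-1].shape[:2]
        pyramid.append(cv2.resize(pyramid[-1], (int(w * scale), int(h * scale)),
                                  interpolation=cv2.INTER_AREA))
    return pyramid

def register(non_ref_bgr, ref_bgr, estimate_layer_affine, levels=3):
    # Estimate one 2x3 affine per pyramid layer, fuse them, then warp the non-reference frame.
    ref_gray = cv2.cvtColor(ref_bgr, cv2.COLOR_BGR2GRAY)
    non_ref_gray = cv2.cvtColor(non_ref_bgr, cv2.COLOR_BGR2GRAY)
    pyr_ref = build_pyramid(ref_gray, levels)
    pyr_non_ref = build_pyramid(non_ref_gray, levels)
    # estimate_layer_affine(Q_i, P_i) is assumed to return a 2x3 affine already expressed
    # in full-resolution coordinates (e.g. estimated from the LBP feature maps of layer i).
    matrices = [estimate_layer_affine(q, p) for q, p in zip(pyr_non_ref, pyr_ref)]
    weights = np.full(len(matrices), 1.0 / len(matrices))   # placeholder: equal weights
    target = sum(w * np.asarray(m, dtype=np.float64) for w, m in zip(weights, matrices))
    h, w = ref_gray.shape
    return cv2.warpAffine(non_ref_bgr, target.astype(np.float32), (w, h))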
In one embodiment, when calculating the target affine transformation matrix from the plurality of initial affine transformation matrices, the processor 403 may execute: acquiring the weight corresponding to each initial affine transformation matrix; and performing a weighted average calculation on the plurality of initial affine transformation matrices according to the weight corresponding to each initial affine transformation matrix, so as to obtain the target affine transformation matrix.
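A minimal sketch of this weighted average step, assuming NumPy and 2x3 affine matrices, is given below. Rescaling the translation column by the layer scale is an assumption added here so that matrices estimated on downsampled layers are expressed in full-resolution coordinates before averaging; the embodiment itself only specifies the weighted average according to the acquired weights.

import numpy as np

def fuse_affines(matrices, weights, layer_scales=None):
    # Weighted average of per-layer 2x3 affine matrices.
    matrices = [np.asarray(m, dtype=np.float64).copy() for m in matrices]
    if layer_scales is not None:
        # Assumption: translations estimated on a downsampled layer are rescaled
        # to full-resolution pixels before averaging (layer_scales[i] = 1.0, 0.5, 0.25, ...).
        for m, s in zip(matrices, layer_scales):
            m[:, 2] /= s
    w = np.asarray(weights, dtype=np.float64)
    w = w / w.sum()                          # normalize so the result is a weighted average
    return sum(wi * m for wi, m in zip(w, matrices))

# Example: three layers at full/half/quarter resolution, more weight on the finest layer.
# target = fuse_affines([m0, m1, m2], weights=[0.5, 0.3, 0.2], layer_scales=[1.0, 0.5, 0.25])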
In one embodiment, when calculating an initial affine transformation matrix corresponding to each layer of image according to the LBP feature map of each layer of image in the first image pyramid and the LBP feature map of the image of the corresponding layer in the second image pyramid, so as to obtain a plurality of initial affine transformation matrices, the processor 403 may execute: performing preset processing on the two frames of images of each corresponding layer in the first image pyramid and the second image pyramid to obtain a plurality of initial affine transformation matrices, where the two frames of images of each corresponding layer are denoted as P_i and Q_i respectively, i is a positive integer less than or equal to n, and n is the number of layers of the image pyramid. The preset processing includes: determining a plurality of image blocks in the LBP feature maps corresponding to images P_i and Q_i, and calculating the LBP feature vector of each image block according to the LBP feature map of the image; matching the LBP feature vector S_0 of each image block of image Q_i in the second image pyramid respectively with the LBP feature vector R_0 of the image block at the corresponding position in image P_i in the first image pyramid and with the feature vectors of the neighborhood of that corresponding position, and determining the feature vector with the highest matching degree with S_0 as the matching feature vector of S_0; after matching feature vectors have been determined for the LBP feature vectors of all image blocks in image Q_i, sorting all matching feature vector pairs by matching degree, and determining the matching feature vector pairs ranked within a preset order as target feature vector pairs; and extracting features at the positions of images P_i and Q_i corresponding to the target feature vector pairs, and calculating a corresponding initial affine transformation matrix based on the extracted features.
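The preset processing above can be sketched as follows, again as an illustration rather than the claimed implementation. Each block's LBP feature vector is taken to be its LBP histogram, each block of Q_i is compared with the co-located block of P_i and with blocks shifted around that position, the pairs are sorted by matching degree and the top-ranked ones kept, and an affine is fitted with cv2.estimateAffine2D from the centers of the kept blocks. Using block centers directly as correspondences is a simplification of the feature extraction step described above, and the block size, search radius, histogram comparison metric and number of pairs kept are illustrative values.

import cv2
import numpy as np

def lbp_block_matching_affine(lbp_q, lbp_p, block=32, search=16, keep=20):
    # lbp_q, lbp_p: LBP feature maps of Q_i (non-reference) and P_i (reference) for one layer.
    def hist(patch):
        h = np.bincount(patch.ravel().astype(np.int64), minlength=256).astype(np.float32)
        return h / (h.sum() + 1e-12)

    rows, cols = lbp_q.shape[0] // block, lbp_q.shape[1] // block
    pairs = []  # (matching degree, block center in Q_i, center of best match in P_i)
    for r in range(rows):
        for c in range(cols):
            y, x = r * block, c * block
            s0 = hist(lbp_q[y:y + block, x:x + block])           # feature vector S_0
            best = None
            # Compare with the co-located block of P_i and blocks shifted by +/- search pixels.
            for dy in (-search, 0, search):
                for dx in (-search, 0, search):
                    yy, xx = y + dy, x + dx
                    if yy < 0 or xx < 0 or yy + block > lbp_p.shape[0] or xx + block > lbp_p.shape[1]:
                        continue
                    r0 = hist(lbp_p[yy:yy + block, xx:xx + block])  # R_0 or a neighbourhood vector
                    score = cv2.compareHist(s0, r0, cv2.HISTCMP_CORREL)
                    if best is None or score > best[0]:
                        best = (score, (xx + block / 2.0, yy + block / 2.0))
            if best is not None:
                pairs.append((best[0], (x + block / 2.0, y + block / 2.0), best[1]))

    pairs.sort(key=lambda t: t[0], reverse=True)    # sort by matching degree
    pairs = pairs[:keep]                            # target feature vector pairs
    src = np.float32([p[1] for p in pairs])         # positions in Q_i
    dst = np.float32([p[2] for p in pairs])         # positions in P_i
    matrix, _ = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC)
    return matrix                                   # 2x3 initial affine transformation matrix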
In one embodiment, after acquiring the reference frame image and the non-reference frame image, the processor 403 may further execute: acquiring a first gray scale image of the reference frame image and a second gray scale image of the non-reference frame image.
Then, when calculating the first image pyramid of the reference frame image and the second image pyramid of the non-reference frame image, the processor 403 may execute: calculating a first image pyramid of the first gray scale image and a second image pyramid of the second gray scale image.
In one embodiment, after acquiring the reference frame image and the non-reference frame image, the processor 403 may further execute: acquiring a first luminance component of the reference frame image and a second luminance component of the non-reference frame image.
Then, when calculating the first image pyramid of the reference frame image and the second image pyramid of the non-reference frame image, the processor 403 may execute: calculating a corresponding first image pyramid according to the first luminance component, and calculating a corresponding second image pyramid according to the second luminance component.
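For instance, assuming BGR input frames, the luminance components could be obtained and fed to the pyramids as sketched below, reusing build_pyramid from the earlier sketch; if the camera module already delivers YUV data, the Y plane can be used directly.

import cv2

# Only the luminance (Y) channel of each frame feeds the pyramids.
ref_y = cv2.cvtColor(ref_bgr, cv2.COLOR_BGR2YCrCb)[:, :, 0]
non_ref_y = cv2.cvtColor(non_ref_bgr, cv2.COLOR_BGR2YCrCb)[:, :, 0]
pyr_ref = build_pyramid(ref_y)
pyr_non_ref = build_pyramid(non_ref_y)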
In one embodiment, when calculating the first image pyramid of the reference frame image and the second image pyramid of the non-reference frame image, the processor 403 may execute: calculating a first image pyramid of the reference frame image and a second image pyramid of the non-reference frame image according to a preset number of downsampling times and a preset downsampling multiple.
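A sketch of pyramid construction with a preset number of downsampling times and a preset multiple is given below; the defaults of three downsamplings by a factor of two are illustrative, since the embodiment leaves both parameters open.

import cv2

def build_pyramid_preset(image, times=3, multiple=2):
    # `times` downsamplings by a factor of `multiple`; both values are illustrative defaults.
    pyramid = [image]
    for _ in range(times):
        prev = pyramid[-1]
        if multiple == 2:
            pyramid.append(cv2.pyrDown(prev))       # Gaussian blur + factor-of-2 downsampling
        else:
            h, w = prev.shape[:2]
            pyramid.append(cv2.resize(prev, (max(1, w // multiple), max(1, h // multiple)),
                                      interpolation=cv2.INTER_AREA))
    return pyramid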
In one embodiment, when calculating the LBP feature map of each layer of image in the first image pyramid and the second image pyramid, the processor 403 may execute: calculating the LBP feature map of each layer of image in the first image pyramid and the second image pyramid by using a circular LBP feature operator.
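For example, the circular LBP feature map of one pyramid layer could be computed with scikit-image's local_binary_pattern, as sketched below; the radius and the number of sampling points are illustrative choices not fixed by the embodiments.

import numpy as np
from skimage.feature import local_binary_pattern

def circular_lbp_map(gray, radius=1, points=8):
    # Circular LBP feature map of one pyramid layer; radius/points are illustrative.
    lbp = local_binary_pattern(gray.astype(np.float64), P=points, R=radius, method="default")
    return lbp.astype(np.uint8)    # LBP codes fit in 8 bits when points == 8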
Each of the foregoing embodiments has its own emphasis. For the parts of an embodiment that are not described in detail, reference may be made to the detailed description of the image processing method above, which is not repeated here.
The image processing device provided in the embodiments of the present application and the image processing method in the foregoing embodiments belong to the same concept. Any method provided in the embodiments of the image processing method may be run on the image processing device, and its specific implementation process is detailed in the embodiments of the image processing method, which is not described here again.
It should be noted that, for the image processing method of the embodiments of the present application, those skilled in the art will understand that all or part of the flow of implementing the image processing method may be completed by controlling related hardware through a computer program. The computer program may be stored in a computer-readable storage medium, such as a memory, and executed by at least one processor, and its execution may include the flow of the embodiments of the image processing method. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
For the image processing apparatus of the embodiment of the present application, each functional module may be integrated in one processing chip, or each module may exist alone physically, or two or more modules may be integrated in one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated module, if implemented as a software functional module and sold or used as a stand-alone product, may also be stored on a computer readable storage medium such as read-only memory, magnetic or optical disk, etc.
The image processing method, apparatus, storage medium and electronic device provided in the embodiments of the present application are described in detail above. Specific examples are used herein to illustrate the principles and implementations of the present application, and the above description of the embodiments is only intended to help understand the method and its core idea. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope in light of the ideas of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (9)

1. An image processing method, comprising:
acquiring a reference frame image and a non-reference frame image;
calculating a first image pyramid of the reference frame image and a second image pyramid of the non-reference frame image;
calculating LBP feature maps of each layer of images in the first image pyramid and the second image pyramid;
calculating an initial affine transformation matrix corresponding to each layer of image according to the LBP feature map of each layer of image in the first image pyramid and the LBP feature map of the image of the corresponding layer in the second image pyramid, so as to obtain a plurality of initial affine transformation matrices;
acquiring weights corresponding to the initial affine transformation matrices;
carrying out weighted average calculation on the plurality of initial affine transformation matrices according to the weight corresponding to each initial affine transformation matrix, so as to obtain a target affine transformation matrix;
registering the non-reference frame image and the reference frame image according to the target affine transformation matrix.
2. The image processing method according to claim 1, wherein the calculating an initial affine transformation matrix corresponding to each layer of image according to the LBP feature map of each layer of image in the first image pyramid and the LBP feature map of the image of the corresponding layer in the second image pyramid to obtain a plurality of initial affine transformation matrices includes:
performing preset processing on two frames of images of each corresponding layer in the first image pyramid and the second image pyramid to obtain a plurality of initial affine transformation matrices, wherein the two frames of images of each corresponding layer in the first image pyramid and the second image pyramid are denoted as P_i and Q_i respectively, i is a positive integer less than or equal to n, n is the number of layers of the image pyramid, and the preset processing includes:

determining a plurality of image blocks in the LBP feature maps corresponding to images P_i and Q_i, and calculating LBP feature vectors of the image blocks according to the LBP feature maps of the images;

matching the LBP feature vector S_0 of each image block of image Q_i in the second image pyramid respectively with the LBP feature vector R_0 of the image block at the corresponding position in image P_i in the first image pyramid and with the feature vectors of the neighborhood of that corresponding position, and determining the feature vector with the highest matching degree with S_0 as the matching feature vector of S_0;

after matching feature vectors have been determined for the LBP feature vectors of all image blocks in image Q_i, sorting all matching feature vector pairs by matching degree, and determining the matching feature vector pairs ranked within a preset order as target feature vector pairs; and

extracting features at the positions of images P_i and Q_i corresponding to the target feature vector pairs, and calculating a corresponding initial affine transformation matrix based on the extracted features.
3. The image processing method according to claim 1, further comprising, after the acquisition of the reference frame image and the non-reference frame image: acquiring a first gray scale image of the reference frame image and a second gray scale image of the non-reference frame image;
the computing a first image pyramid of the reference frame image and a second image pyramid of the non-reference frame image, comprising: calculating a first image pyramid of the first gray scale image and a second image pyramid of the second gray scale image.
4. The image processing method according to claim 1, further comprising, after the acquisition of the reference frame image and the non-reference frame image: acquiring a first brightness component of the reference frame image and a second brightness component of the non-reference frame image;
the computing a first image pyramid of the reference frame image and a second image pyramid of the non-reference frame image, comprising: calculating a corresponding first image pyramid according to the first brightness component, and calculating a corresponding second image pyramid according to the second brightness component.
5. The image processing method according to claim 1, wherein said calculating a first image pyramid of said reference frame image and a second image pyramid of said non-reference frame image comprises:
calculating a first image pyramid of the reference frame image and a second image pyramid of the non-reference frame image according to a preset number of downsampling times and a preset downsampling multiple.
6. The image processing method according to claim 1, wherein the calculating the LBP feature map of each layer of the image in the first image pyramid and the second image pyramid includes:
calculating the LBP feature map of each layer of image in the first image pyramid and the second image pyramid by using a circular LBP feature operator.
7. An image processing apparatus, comprising:
the acquisition module is used for acquiring a reference frame image and a non-reference frame image;
a first calculation module for calculating a first image pyramid of the reference frame image and a second image pyramid of the non-reference frame image;
the second calculation module is used for calculating LBP feature maps of each layer of images in the first image pyramid and the second image pyramid;

the third calculation module is used for calculating an initial affine transformation matrix corresponding to each layer of image according to the LBP feature map of each layer of image in the first image pyramid and the LBP feature map of the image of the corresponding layer in the second image pyramid, so as to obtain a plurality of initial affine transformation matrices;
a fourth calculation module, configured to obtain weights corresponding to the initial affine transformation matrices, perform weighted average calculation on the multiple initial affine transformation matrices according to the weights corresponding to the initial affine transformation matrices, and calculate to obtain a target affine transformation matrix;
and the registration module is used for registering the non-reference frame image and the reference frame image according to the target affine transformation matrix.
8. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when executed on a computer, causes the computer to perform the method according to any one of claims 1 to 6.
9. An electronic device comprising a memory and a processor, characterized in that the processor is adapted to perform the method according to any of claims 1-6 by invoking a computer program stored in the memory.
CN201911253048.2A 2019-12-09 2019-12-09 Image processing method, device, storage medium and electronic equipment Active CN111080683B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911253048.2A CN111080683B (en) 2019-12-09 2019-12-09 Image processing method, device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911253048.2A CN111080683B (en) 2019-12-09 2019-12-09 Image processing method, device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN111080683A CN111080683A (en) 2020-04-28
CN111080683B true CN111080683B (en) 2023-06-02

Family

ID=70313444

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911253048.2A Active CN111080683B (en) 2019-12-09 2019-12-09 Image processing method, device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111080683B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113128554B (en) * 2021-03-10 2022-05-24 广州大学 Target positioning method, system, device and medium based on template matching
CN116188808B (en) * 2023-04-25 2023-07-25 青岛尘元科技信息有限公司 Image feature extraction method and system, storage medium and electronic device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104933434A (en) * 2015-06-16 2015-09-23 Tongji University Image matching method combining local binary pattern (LBP) feature extraction and SURF feature extraction
CN109215077A (en) * 2017-07-07 2019-01-15 Tencent Technology (Shenzhen) Co., Ltd. Method and related apparatus for determining camera posture information

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7623683B2 (en) * 2006-04-13 2009-11-24 Hewlett-Packard Development Company, L.P. Combining multiple exposure images to increase dynamic range

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104933434A (en) * 2015-06-16 2015-09-23 Tongji University Image matching method combining local binary pattern (LBP) feature extraction and SURF feature extraction
CN109215077A (en) * 2017-07-07 2019-01-15 Tencent Technology (Shenzhen) Co., Ltd. Method and related apparatus for determining camera posture information

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zhu Yinghong; Li Junshan; Guo Lisha; Yu Ning. Description and matching algorithm of scale-invariant features based on LBP. Journal of Computer-Aided Design & Computer Graphics, 2011, (10), pp. 1758-1763. *
Xue Yanhua et al. Remote sensing image matching based on an improved ASIFT algorithm. Journal of Sichuan University (Natural Science Edition), 2013, Vol. 50 (50), pp. 999-1005. *

Also Published As

Publication number Publication date
CN111080683A (en) 2020-04-28


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant