CN117094965A - Lens picture quality analysis method and system based on image recognition algorithm - Google Patents
Lens picture quality analysis method and system based on image recognition algorithm
- Publication number
- CN117094965A (application CN202311050924.8A)
- Authority
- CN
- China
- Prior art keywords
- image
- point set
- feature point
- calculating
- points
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
Abstract
The embodiment of the invention provides a lens picture quality analysis method and system based on an image recognition algorithm, belonging to the technical field of image processing. The method comprises the following steps: acquiring a first image P1, a second image P2 and a third image P3 shot by an image capturing device at times t-Δt, t and t+Δt; performing fast feature point calculation on the first image P1 and the second image P2; connecting the feature points in the background feature point set S0 according to a minimum approach principle to form n line segments; and constructing an evaluation function F(L, g(J, P3), U1, U2) to obtain an evaluation result for the picture shot by the image capturing device, where g(J) is a calculation function of the position matrix J.
Description
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a lens image quality analysis method and system based on an image recognition algorithm.
Background
The video monitoring system comprises front-end cameras, transmission cables and a video monitoring platform. Cameras can be divided into network digital cameras and analog cameras, and are used for collecting front-end video image signals. A complete video monitoring system consists of five parts: capture, transmission, control, display and recording. The camera transmits video images to the control host through a network cable or a coaxial video cable; the control host distributes the video signals to the monitors and video equipment, and audio signals to be transmitted can be synchronously recorded into the video recorder. Through the control host, an operator can issue instructions to control the up, down, left and right movements of the pan-tilt head, perform focusing and zooming operations on the lens, and switch among multiple cameras through a video matrix. Dedicated video processing modes are used for operations such as recording, playback, retrieval and storage of the images.
In video monitoring, the quality of the image shot by the camera directly determines the quality of the monitoring: for example, when the monitored image is blurred or blocked, contains noise, shows abnormal colors, or is excessively bright or dark, monitoring efficiency is reduced or monitoring even becomes impossible. How to automatically and accurately evaluate the quality of the pictures shot by the camera is therefore a problem to be solved.
Disclosure of Invention
In view of the above, embodiments of the present invention provide a lens image quality analysis method and system based on an image recognition algorithm, which at least partially solve the problems existing in the prior art.
In a first aspect, an embodiment of the present invention provides a lens image quality analysis method based on an image recognition algorithm, including:
acquiring a first image P1, a second image P2 and a third image P3 shot by an image shooting device at times t-Δt, t and t+Δt;
performing fast feature point calculation on the first image P1 and the second image P2 to obtain a first feature point set S1 and a second feature point set S2, and determining a background feature point set S0 common to the first image P1 and the second image P2, a foreground feature point set S11 of the first image P1 and a foreground feature point set S21 of the second image P2 by comparing the positions of feature points in the first feature point set S1 and the second feature point set S2;
connecting the feature points in the background feature point set S0 according to a minimum approach principle to form n line segments, determining the length L of the longest of the n line segments, dividing the n line segments m-1 times to obtain n×m equal division points, the position coordinates of the n×m equal division points being used to form an n×m position matrix J, performing feature-point curve closed connection on the foreground feature point set S11 and selecting the closed graph U1 with the largest area from the obtained closed graphs, and performing feature-point curve closed connection on the foreground feature point set S21 and selecting the closed graph U2 with the largest area from the obtained closed graphs;
an evaluation function F(L, g(J, P3), U1, U2) is constructed to obtain an evaluation result for the picture shot by the image capturing device, g(J) being a calculation function of the position matrix J.
According to a specific implementation manner of the embodiment of the present disclosure, before the acquiring of the first image P1, the second image P2 and the third image P3 shot by the image capturing device at time t-Δt, time t and time t+Δt, the method further includes:
acquiring the frame frequency of an image shot by an image shooting device;
based on the frame frequency, a value of Δt is determined.
According to a specific implementation manner of the embodiment of the present disclosure, before the acquiring of the first image P1, the second image P2 and the third image P3 shot by the image capturing device at time t-Δt, time t and time t+Δt, the method further includes:
acquiring any adjacent first video frame and second video frame shot by an image shooting device;
calculating a first gray level histogram and a second gray level histogram corresponding to the first video frame and the second video frame;
according to the similarity value α of the first gray level histogram and the second gray level histogram and the frame rate f of the video shot by the image shooting device, determining the value of Δt as: Δt = γ(1-α)/f, γ being an adjustment parameter with a value greater than zero.
According to a specific implementation manner of the embodiment of the present disclosure, before the acquiring of the first image P1, the second image P2 and the third image P3 shot by the image capturing device at time t-Δt, time t and time t+Δt, the method further includes:
graying processing is carried out on the first video frame and the second video frame, and a first gray image and a second gray image are obtained;
determining a first pixel matrix M1 and a second pixel matrix M2 corresponding to the first gray level image and the second gray level image;
when the difference between the characteristic values of the first and second pixel matrices M1 and M2 is smaller than a preset value, the value of γ is increased so as to determine the value of Δt using γ after the increase.
According to a specific implementation manner of the embodiment of the present disclosure, the construction of the evaluation function F (L, g (J, P3), U1, U2) to obtain an evaluation result for a shot picture of the image capturing device includes:
calculating, by using the function g(J, P3), image characteristics corresponding to the image points of the third image at the position matrix J, the image characteristics comprising image brightness, image sharpness, image color cast and image noise value, so as to obtain an evaluation value ε.
According to a specific implementation manner of the embodiment of the disclosure, the constructing the evaluation function F (L, g (J, P3), U1, U2) to obtain an evaluation result for a shot picture of the image capturing apparatus further includes:
calculating the area ratio of the closed graph U1 to the first image P1 to obtain ξ1;

calculating the area ratio of the closed graph U2 to the second image P2 to obtain ξ2;

and then computing the evaluation result from ε, ξ1, ξ2, L and D, where D is the length of the diagonal of the first image P1 and the second image P2.
According to a specific implementation manner of the embodiment of the present disclosure, the calculating, using the function g (J, P3), an image characteristic corresponding to an image point of the third image at the position matrix J includes:
calculating the variance of the pixels corresponding to the image points of the third image at the position matrix J, and determining a brightness evaluation value ε1 of the third image;

calculating the Laplacian value of the pixels corresponding to the image points of the third image at the position matrix J, and determining a sharpness evaluation value ε2 of the third image;

calculating the LAB color values of the pixels corresponding to the image points of the third image at the position matrix J, and determining a color cast evaluation value ε3 of the third image.
According to a specific implementation manner of the embodiment of the present disclosure, the calculating, using the function g (J, P3), an image characteristic corresponding to an image point of the third image at the position matrix J further includes:
calculating the Sobel gradient values of the pixels corresponding to the image points of the third image at the position matrix J, and determining a noise evaluation value ε4 of the third image.
According to a specific implementation of an embodiment of the disclosure, calculating, using the function g (J, P3), an image characteristic corresponding to an image point of the third image at the position matrix J, further includes:
calculating the evaluation value by the formula ε = τ1·ε1 + τ2·ε2 + τ3·ε3 + τ4·ε4, where τ1, τ2, τ3 and τ4 are adjustment coefficients.
In a second aspect, an embodiment of the present invention provides a lens image quality analysis system based on an image recognition algorithm, including:
the acquisition module is used for acquiring a first image P1, a second image P2 and a third image P3 shot by the image shooting device at time t-Δt, time t and time t+Δt;
the computing module is used for performing fast feature point computation on the first image P1 and the second image P2 to obtain a first feature point set S1 and a second feature point set S2, and determining a background feature point set S0 common to the first image P1 and the second image P2, a foreground feature point set S11 of the first image P1 and a foreground feature point set S21 of the second image P2 by comparing the positions of feature points in the first feature point set S1 and the second feature point set S2;
the determining module is used for connecting the feature points in the background feature point set S0 according to the minimum approach principle to form n line segments, determining the length L of the longest of the n line segments, dividing the n line segments m-1 times to obtain n×m equal division points, the position coordinates of the n×m equal division points being used to form an n×m position matrix J, performing feature-point curve closed connection on the foreground feature point set S11 and selecting the closed graph U1 with the largest area from the obtained closed graphs, and performing feature-point curve closed connection on the foreground feature point set S21 and selecting the closed graph U2 with the largest area from the obtained closed graphs;
and the evaluation module is used for constructing an evaluation function F(L, g(J, P3), U1, U2) so as to obtain an evaluation result for the picture shot by the image shooting device, wherein g(J) is a calculation function of the position matrix J.
In a third aspect, an embodiment of the present invention further provides an electronic device, including:
at least one processor; the method comprises the steps of,
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the lens picture quality analysis method based on the image recognition algorithm in the foregoing first aspect or any implementation manner of the first aspect.
In a fourth aspect, embodiments of the present invention further provide a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the lens picture quality analysis method based on the image recognition algorithm in the foregoing first aspect or any implementation manner of the first aspect.
In a fifth aspect, embodiments of the present invention also provide a computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the lens picture quality analysis method based on the image recognition algorithm in any one of the implementations of the foregoing first aspect or the first aspect.
The lens picture quality analysis scheme based on the image recognition algorithm in the embodiment of the invention includes: acquiring a first image P1, a second image P2 and a third image P3 shot by an image capturing device at times t-Δt, t and t+Δt; performing fast feature point calculation on the first image P1 and the second image P2 to obtain a first feature point set S1 and a second feature point set S2, and determining a background feature point set S0 common to the first image P1 and the second image P2, a foreground feature point set S11 of the first image P1 and a foreground feature point set S21 of the second image P2 by comparing the positions of the feature points in the first feature point set S1 and the second feature point set S2; connecting the feature points in the background feature point set S0 according to a minimum approach principle to form n line segments, determining the length L of the longest of the n line segments, dividing the n line segments m-1 times to obtain n×m equal division points whose position coordinates form an n×m position matrix J, performing feature-point curve closed connection on the foreground feature point set S11 and selecting the closed graph U1 with the largest area from the obtained closed graphs, and performing feature-point curve closed connection on the foreground feature point set S21 and selecting the closed graph U2 with the largest area; and constructing an evaluation function F(L, g(J, P3), U1, U2) to obtain an evaluation result for the picture shot by the image capturing device, g(J) being a calculation function of the position matrix J. With this scheme, the quality of the picture shot by the image capturing device can be evaluated automatically and comprehensively from multiple dimensions, improving the real-time performance and accuracy of image quality evaluation.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained from these drawings by a person skilled in the art without inventive effort.
Fig. 1 is a flowchart of a lens image quality analysis method based on an image recognition algorithm according to an embodiment of the present invention;
fig. 2 is a flowchart of another lens image quality analysis method based on an image recognition algorithm according to an embodiment of the present invention;
fig. 3 is a flowchart of another lens image quality analysis method based on an image recognition algorithm according to an embodiment of the present invention;
fig. 4 is a flowchart of another lens image quality analysis method based on an image recognition algorithm according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a lens image quality analysis system based on an image recognition algorithm according to an embodiment of the present invention;
fig. 6 is a schematic diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
Other advantages and effects of the present disclosure will become readily apparent to those skilled in the art from the following disclosure, which describes embodiments of the present disclosure by way of specific examples. It will be apparent that the described embodiments are merely some, but not all, embodiments of the present disclosure. The disclosure can also be embodied or carried out in other, different embodiments, and the details in this specification can be adapted and modified in various ways without departing from the spirit of the disclosure. The following embodiments and the features in the embodiments can be combined with each other provided there is no conflict. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in this disclosure without inventive effort are intended to be within the scope of this disclosure.
It is noted that various aspects of the embodiments are described below within the scope of the following claims. It should be apparent that the aspects described in this disclosure may be embodied in a wide variety of forms and that any specific structure and/or function described in this disclosure is illustrative only. Based on the present disclosure, one skilled in the art will appreciate that an aspect described in this disclosure may be implemented independently of any other aspect, and that two or more of these aspects may be combined in various ways. For example, an apparatus may be implemented and/or a method practiced using any number of the aspects set forth in this disclosure. In addition, such an apparatus may be implemented and/or such a method practiced using other structures and/or functionality in addition to one or more of the aspects set forth in the disclosure.
It should also be noted that the illustrations provided in the following embodiments merely illustrate the basic concepts of the disclosure by way of illustration, and only the components related to the disclosure are shown in the drawings and are not drawn according to the number, shape and size of the components in actual implementation, and the form, number and proportion of the components in actual implementation may be arbitrarily changed, and the layout of the components may be more complicated.
In addition, in the following description, specific details are provided in order to provide a thorough understanding of the examples. However, it will be understood by those skilled in the art that the aspects may be practiced without these specific details.
The embodiment of the disclosure provides a lens picture quality analysis method based on an image recognition algorithm. The method provided in this embodiment may be executed by a computing device, which may be implemented as software or as a combination of software and hardware, and which may be integrally provided in a server, a terminal device, or the like.
Referring to fig. 1, fig. 2, fig. 3 and fig. 4, an embodiment of the present disclosure provides a lens image quality analysis method based on an image recognition algorithm, including:
s101, acquiring a first image P1, a second image P2 and a third image P3 which are shot by the image shooting device at the time t- Δt, the time t and the time t+Δt.
In order to improve the accuracy of image quality judgment, images shot by the capturing device at three different moments, namely a first image P1, a second image P2 and a third image P3, can be acquired and comprehensively analyzed to determine the quality of the lens picture shot by the capturing device.
The images can be acquired at times t-Δt, t and t+Δt, where the value of Δt can be set according to actual needs so that higher-quality images can be acquired by choosing a proper Δt.
As one mode, the frame frequency of the image captured by the image capturing device is obtained, and the value of Δt is determined based on the frame frequency; for example, when the frame frequency is 24 frames per second, a multiple of the frame interval 1/24 s can be taken as the value of Δt.
Alternatively, any two adjacent video frames shot by the image capturing device, a first video frame and a second video frame, can be acquired; a first gray level histogram and a second gray level histogram corresponding to the first video frame and the second video frame are calculated; and according to the similarity value α of the first gray level histogram and the second gray level histogram and the frame rate f of the video shot by the image capturing device, the value of Δt is determined as: Δt = γ(1-α)/f, where γ is an adjustment parameter with a value greater than zero, whose value can be set according to experience or dynamically adjusted according to changes in the actual environment.
To support this dynamic adjustment, the first video frame and the second video frame are grayed to obtain a first gray image and a second gray image; a first pixel matrix M1 and a second pixel matrix M2 corresponding to the first gray image and the second gray image are determined; and when the difference between the characteristic values of the first pixel matrix M1 and the second pixel matrix M2 is smaller than a preset value, indicating that the similarity between the first video frame and the second video frame is high, the value of γ can be increased, so that the value of Δt determined with the increased γ gives the first image P1, the second image P2 and the third image P3 a certain degree of distinction.
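For illustration only, the Δt selection described above can be sketched in Python with OpenCV and NumPy. The histogram-intersection similarity metric, the mean-intensity stand-in for the pixel-matrix characteristic value, and the numeric thresholds below are assumptions of this sketch and are not fixed by the embodiment:

```python
import cv2
import numpy as np

def determine_delta_t(frame1, frame2, frame_rate, gamma=1.0,
                      preset_diff=5.0, gamma_boost=1.5):
    """Sketch of the adaptive Δt selection (steps S201-S203 and S301-S303).

    frame1, frame2 : two adjacent BGR video frames
    frame_rate     : f, frames per second of the captured video
    gamma          : adjustment parameter γ (> 0)
    preset_diff    : assumed preset value for the characteristic-value gap
    gamma_boost    : assumed factor used to increase γ
    """
    # S301: gray the two video frames
    g1 = cv2.cvtColor(frame1, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(frame2, cv2.COLOR_BGR2GRAY)

    # S202: normalized 256-bin gray level histograms
    h1 = cv2.calcHist([g1], [0], None, [256], [0, 256]).ravel()
    h2 = cv2.calcHist([g2], [0], None, [256], [0, 256]).ravel()
    h1 /= h1.sum()
    h2 /= h2.sum()

    # similarity α in [0, 1]; histogram intersection is one possible choice
    alpha = float(np.minimum(h1, h2).sum())

    # S302/S303: compare a characteristic value of the pixel matrices M1, M2
    # (mean intensity is used here as a simple stand-in)
    if abs(float(g1.mean()) - float(g2.mean())) < preset_diff:
        gamma *= gamma_boost  # frames nearly identical: enlarge Δt

    # S203: Δt = γ(1 - α)/f
    return gamma * (1.0 - alpha) / frame_rate
```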
S102, performing fast feature point calculation on the first image P1 and the second image P2 to obtain a first feature point set S1 and a second feature point set S2, and determining a background feature point set S0 common to the first image P1 and the second image P2, a foreground feature point set S11 of the first image P1 and a foreground feature point set S21 of the second image P2 by comparing the positions of feature points in the first feature point set S1 and the second feature point set S2.
For the first image P1 and the second image P2, feature points can be obtained by corner detection. For example, FAST corner detection can be performed on the first image P1 and the second image P2, or Harris corner detection can be used instead, so as to obtain the feature point sets of the first image P1 and the second image P2, denoted as a first feature point set S1 and a second feature point set S2 respectively. The first feature point set S1 and the second feature point set S2 each contain both foreground and background feature points; by comparing the positions of the feature points in the first feature point set S1 and the second feature point set S2, the background feature point set S0 common to the first image P1 and the second image P2, the foreground feature point set S11 of the first image P1 and the foreground feature point set S21 of the second image P2 are determined. For example, feature points having the same coordinate position in the first feature point set S1 and the second feature point set S2 can be marked as background feature points, and the remaining feature points of the two sets can be marked as the foreground feature point set S11 of the first image P1 and the foreground feature point set S21 of the second image P2, respectively, as in the sketch below.
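A minimal sketch of this step, assuming OpenCV's FAST detector and treating two feature points as being at the same position when their rounded pixel coordinates coincide; the detector threshold and the rounding tolerance are assumptions of the sketch:

```python
import cv2

def split_feature_points(p1_gray, p2_gray, threshold=20):
    """Sketch of S102: FAST feature points of P1 and P2, split into the
    common background set S0 and the per-image foreground sets S11, S21."""
    fast = cv2.FastFeatureDetector_create(threshold=threshold)
    s1 = {tuple(map(round, kp.pt)) for kp in fast.detect(p1_gray, None)}
    s2 = {tuple(map(round, kp.pt)) for kp in fast.detect(p2_gray, None)}

    s0 = s1 & s2    # background: same coordinate position in both images
    s11 = s1 - s0   # foreground feature points of the first image P1
    s21 = s2 - s0   # foreground feature points of the second image P2
    return s0, s11, s21
```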
S103, connecting the feature points in the background feature point set S0 according to a minimum approach principle to form n line segments, determining the length L of the longest of the n line segments, dividing the n line segments m-1 times to obtain n×m equal division points, the position coordinates of the n×m equal division points being used to form an n×m position matrix J, performing feature-point curve closed connection on the foreground feature point set S11 and selecting the closed graph U1 with the largest area from the obtained closed graphs, and performing feature-point curve closed connection on the foreground feature point set S21 and selecting the closed graph U2 with the largest area from the obtained closed graphs.
The feature points in the background feature point set S0 are connected according to a minimum approach principle: for example, feature points whose distance is smaller than a specific value (for example, 50 pixels) can be connected to their nearest neighbors, thereby forming a plurality of line segments. Line segments connected in this way can represent the edge features of objects present in the image, so the n line segments yield n edge-object features. Dividing the n line segments m-1 times yields n×m equal division points, whose positions effectively represent the positions of the target objects; the position coordinates of these n×m division points therefore form an n×m position matrix J, which represents the most characteristic position points in the image, and the quality of the image can be judged based on these points, as sketched below.
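A sketch of this construction, assuming a greedy nearest-neighbor pairing for the minimum approach connection and counting each segment's m-1 interior cut points plus its far endpoint as its m division points (both details the embodiment leaves open):

```python
import numpy as np

def build_position_matrix(background_pts, m=4, max_dist=50.0):
    """Sketch of S103: connect background feature points into n segments,
    take the longest segment length L, and form the n x m position matrix J."""
    pts = [np.asarray(p, dtype=float) for p in background_pts]
    segments, used = [], set()
    for i, p in enumerate(pts):  # greedy minimum-approach pairing
        if i in used:
            continue
        candidates = [(float(np.linalg.norm(p - q)), j)
                      for j, q in enumerate(pts) if j != i and j not in used]
        if not candidates:
            continue
        d, j = min(candidates)
        if d < max_dist:  # connect only sufficiently close feature points
            segments.append((p, pts[j]))
            used.update((i, j))

    lengths = [float(np.linalg.norm(b - a)) for a, b in segments]
    L = max(lengths) if lengths else 0.0  # length of the longest segment

    # m-1 cuts per segment; the cut points plus the far endpoint give m
    # division points per segment, so J has shape (n, m, 2)
    ts = np.arange(1, m + 1) / m
    J = np.array([[a + t * (b - a) for t in ts] for a, b in segments])
    return L, J
```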
S104, constructing an evaluation function F(L, g(J, P3), U1, U2) to obtain an evaluation result for the picture shot by the image shooting device, wherein g(J) is a calculation function of the position matrix J.
Using the function g(J, P3), image characteristics corresponding to the image points of the third image at the position matrix J are calculated, including image brightness, image sharpness, image color cast and image noise value, so as to obtain an evaluation value ε.
Specifically, the area ratio of the closed graph U1 to the first image P1 can be calculated to obtain ξ1, and the area ratio of the closed graph U2 to the second image P2 can be calculated to obtain ξ2; the evaluation result is then computed from ε, ξ1, ξ2, L and D, where D is the length of the diagonal of the first image P1 and the second image P2. One possible area-ratio computation is sketched below.
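A sketch of the ξ computation; the convex hull is used here as a stand-in for the feature-point curve closure, since the embodiment does not fix how the closed graph is formed:

```python
import cv2
import numpy as np

def area_ratio(foreground_pts, image_shape):
    """Sketch: close the foreground feature points into a figure and return
    its area divided by the image area (ξ1 for P1/S11, ξ2 for P2/S21).
    Assumes at least three foreground feature points."""
    pts = np.asarray(list(foreground_pts), dtype=np.float32)
    hull = cv2.convexHull(pts)          # stand-in for the curve closure
    h, w = image_shape[:2]
    return float(cv2.contourArea(hull)) / float(h * w)
```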
For the luminance value, for example, the variance of the pixels corresponding to the image points of the third image at the position matrix J can be calculated to determine the third-image brightness evaluation value ε1.
For the sharpness value, for example, the Laplacian value of the pixels corresponding to the image points of the third image at the position matrix J can be calculated to determine the sharpness evaluation value ε2 of the third image.
For color cast, for example, the LAB color values of the pixels corresponding to the image points of the third image at the position matrix J can be calculated to determine the color cast evaluation value ε3 of the third image.
For image noise, for example, the Sobel gradient values of the pixels corresponding to the image points of the third image at the position matrix J can be calculated to determine the noise evaluation value ε4 of the third image.
Finally, the evaluation value can be calculated by the formula ε = τ1·ε1 + τ2·ε2 + τ3·ε3 + τ4·ε4, where τ1, τ2, τ3 and τ4 are adjustment coefficients whose values can be set flexibly according to actual needs. A combined sketch follows.
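The four measures and their weighted combination can be sketched as follows; sampling single pixels at the J positions and leaving ε1 to ε4 on their raw scales are simplifying assumptions of this sketch:

```python
import cv2
import numpy as np

def evaluate_quality(p3_bgr, J, taus=(0.25, 0.25, 0.25, 0.25)):
    """Sketch of g(J, P3): sample the third image at the division points
    in J and combine the four measures into ε = Σ τi·εi."""
    gray = cv2.cvtColor(p3_bgr, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape

    pts = np.rint(J.reshape(-1, 2)).astype(int)   # (x, y) division points
    pts = pts[(pts[:, 0] >= 0) & (pts[:, 0] < w) &
              (pts[:, 1] >= 0) & (pts[:, 1] < h)]
    xs, ys = pts[:, 0], pts[:, 1]

    # ε1: brightness, variance of the sampled gray values
    eps1 = float(np.var(gray[ys, xs].astype(float)))

    # ε2: sharpness, mean absolute Laplacian response at the sampled points
    lap = cv2.Laplacian(gray, cv2.CV_64F)
    eps2 = float(np.abs(lap[ys, xs]).mean())

    # ε3: color cast, mean distance from neutral in the LAB a/b channels
    lab = cv2.cvtColor(p3_bgr, cv2.COLOR_BGR2LAB).astype(float)
    eps3 = float(np.hypot(lab[ys, xs, 1] - 128.0,
                          lab[ys, xs, 2] - 128.0).mean())

    # ε4: noise, mean Sobel gradient magnitude at the sampled points
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1)
    eps4 = float(np.hypot(gx, gy)[ys, xs].mean())

    t1, t2, t3, t4 = taus  # adjustment coefficients τ1..τ4
    return t1 * eps1 + t2 * eps2 + t3 * eps3 + t4 * eps4
```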
With the scheme of this embodiment, the quality of the image shot by the image pickup device can be evaluated automatically and comprehensively from multiple dimensions, thereby improving the real-time performance and accuracy of image quality evaluation.
According to a specific implementation manner of the embodiment of the present disclosure, before the acquiring of the first image P1, the second image P2 and the third image P3 shot by the image capturing device at time t-Δt, time t and time t+Δt, the method further includes:
acquiring the frame frequency of an image shot by an image shooting device;
based on the frame frequency, a value of Δt is determined.
Referring to fig. 2, according to a specific implementation manner of the embodiment of the disclosure, before the acquiring of the first image P1, the second image P2 and the third image P3 shot by the image capturing device at time t-Δt, time t and time t+Δt, the method further includes:

S201, acquiring any two adjacent video frames, a first video frame and a second video frame, shot by the image capturing device;

S202, calculating a first gray level histogram and a second gray level histogram corresponding to the first video frame and the second video frame;

S203, determining, according to the similarity value α of the first gray level histogram and the second gray level histogram and the frame rate f of the video shot by the image capturing device, the value of Δt as: Δt = γ(1-α)/f, γ being an adjustment parameter with a value greater than zero.
Referring to fig. 3, according to a specific implementation manner of the embodiment of the disclosure, before the acquiring of the first image P1, the second image P2 and the third image P3 shot by the image capturing device at time t-Δt, time t and time t+Δt, the method further includes:

S301, graying the first video frame and the second video frame to obtain a first gray image and a second gray image;

S302, determining a first pixel matrix M1 and a second pixel matrix M2 corresponding to the first gray image and the second gray image;

S303, when the difference between the characteristic values of the first pixel matrix M1 and the second pixel matrix M2 is smaller than the preset value, increasing the value of γ so as to determine the value of Δt using the increased γ.
According to a specific implementation manner of the embodiment of the present disclosure, the construction of the evaluation function F (L, g (J, P3), U1, U2) to obtain an evaluation result for a shot picture of the image capturing device includes:
calculating, by using the function g(J, P3), image characteristics corresponding to the image points of the third image at the position matrix J, the image characteristics comprising image brightness, image sharpness, image color cast and image noise value, so as to obtain an evaluation value ε.
According to a specific implementation manner of the embodiment of the disclosure, the constructing the evaluation function F (L, g (J, P3), U1, U2) to obtain an evaluation result for a shot picture of the image capturing apparatus further includes:
calculating the area ratio of the closed graph U1 to the first image P1 to obtain ξ1;

calculating the area ratio of the closed graph U2 to the second image P2 to obtain ξ2;

and then computing the evaluation result from ε, ξ1, ξ2, L and D, where D is the length of the diagonal of the first image P1 and the second image P2.
Referring to fig. 4, according to a specific implementation manner of the embodiment of the disclosure, the calculating, using the function g(J, P3), of the image characteristics corresponding to the image points of the third image at the position matrix J includes:

S401, calculating the variance of the pixels corresponding to the image points of the third image at the position matrix J, and determining a brightness evaluation value ε1 of the third image;

S402, calculating the Laplacian value of the pixels corresponding to the image points of the third image at the position matrix J, and determining a sharpness evaluation value ε2 of the third image;

S403, calculating the LAB color values of the pixels corresponding to the image points of the third image at the position matrix J, and determining a color cast evaluation value ε3 of the third image.
According to a specific implementation manner of the embodiment of the present disclosure, the calculating, using the function g(J, P3), of the image characteristics corresponding to the image points of the third image at the position matrix J further includes:

S404, calculating the Sobel gradient values of the pixels corresponding to the image points of the third image at the position matrix J, and determining a noise evaluation value ε4 of the third image.
According to a specific implementation of an embodiment of the disclosure, calculating, using the function g (J, P3), an image characteristic corresponding to an image point of the third image at the position matrix J, further includes:
calculating the evaluation value by the formula ε = τ1·ε1 + τ2·ε2 + τ3·ε3 + τ4·ε4, where τ1, τ2, τ3 and τ4 are adjustment coefficients.
Referring to fig. 5, the embodiment of the invention further discloses a lens picture quality analysis system 50 based on an image recognition algorithm, which comprises:
an acquisition module 501, configured to acquire a first image P1, a second image P2 and a third image P3 shot by the image capturing device at time t-Δt, time t and time t+Δt;
the computing module 502 is configured to perform fast feature point computation on the first image P1 and the second image P2 to obtain a first feature point set S1 and a second feature point set S2, and determine a background feature point set S0 common to the first image P1 and the second image P2, a foreground feature point set S11 of the first image P1, and a foreground feature point set S21 of the second image P2 by comparing positions of feature points in the first feature point set S1 and the second feature point set S2;
the determining module 503 is configured to connect the feature points in the background feature point set S0 according to a minimum approach principle to form n line segments, determine the length L of the longest of the n line segments, divide the n line segments m-1 times to obtain n×m equal division points, use the position coordinates of the n×m equal division points to form an n×m position matrix J, perform feature-point curve closed connection on the foreground feature point set S11 and select the closed graph U1 with the largest area from the obtained closed graphs, and perform feature-point curve closed connection on the foreground feature point set S21 and select the closed graph U2 with the largest area from the obtained closed graphs;
an evaluation module 504, configured to construct an evaluation function F(L, g(J, P3), U1, U2) to obtain an evaluation result for the picture shot by the image capturing device, where g(J) is a calculation function of the position matrix J.
Referring to fig. 6, an embodiment of the present invention also provides an electronic device 60, including:
at least one processor; the method comprises the steps of,
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the lens picture quality analysis method based on the image recognition algorithm in the foregoing method embodiments.
Embodiments of the present invention also provide a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the foregoing method embodiments.
The present invention also provides a computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the lens picture quality analysis method based on the image recognition algorithm in the foregoing method embodiments.
Referring now to fig. 6, a schematic diagram of an electronic device 60 suitable for use in implementing embodiments of the present disclosure is shown. The electronic devices in the embodiments of the present disclosure include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 6 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 6, the electronic device 60 includes a processing means (e.g., a central processing unit, a graphics processor, etc.) 601 that performs various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage means 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the electronic device 60 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other through a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Typically, the following devices are connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touchpad, keyboard, mouse, image sensor, microphone, accelerometer, gyroscope, etc.; an output device 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, magnetic tape, hard disk, etc.; and a communication device 609. The communication means 609 allows the electronic device 60 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 shows an electronic device 60 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. Alternatively, more or fewer devices may be implemented or provided.
In particular, according to embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 609, or from the storage means 608, or from the ROM 602. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 601.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer-readable storage medium include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take any of a variety of forms, including, but not limited to, an electromagnetic signal, an optical signal, or any suitable combination of the foregoing. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
The computer readable medium may be contained in the electronic device, or may exist alone without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring at least two internet protocol addresses; sending a node evaluation request comprising the at least two internet protocol addresses to node evaluation equipment, wherein the node evaluation equipment selects an internet protocol address from the at least two internet protocol addresses and returns the internet protocol address; receiving an Internet protocol address returned by the node evaluation equipment; wherein the acquired internet protocol address indicates an edge node in the content distribution network.
Alternatively, the computer-readable medium carries one or more programs that, when executed by the electronic device, cause the electronic device to: receiving a node evaluation request comprising at least two internet protocol addresses; selecting an internet protocol address from the at least two internet protocol addresses; returning the selected internet protocol address; wherein the received internet protocol address indicates an edge node in the content distribution network.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages or combinations thereof, including object oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams represents a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units referred to in the embodiments of the present disclosure may be implemented by means of software or by means of hardware. The name of a unit does not in any way constitute a limitation of the unit itself; for example, the first acquisition unit may also be described as "a unit acquiring at least two internet protocol addresses".
It is to be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any changes or substitutions easily contemplated by those skilled in the art within the scope of the present invention should be included in the present invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.
Claims (10)
1. The lens picture quality analysis method based on the image recognition algorithm is characterized by comprising the following steps of:
acquiring a first image P1, a second image P2 and a third image P3 shot by an image shooting device at times t-Δt, t and t+Δt;
performing fast feature point calculation on the first image P1 and the second image P2 to obtain a first feature point set S1 and a second feature point set S2, and determining a background feature point set S0 common to the first image P1 and the second image P2, a foreground feature point set S11 of the first image P1 and a foreground feature point set S21 of the second image P2 by comparing the positions of feature points in the first feature point set S1 and the second feature point set S2;
connecting the feature points in the background feature point set S0 according to a minimum approach principle to form n line segments, determining the length L of the longest of the n line segments, dividing the n line segments m-1 times to obtain n×m equal division points, the position coordinates of the n×m equal division points being used to form an n×m position matrix J, performing feature-point curve closed connection on the foreground feature point set S11 and selecting the closed graph U1 with the largest area from the obtained closed graphs, and performing feature-point curve closed connection on the foreground feature point set S21 and selecting the closed graph U2 with the largest area from the obtained closed graphs;
an evaluation function F (L, g (J, P3), U1, U2) is constructed to obtain an evaluation result for a captured image of the image capturing device, g (J) being a calculation function of the position matrix J.
2. The method according to claim 1, wherein before the acquiring of the first image P1, the second image P2 and the third image P3 shot by the image capturing device at times t-Δt, t and t+Δt, the method further comprises:
acquiring the frame frequency of an image shot by an image shooting device;
based on the frame frequency, a value of Δt is determined.
3. The method according to claim 1, wherein before the acquiring of the first image P1, the second image P2 and the third image P3 shot by the image capturing device at times t-Δt, t and t+Δt, the method further comprises:
acquiring any adjacent first video frame and second video frame shot by an image shooting device;
calculating a first gray level histogram and a second gray level histogram corresponding to the first video frame and the second video frame;
according to the similarity value α of the first gray level histogram and the second gray level histogram and the frame rate f of the video shot by the image shooting device, determining the value of Δt as: Δt = γ(1-α)/f, γ being an adjustment parameter with a value greater than zero.
4. The method according to claim 3, wherein before the acquiring of the first image P1, the second image P2 and the third image P3 shot by the image capturing device at times t-Δt, t and t+Δt, the method further comprises:
graying processing is carried out on the first video frame and the second video frame, and a first gray image and a second gray image are obtained;
determining a first pixel matrix M1 and a second pixel matrix M2 corresponding to the first gray level image and the second gray level image;
when the difference between the characteristic values of the first and second pixel matrices M1 and M2 is smaller than a preset value, the value of γ is increased so as to determine the value of Δt using γ after the increase.
5. The method according to claim 1, wherein constructing the evaluation function F (L, g (J, P3), U1, U2) to obtain the evaluation result for the photographed picture of the image photographing device includes:
calculating, by using the function g(J, P3), image characteristics corresponding to the image points of the third image at the position matrix J, wherein the image characteristics comprise image brightness, image sharpness, image color cast and image noise value, so as to obtain an evaluation value ε.
6. The method according to claim 5, wherein constructing the evaluation function F (L, g (J, P3), U1, U2) to obtain the evaluation result for the photographed picture of the image photographing device further comprises:
calculating the area ratio of the closed graph U1 to the first image P1 to obtain ξ1;

calculating the area ratio of the closed graph U2 to the second image P2 to obtain ξ2;

and then computing the evaluation result from ε, ξ1, ξ2, L and D, where D is the length of the diagonal of the first image P1 and the second image P2.
7. The method of claim 6, wherein calculating the image characteristic for the image point of the third image at the position matrix J using the function g (J, P3) comprises:
calculating the variance of the pixels corresponding to the image points of the third image at the position matrix J, and determining a brightness evaluation value ε1 of the third image;

calculating the Laplacian value of the pixels corresponding to the image points of the third image at the position matrix J, and determining a sharpness evaluation value ε2 of the third image;

calculating the LAB color values of the pixels corresponding to the image points of the third image at the position matrix J, and determining a color cast evaluation value ε3 of the third image.
8. The method of claim 7, wherein calculating the image characteristic corresponding to the image point of the third image at the position matrix J using the function g (J, P3), further comprises:
calculating the Sobel gradient values of the pixels corresponding to the image points of the third image at the position matrix J, and determining a noise evaluation value ε4 of the third image.
9. The method of claim 8, wherein calculating image characteristics corresponding to image points of the third image at the position matrix J using the function g (J, P3), further comprises:
calculating the evaluation value by the formula ε = τ1·ε1 + τ2·ε2 + τ3·ε3 + τ4·ε4, where τ1, τ2, τ3 and τ4 are adjustment coefficients.
10. A lens picture quality analysis system based on an image recognition algorithm, comprising:
the acquisition module is used for acquiring a first image P1, a second image P2 and a third image P3 shot by the image shooting device at time t-Δt, time t and time t+Δt;
the computing module is used for performing fast feature point computation on the first image P1 and the second image P2 to obtain a first feature point set S1 and a second feature point set S2, and determining a background feature point set S0 common to the first image P1 and the second image P2, a foreground feature point set S11 of the first image P1 and a foreground feature point set S21 of the second image P2 by comparing the positions of feature points in the first feature point set S1 and the second feature point set S2;
the determining module is used for connecting the feature points in the background feature point set S0 according to the minimum approach principle to form n line segments, determining the length L of the longest of the n line segments, equally dividing the n line segments m−1 times to obtain n×m equal division points whose position coordinates form an n×m position matrix J, performing closed curve connection on the foreground feature point set S11 and selecting the closed graph U1 with the largest area from the resulting closed graphs, and performing closed curve connection on the foreground feature point set S21 and selecting the closed graph U2 with the largest area from the resulting closed graphs (see the second sketch following this claim); and
the evaluation module is used for constructing an evaluation function F(L, g(J, P3), U1, U2) so as to obtain an evaluation result for the picture photographed by the image photographing device, wherein g(J, P3) is a calculation function over the position matrix J.
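A sketch of the computing module, assuming the claimed fast feature point calculation refers to the FAST corner detector and that a feature point counts as background when it reappears at (nearly) the same position in both frames; the 2-pixel tolerance is hypothetical:

```python
import cv2
import numpy as np

def split_feature_points(p1_gray, p2_gray, tol=2.0):
    fast = cv2.FastFeatureDetector_create()
    s1 = np.float32([k.pt for k in fast.detect(p1_gray, None)])
    s2 = np.float32([k.pt for k in fast.detect(p2_gray, None)])
    # Pairwise distances between the two point sets; a point with a close
    # counterpart in the other frame is treated as static background.
    d = np.linalg.norm(s1[:, None, :] - s2[None, :, :], axis=2)
    stable1 = d.min(axis=1) < tol
    stable2 = d.min(axis=0) < tol
    s0 = s1[stable1]    # background set S0 shared by P1 and P2
    s11 = s1[~stable1]  # foreground set of P1
    s21 = s2[~stable2]  # foreground set of P2
    return s0, s11, s21
```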
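And a sketch of the determining module, assuming the minimum approach principle means greedy nearest-neighbor chaining of the background points and using the convex hull as a stand-in for the largest-area closed curve through a foreground point set:

```python
import cv2
import numpy as np

def build_position_matrix(s0, m):
    # Chain the background points greedily by minimum distance, then place
    # m equally spaced points on each of the n resulting segments.
    pts = [np.asarray(p, dtype=np.float64) for p in s0]
    chain = [pts.pop(0)]
    while pts:
        dists = [np.linalg.norm(chain[-1] - p) for p in pts]
        chain.append(pts.pop(int(np.argmin(dists))))
    t = np.linspace(0.0, 1.0, m)
    J = np.array([[a + ti * (b - a) for ti in t]
                  for a, b in zip(chain[:-1], chain[1:])])
    return J  # shape (n, m, 2): the n x m position matrix

def largest_closed_graph(foreground):
    # Convex hull of the foreground points: one closed figure that bounds
    # the largest area reachable by connecting them.
    pts = np.asarray(foreground, dtype=np.float32).reshape(-1, 1, 2)
    return cv2.convexHull(pts)
```

The longest-segment length L of the claim can then be read off as the maximum distance between consecutive points of `chain`.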
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311050924.8A CN117094965B (en) | 2023-08-21 | 2023-08-21 | Lens picture quality analysis method and system based on image recognition algorithm |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117094965A true CN117094965A (en) | 2023-11-21 |
CN117094965B CN117094965B (en) | 2024-07-05 |
Family
ID=88779733
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311050924.8A Active CN117094965B (en) | 2023-08-21 | 2023-08-21 | Lens picture quality analysis method and system based on image recognition algorithm |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117094965B (en) |
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150245007A1 (en) * | 2014-02-21 | 2015-08-27 | Sony Corporation | Image processing method, image processing device, and electronic apparatus |
CN105184802A (en) * | 2015-09-30 | 2015-12-23 | 西安电子科技大学 | Image processing method and device |
US20190385285A1 (en) * | 2016-12-21 | 2019-12-19 | Huawei Technologies Co., Ltd. | Image Processing Method and Device |
CN108259782A (en) * | 2016-12-28 | 2018-07-06 | 株式会社理光 | Image processing apparatus, camera chain, image processing method |
CN109509151A (en) * | 2018-11-30 | 2019-03-22 | 中国科学院苏州纳米技术与纳米仿生研究所 | Image and video-splicing method, computer readable storage medium and computer equipment |
CN109784230A (en) * | 2018-12-29 | 2019-05-21 | 中国科学院重庆绿色智能技术研究院 | A kind of facial video image quality optimization method, system and equipment |
CN110288037A (en) * | 2019-06-28 | 2019-09-27 | 北京字节跳动网络技术有限公司 | Image processing method, device and electronic equipment |
CN110415276A (en) * | 2019-07-30 | 2019-11-05 | 北京字节跳动网络技术有限公司 | Motion information calculation method, device and electronic equipment |
CN111652136A (en) * | 2020-06-03 | 2020-09-11 | 苏宁云计算有限公司 | Pedestrian detection method and device based on depth image |
CN112528823A (en) * | 2020-12-04 | 2021-03-19 | 燕山大学 | Striped shark movement behavior analysis method and system based on key frame detection and semantic component segmentation |
CN114841910A (en) * | 2021-01-30 | 2022-08-02 | 华为技术有限公司 | Vehicle-mounted lens shielding identification method and device |
CN114049349A (en) * | 2021-12-14 | 2022-02-15 | 北京有竹居网络技术有限公司 | Camera imaging quality evaluation method, device, equipment and storage medium |
Non-Patent Citations (2)
Title |
---|
QI JIA et al.: "Leveraging Line-Point Consistence To Preserve Structures for Wide Parallax Image Stitching", IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 31 December 2021 (2021-12-31), pages 12186-12195 *
胡坤福: "Research on Lane Line Recognition Technology Based on Adaptive Regions of Interest" (in Chinese), China Master's Theses Full-text Database, Engineering Science and Technology II, 15 March 2021 (2021-03-15), pages 035-148 *
Also Published As
Publication number | Publication date |
---|---|
CN117094965B (en) | 2024-07-05 |
Similar Documents
Publication | Title
---|---
TWI777112B | Method, apparatus and electronic device for image processing and storage medium
CN110796664B | Image processing method, device, electronic equipment and computer readable storage medium
EP3940633B1 | Image alignment method and apparatus, electronic device, and storage medium
CN103078924A | Visual field sharing method and equipment
CN111385484B | Information processing method and device
CN110830714B | Information acquisition method and device, terminal and storage medium
CN109194878B | Video image anti-shake method, device, equipment and storage medium
CN114022662B | Image recognition method, device, equipment and medium
CN114650361B | Shooting mode determining method, shooting mode determining device, electronic equipment and storage medium
CN112465940B | Image rendering method and device, electronic equipment and storage medium
CN110809166B | Video data processing method and device and electronic equipment
CN117094965B | Lens picture quality analysis method and system based on image recognition algorithm
CN115114463B | Method and device for displaying media content, electronic equipment and storage medium
CN116069221A | Media content display method and device, electronic equipment and storage medium
CN113301324B | Virtual focus detection method, device, equipment and medium based on camera device
CN115022611A | VR picture display method, electronic device and readable storage medium
CN113099101A | Camera shooting parameter adjusting method and device and electronic equipment
CN114820404B | Image processing method, device, electronic equipment and medium
CN114157848B | Projection device correction method, projection device correction device, storage medium and projection device
CN111629107B | Terminal control method and device, terminal and storage medium
CN117690064B | Transmission line detection method, transmission line detection device, electronic equipment and computer readable medium
CN115225823B | Image acquisition method and device
CN112818748B | Method and device for determining plane in video, storage medium and electronic equipment
CN112651909B | Image synthesis method, device, electronic equipment and computer readable storage medium
CN114727011B | Image pickup optimization method, device, electronic equipment and readable storage medium
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |