CN111800626A - Photographing consistency evaluation method and device, mobile terminal and storage medium - Google Patents


Info

Publication number
CN111800626A
CN111800626A
Authority
CN
China
Prior art keywords
photographing
cameras
consistency
color
consistency evaluation
Prior art date
Legal status
Granted
Application number
CN202010802853.2A
Other languages
Chinese (zh)
Other versions
CN111800626B (en)
Inventor
冯斌 (Feng Bin)
Current Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Original Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Oppo Chongqing Intelligent Technology Co Ltd
Priority: CN202010802853.2A
Publication of CN111800626A
Application granted
Publication of CN111800626B
Legal status: Active


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00: Diagnosis, testing or measuring for television systems or their details
    • H04N17/002: Diagnosis, testing or measuring for television cameras


Abstract

The application belongs to the technical field of camera shooting and provides a photographing consistency evaluation method and apparatus, a mobile terminal and a storage medium. The method includes: photographing a test chart through N cameras respectively to obtain N test chart images, where N is an integer greater than 1 and each of the N cameras corresponds to one test chart image; acquiring photographing consistency evaluation values of the N cameras according to the N test chart images; and acquiring the photographing consistency degree of the N cameras according to the photographing consistency evaluation values of the N cameras. Photographing consistency evaluation of at least two cameras can thus be realized.

Description

Photographing consistency evaluation method and device, mobile terminal and storage medium
Technical Field
The application belongs to the technical field of camera shooting, and particularly relates to a shooting consistency evaluation method and device, a mobile terminal and a storage medium.
Background
With the development of mobile terminal technology, the functions of mobile terminals have become more and more powerful, and a single camera integrated on a mobile terminal can hardly meet user requirements, so more and more mobile terminals integrate at least two cameras. When a mobile terminal integrating at least two cameras takes a picture, the cameras generally need to cooperate with each other, and the requirement on the consistency of the cameras is high; therefore the photographing consistency of the cameras needs to be evaluated before the terminal leaves the factory.
Disclosure of Invention
The application provides a photographing consistency evaluation method and device, a mobile terminal and a storage medium, so that photographing consistency evaluation of at least two cameras is realized.
In a first aspect, an embodiment of the present application provides a photographing consistency evaluation method, where the photographing consistency evaluation method includes:
respectively photographing the test chart through N cameras to obtain N test chart images, wherein N is an integer greater than 1, and the N cameras respectively correspond to one test chart image;
acquiring photographing consistency evaluation values of the N cameras according to the N test chart images;
and acquiring the photographing consistency degree of the N cameras according to the photographing consistency evaluation values of the N cameras.
In a second aspect, an embodiment of the present application provides a photographing consistency evaluation apparatus, including:
the chart photographing module is used for photographing the test chart through N cameras respectively to obtain N test chart images, wherein N is an integer greater than 1 and each of the N cameras corresponds to one test chart image;
the evaluation value acquisition module is used for acquiring the photographing consistency evaluation values of the N cameras according to the N test chart images;
and the consistency acquisition module is used for acquiring the photographing consistency degrees of the N cameras according to the photographing consistency evaluation values of the N cameras.
In a third aspect, an embodiment of the present application provides a mobile terminal, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the photographing consistency evaluation method according to the first aspect when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored, and the computer program, when executed by a processor, implements the steps of the photographing consistency evaluation method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program product, which, when running on a mobile terminal, causes the mobile terminal to execute the steps of the photographing consistency evaluation method according to the first aspect.
As can be seen from the above, in the present application, N cameras integrated on a mobile terminal respectively photograph a test chart to obtain N test chart images; according to the N test chart images, a photographing consistency evaluation value of the N cameras can be obtained, and according to that evaluation value, the photographing consistency degree of the N cameras can be obtained, thereby realizing photographing consistency evaluation of the N cameras. For example, the larger the photographing consistency evaluation value, the worse the photographing consistency of the N cameras; the smaller the evaluation value, the better the photographing consistency.
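The three claimed steps can be sketched as follows. This is a minimal illustration only: the function and variable names are hypothetical, and the stand-in evaluation metric (the spread of mean image luminance) is merely a placeholder, not one of the specific metrics defined in the embodiments.

```python
def evaluate_photographing_consistency(test_images, threshold=5.0):
    """Hypothetical sketch of the three claimed steps.

    test_images: one luminance matrix (list of row lists) per camera,
    standing in for the N captured test chart images (step 101).
    """
    # Step 102: obtain a consistency evaluation value from the N images.
    # Placeholder metric: spread of the per-image mean luminance.
    means = [sum(map(sum, img)) / (len(img) * len(img[0])) for img in test_images]
    evaluation_value = max(means) - min(means)
    # Step 103: map the evaluation value to a consistency degree
    # (a smaller value means better consistency).
    degree = "good" if evaluation_value <= threshold else "poor"
    return evaluation_value, degree
```

A larger spread between the cameras' images yields a worse degree, matching the "larger value, worse consistency" convention stated above.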
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a schematic flow chart illustrating an implementation of a photographing consistency evaluation method provided in an embodiment of the present application;
FIG. 2 is an exemplary 24 color chart;
FIG. 3a is an exemplary graph of brightness uniformity for two models; FIG. 3b is an exemplary white balance consistency graph for two models; FIG. 3c is an exemplary graph of color accuracy consistency for two models;
fig. 4 is a schematic flow chart illustrating an implementation of the photographing consistency evaluation method provided in the second embodiment of the present application;
fig. 5 is a schematic structural diagram of a photographing consistency evaluation apparatus provided in the third embodiment of the present application;
fig. 6 is a schematic structural diagram of a mobile terminal according to a fourth embodiment of the present application;
fig. 7 is a schematic structural diagram of a mobile terminal according to a fifth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In particular implementations, the mobile terminals described in embodiments of the present application include, but are not limited to, other portable devices such as mobile phones, laptop computers, or tablet computers having touch sensitive surfaces (e.g., touch screen displays and/or touch pads). It should also be understood that in some embodiments, the device is not a portable communication device, but is a desktop computer having a touch-sensitive surface (e.g., a touch screen display and/or touchpad).
In the discussion that follows, a mobile terminal that includes a display and a touch-sensitive surface is described. However, it should be understood that the mobile terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The mobile terminal supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the mobile terminal may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
It should be understood that the sequence numbers of the steps in this embodiment do not imply an execution order; the execution order of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Referring to fig. 1, which is a schematic view of an implementation flow of a photographing consistency evaluation method provided in an embodiment of the present application, where the photographing consistency evaluation method is applied to a mobile terminal, as shown in the figure, the photographing consistency evaluation method may include the following steps:
and 101, respectively photographing the test chart through N cameras to obtain N test chart images.
And N is an integer greater than 1, and the N cameras respectively correspond to a test chart image.
The N cameras can be cameras integrated on the mobile terminal; for example, the N cameras are respectively a main camera, a wide-angle camera, a telephoto camera and the like, and the user can set the N cameras according to actual needs.
Before the test chart is photographed by the N cameras respectively, the color temperature and the illuminance of a photographing scene can be preset to simulate a real photographing scene, so that the photographing scene is closer to a real environment, and the test chart is photographed under the set photographing scene. For example, four photographing scenes are set: the color temperature of the first photographing scene is D65 and the illuminance is 1000 Lux; the color temperature of the second photographing scene is TL84 and the illuminance is 300 Lux; the color temperature of the third photographing scene is TL83 and the illuminance is 100 Lux; and the color temperature of the fourth photographing scene is A and the illuminance is 20 Lux.
When the test chart is photographed by the N cameras respectively, one test chart image is obtained per camera, so N test chart images are obtained from the N cameras; a test chart image is the image of the test chart captured by a camera. It should be noted that, when the N cameras photograph the test chart, the test chart needs to occupy the same field-of-view position in each picture, so as to improve the accuracy of the obtained photographing consistency.
The test chart may refer to a chart for performing consistency evaluation on the photographs taken by the N cameras, including but not limited to a 24-color chart, an 18-color chart and the like. A test chart usually includes a plurality of color patches of different colors; for example, the 24-color chart includes 24 color patches and the 18-color chart includes 18 color patches. Fig. 2 is an exemplary diagram of a 24-color chart.
And 102, acquiring photographing consistency evaluation values of the N cameras according to the N test chart images.
The photographing consistency evaluation value of the N cameras is used for evaluating the photographing consistency of the N cameras, which includes but is not limited to the brightness consistency, the white balance consistency and the color accuracy consistency of the N cameras. The brightness consistency of the N cameras may refer to the consistency of the brightness of their imaging in the same photographing scene; the white balance consistency may refer to the consistency of the white balance of their imaging in the same photographing scene; and the color accuracy consistency may refer to the consistency of the degree to which each color is reproduced in their imaging in the same photographing scene, where color accuracy refers to the color reproduction capability.
Optionally, the photographing consistency evaluation values of the N cameras include brightness consistency evaluation values of the N cameras, the color mode of the N test chart images is a Lab mode, and obtaining the photographing consistency evaluation values of the N cameras according to the N test chart images includes:
acquiring brightness values of M gray color blocks in the test chart in N test chart images respectively, wherein M is an integer larger than 1;
acquiring the standard deviation of the brightness value of each gray color block in the M gray color blocks and the weight of each gray color block according to the brightness value of each gray color block in the N test chart card images;
and acquiring brightness consistency evaluation values of the N cameras according to the standard deviation of the brightness value of each gray color block and the weight of each gray color block.
The Lab mode is composed of three elements, one element is a brightness value, the other two elements are an a component and a b component, the a component and the b component are two color channels, the color included in the a component is from dark green to gray to bright pink red, and the color included in the b component is from bright blue to gray to yellow.
Gray color patches may also be referred to as gray levels. The M gray patches may be at least two gray patches selected from all gray patches of the test chart. Optionally, in order to obtain the brightness consistency evaluation values of the N cameras more accurately, the M gray patches may be all gray patches of the test chart, that is, M is the number of gray patches in the test chart; for example, the 24 color patches include six gray patches (white, grayish white, light gray, medium gray, dark gray and black), so M is six.
After the standard deviation of the brightness value of each gray patch and the weight of each gray patch are obtained, the standard deviations of the M gray patches can be weighted by their weights to obtain the brightness consistency evaluation value of the N cameras, i.e. according to the formula

$$\mathrm{Luma}_{err} = \frac{\sum_{i=1}^{M} STD_i \cdot W_i^L}{\sum_{i=1}^{M} W_i^L}$$

the brightness consistency evaluation value of the N cameras can be calculated (that is, the standard deviation of luminance in Lab mode characterizes the brightness consistency), where $\mathrm{Luma}_{err}$ denotes the brightness consistency evaluation value of the N cameras, $STD_i$ denotes the standard deviation of the brightness value of the i-th gray patch, $W_i^L$ denotes the weight of the i-th gray patch, and $\sum_{i=1}^{M} W_i^L$ denotes the sum of the weights of the M gray patches.
By setting a weight for each gray patch, the application takes into account the resolving power of human eyes under different brightness levels, which can improve the accuracy of the brightness consistency.
Optionally, obtaining the standard deviation of the brightness value of each of the M gray patches and the weight of each gray patch according to the brightness value of each gray patch in the N test chart images includes:
acquiring the average brightness value of each gray color block according to the brightness value of each gray color block in the N test chart card images;
acquiring the standard deviation of the brightness value of each gray color block according to the average brightness value of each gray color block and the brightness value of each gray color block in the N test chart card images;
and acquiring the weight of each gray color block according to the average brightness value of each gray color block.
Wherein, the weight of each gray color block is used for simulating the sensitivity of human eyes to brightness difference under different brightness, and generally conforms to exponential change.
For the ith gray color block, the ith gray color block is any one of the M gray color blocks, the luminance values of the ith gray color block in the N test card images may be accumulated to obtain an accumulated value, and the accumulated value is divided by N, where the obtained value is the average luminance value of the ith gray color block.
The standard deviation of the brightness value of the i-th gray patch can be calculated according to the formula

$$STD_i = \sqrt{\frac{1}{N}\sum_{j=1}^{N}\left(L_i^j - \bar{L}_i\right)^2}$$

where $\bar{L}_i$ denotes the average brightness value of the i-th gray patch and $L_i^j$ denotes the brightness value of the i-th gray patch in the j-th test chart image. The weight of the i-th gray patch can be calculated according to the formula

$$W_i^L = \mathrm{norm}\!\left(a^{\bar{L}_i}\right)$$

where $\mathrm{norm}(\cdot)$ denotes normalizing $a^{\bar{L}_i}$ over the M gray patches, and $a$ denotes an adjustment coefficient for adjusting the weights of the gray patches; optionally, the user can set the adjustment coefficient according to actual needs, for example to 1.1.
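The brightness consistency computation can be sketched as follows (hypothetical function and variable names). Unnormalized weights are used because the normalization factor of the exponential weights cancels between the numerator and the denominator of the weighted average:

```python
import math

def luma_consistency(gray_lumas, a=1.1):
    """Brightness consistency evaluation value (sketch).

    gray_lumas: M lists, one per gray patch, each holding the Lab
    luminance L of that patch in the N test chart images.
    a: adjustment coefficient of the exponential eye-sensitivity weight.
    """
    num = den = 0.0
    for lumas in gray_lumas:                  # one gray patch at a time
        n = len(lumas)
        mean = sum(lumas) / n                 # average luminance of the patch
        std = math.sqrt(sum((l - mean) ** 2 for l in lumas) / n)
        w = a ** mean                         # unnormalized exponential weight
        num += std * w
        den += w
    return num / den                          # weighted average of per-patch stds
```

With identical luminance across all cameras the standard deviations are zero and the evaluation value is 0, i.e. perfect brightness consistency.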
Fig. 3a shows an exemplary brightness consistency graph of two models, where the abscissa represents illuminance and the ordinate represents the brightness consistency evaluation value; the two models are mobile terminals of two different models. As can be seen from fig. 3a, the present application can realize brightness consistency evaluation of a plurality of cameras integrated in a mobile terminal.
Optionally, the photographing consistency evaluation values of the N cameras include a white balance consistency evaluation value of the N cameras, the color mode of the N test card images is an RGB mode, and obtaining the photographing consistency evaluation values of the N cameras according to the N test card images includes:
acquiring R components, G components and B components of target gray color blocks in the test chart in the N test chart images;
and acquiring white balance consistency evaluation values of the N cameras according to the R component, the G component and the B component of the target gray color block in the N test chart images.
The RGB mode is an industry color standard in which various colors are obtained by varying the three color channels red (R), green (G) and blue (B) and superimposing them on each other.
The target gray patch may be one of the alternative gray patches, where the alternative gray patches are all gray patches of the test chart except white and black; for example, if the test chart is a 24-color chart, the alternative gray patches are the four patches grayish white, light gray, medium gray and dark gray, and the target gray patch may be any one of these four. It should be noted that white and black are not considered when evaluating the white balance consistency, which avoids inaccuracy caused by overexposure of the too-bright white patch and by the too-dark black patch.
Optionally, acquiring the white balance consistency evaluation values of the N cameras according to the R, G and B components of the target gray patch in the N test chart images includes:

acquiring a white balance difference value of each image group according to the R, G and B components of the target gray patch in each image group, wherein any two of the N test chart images form one image group, the N test chart images form $\binom{N}{2}$ image groups, and the $\binom{N}{2}$ image groups correspond to $\binom{N}{2}$ white balance difference values;

acquiring the white balance consistency evaluation values of the N cameras according to the $\binom{N}{2}$ white balance difference values.

The R, G and B components of the target gray patch in each image group refer to its R, G and B components in each of the two test chart images of that group. For example, if the two test chart images of each group are called the first test chart image and the second test chart image, then the R, G and B components of the target gray patch in the k-th image group refer to its R, G and B components in the first test chart image of the k-th image group and in the second test chart image of the k-th image group, where the k-th image group is any one of the $\binom{N}{2}$ image groups.
The white balance difference value of each image group can be calculated according to the formula

$$WB_k = \sqrt{\left(\frac{R_k^{1}}{G_k^{1}} - \frac{R_k^{2}}{G_k^{2}}\right)^2 + \left(\frac{B_k^{1}}{G_k^{1}} - \frac{B_k^{2}}{G_k^{2}}\right)^2}$$

where $WB_k$ denotes the white balance difference value of the k-th image group, $R_k^{1}$, $G_k^{1}$ and $B_k^{1}$ denote the R, G and B components of the target gray patch in the first test chart image of the k-th image group, and $R_k^{2}$, $G_k^{2}$ and $B_k^{2}$ denote the R, G and B components of the target gray patch in the second test chart image of the k-th image group.
After the $\binom{N}{2}$ white balance difference values of the $\binom{N}{2}$ image groups are obtained, the $\binom{N}{2}$ white balance difference values can be averaged, and the averaged value is the white balance consistency evaluation value of the N cameras.
Illustratively, the N cameras are a main camera, a wide-angle camera and a telephoto camera. $R_{main}$, $G_{main}$ and $B_{main}$ denote the R, G and B components of the target gray patch in the test chart image captured by the main camera; $R_{wide}$, $G_{wide}$ and $B_{wide}$ denote those in the test chart image captured by the wide-angle camera; and $R_{tele}$, $G_{tele}$ and $B_{tele}$ denote those in the test chart image captured by the telephoto camera. $WB_{mw}$ denotes the white balance difference value of the main camera and the wide-angle camera, $WB_{mt}$ denotes that of the main camera and the telephoto camera, and $WB_{tw}$ denotes that of the telephoto camera and the wide-angle camera. The white balance consistency evaluation value of the three cameras is then

$$WB = \frac{WB_{mw} + WB_{mt} + WB_{tw}}{3}.$$
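The white balance consistency computation can be sketched as follows (hypothetical names). The original formula images for the pairwise difference are not reproduced on this page, so the pairwise difference is taken here as the Euclidean distance between the (R/G, B/G) white balance ratios of the two images, an assumption:

```python
import math
from itertools import combinations

def wb_consistency(gray_rgb):
    """White balance consistency evaluation value (sketch).

    gray_rgb: one (R, G, B) tuple for the target gray patch per camera.
    The pairwise difference is the Euclidean distance between the
    (R/G, B/G) ratios of the two images (an assumption).
    Returns the mean over all C(N, 2) camera pairs.
    """
    diffs = [math.hypot(r1 / g1 - r2 / g2, b1 / g1 - b2 / g2)
             for (r1, g1, b1), (r2, g2, b2) in combinations(gray_rgb, 2)]
    return sum(diffs) / len(diffs)
```

For the three-camera example (main, wide, tele), `combinations` yields exactly the three pairs whose differences are averaged.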
Fig. 3b shows an exemplary white balance consistency graph of two models, where the abscissa represents color temperature and the ordinate represents the white balance consistency evaluation value. As can be seen from fig. 3b, the present application can realize white balance consistency evaluation of a plurality of cameras integrated in a mobile terminal.
Optionally, the photographing consistency evaluation values of the N cameras include the color accuracy consistency evaluation values of the N cameras, the color mode of the N test chart images is the Lab mode, and acquiring the photographing consistency evaluation values of the N cameras according to the N test chart images includes:

acquiring the a and b components of each of H color patches of the test chart in each of the $\binom{N}{2}$ image groups, wherein any two of the N test chart images form one image group, the N test chart images form $\binom{N}{2}$ image groups, and the H color patches refer to the patches of the test chart other than the gray patches;

acquiring the color difference values of the H color patches in the $\binom{N}{2}$ image groups according to their respective a and b components in the $\binom{N}{2}$ image groups;

acquiring a target color difference value of each of the H color patches according to its color difference values in the $\binom{N}{2}$ image groups;

acquiring the weights of the H color patches;

calculating the color accuracy consistency evaluation values of the N cameras according to the target color difference values of the H color patches and the weights of the H color patches.
The a and b components of a color patch in an image group refer to its a and b components in each of the two test chart images of that group, and the Euclidean distance between the (a, b) values of a color patch in the two test chart images of an image group can be used to represent the color difference value of that patch in the group.
For the q-th color patch, which is any one of the H color patches, its color difference value in the k-th image group is

$$\Delta C_q^k = \sqrt{\left(a_q^{k,1} - a_q^{k,2}\right)^2 + \left(b_q^{k,1} - b_q^{k,2}\right)^2}$$

where $a_q^{k,1}$ and $b_q^{k,1}$ denote the a and b components of the q-th color patch in the first test chart image of the k-th image group, and $a_q^{k,2}$ and $b_q^{k,2}$ denote the a and b components of the q-th color patch in the second test chart image of the k-th image group.
The color difference values of a color patch over the $\binom{N}{2}$ image groups can be averaged, and the obtained average value is the target color difference value of that patch; in this way the target color difference values of the H color patches are obtained. The target color difference values of the H color patches are then weighted by the weights of the H color patches, and the weighted sum is divided by the sum of the weights; the result is the color accuracy consistency evaluation value of the N cameras.
The weights of the H color patches can be set according to their importance in the color accuracy consistency evaluation. For example, Asian skin color is generally a light skin color, and colors are formed by combining the three primary colors red, green and blue, so the weights of the light skin, red, green and blue patches among the H color patches can be set larger than the weights of the other patches; that is, the attention subjective perception pays to different patches is considered when calculating the color accuracy consistency, which can improve its accuracy. For example, the H color patches are the 18 patches other than the 6 gray patches of the 24-color chart, the weights of the light skin, red, green and blue patches among the 18 patches are all set to 2, and the weights of the remaining 14 patches are set to 1.
Illustratively, the N cameras are a main camera, a wide-angle camera and a telephoto camera, and the H color patches are 18 of the 24 color patches. a_mainq and b_mainq represent the a-component and b-component of the q-th color patch in the test chart image taken by the main camera, a_wideq and b_wideq represent the a-component and b-component of the q-th color patch in the test chart image taken by the wide-angle camera, and a_teleq and b_teleq represent the a-component and b-component of the q-th color patch in the test chart image taken by the telephoto camera. The color difference value between the main camera and the wide-angle camera is

ΔE_q(main, wide) = sqrt((a_mainq − a_wideq)² + (b_mainq − b_wideq)²),

the color difference value between the main camera and the telephoto camera is

ΔE_q(main, tele) = sqrt((a_mainq − a_teleq)² + (b_mainq − b_teleq)²),

and the color difference value between the telephoto camera and the wide-angle camera is

ΔE_q(tele, wide) = sqrt((a_teleq − a_wideq)² + (b_teleq − b_wideq)²).

The target color difference value of the q-th color patch is

ΔE_q = (ΔE_q(main, wide) + ΔE_q(main, tele) + ΔE_q(tele, wide)) / 3,

and the color accuracy consistency evaluation value of the three cameras is

S = (Σ_{q=1}^{18} w_q · ΔE_q) / (Σ_{q=1}^{18} w_q),

where w_q represents the weight of the q-th color patch and Σ_{q=1}^{18} w_q represents the sum of the weights of the 18 color patches.
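As a sketch of the computation above, the pairwise color differences, per-patch target color differences and the weighted evaluation value for the three-camera example can be implemented as follows. This is a minimal illustration; the patch values and weights passed in at the bottom are made up for the example and do not come from the patent.

```python
import math

def color_accuracy_consistency(ab_main, ab_wide, ab_tele, weights):
    """Weighted color-accuracy consistency for three cameras.

    ab_* : list of (a, b) Lab chroma components, one pair per color patch.
    weights : per-patch weights (e.g. 2 for light-skin / red / green / blue,
              1 for the remaining patches).
    """
    assert len(ab_main) == len(ab_wide) == len(ab_tele) == len(weights)

    def delta(p1, p2):
        # Chromatic difference of one patch between two cameras (a-b plane).
        return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

    weighted_sum = 0.0
    for pm, pw, pt, w in zip(ab_main, ab_wide, ab_tele, weights):
        d_mw = delta(pm, pw)   # main vs wide
        d_mt = delta(pm, pt)   # main vs tele
        d_tw = delta(pt, pw)   # tele vs wide
        target = (d_mw + d_mt + d_tw) / 3  # target color difference of the patch
        weighted_sum += w * target
    return weighted_sum / sum(weights)

# Two illustrative patches: the second is identical across all three cameras.
score = color_accuracy_consistency(
    ab_main=[(10.0, 20.0), (30.0, -5.0)],
    ab_wide=[(13.0, 20.0), (30.0, -5.0)],
    ab_tele=[(10.0, 16.0), (30.0, -5.0)],
    weights=[2, 1],
)
```

A smaller score means the three cameras render the chart colors more consistently; the perfectly matched second patch contributes nothing to the score.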
Fig. 3c shows an example of color accuracy consistency for two phone models, where the abscissa represents the color temperature and the ordinate represents the color accuracy consistency evaluation value. As can be seen from fig. 3c, the present application can realize color accuracy consistency evaluation of the multiple cameras integrated in a mobile terminal.
And 103, acquiring the photographing consistency degree of the N cameras according to the photographing consistency evaluation values of the N cameras.
The photographing consistency degree of the N cameras characterizes how good their photographing consistency is: the larger the photographing consistency evaluation value, the worse the photographing consistency of the N cameras; the smaller the evaluation value, the better the photographing consistency.
In the embodiment of the present application, the test chart is photographed separately by the N cameras integrated in the mobile terminal to obtain N test chart images; photographing consistency evaluation values of the N cameras are obtained from the N test chart images, and the photographing consistency degrees of the N cameras are obtained from those evaluation values, thereby realizing photographing consistency evaluation of the N cameras.
Referring to fig. 4, which is a schematic view of an implementation flow of the photographing consistency evaluation method provided in the second embodiment of the present application, the photographing consistency evaluation method is applied to a mobile terminal, and as shown in the figure, the photographing consistency evaluation method may include the following steps:
step 401, taking pictures of the test chart through the N cameras respectively to obtain N test chart images.
The step is the same as step 101, and reference may be made to the related description of step 101, which is not described herein again.
And step 402, acquiring photographing consistency evaluation values of the N cameras according to the N test chart images.
The step is the same as step 102, and reference may be made to the related description of step 102, which is not repeated herein.
And step 403, comparing the photographing consistency evaluation values of the N cameras with an evaluation threshold value to obtain a comparison result.
The evaluation threshold may be a preset threshold for evaluating the photographing consistency, and corresponding evaluation thresholds may be set according to different types of the photographing consistency evaluation values, for example, when the photographing consistency evaluation value is a brightness consistency evaluation value, a corresponding brightness threshold may be set; when the photographing consistency evaluation value is a white balance consistency evaluation value, a corresponding white balance threshold value can be set; when the photographing consistency evaluation value is the color accuracy consistency evaluation value, a corresponding color accuracy threshold value may be set.
And step 404, acquiring the photographing consistency degrees of the N cameras according to the comparison result.
The comparison result is either that the photographing consistency evaluation value is greater than the evaluation threshold, or that it is less than or equal to the evaluation threshold. When the evaluation value is greater than the threshold, the photographing consistency of the N cameras is poor, and the user can be prompted to optimize it by adjusting the relevant parameters; when the evaluation value is less than or equal to the threshold, the photographing consistency of the N cameras is good.
For example, when the photographing consistency evaluation value is a brightness consistency evaluation value, the brightness consistency evaluation value is compared with a brightness threshold, and if the comparison result is that the brightness consistency evaluation value is greater than the brightness threshold, the brightness consistency of the N cameras is poor, so that a user can be prompted to optimize the brightness consistency of the N cameras by adjusting parameters affecting the brightness in the N cameras; when the photographing consistency evaluation value is a white balance consistency evaluation value, comparing the white balance consistency evaluation value with a white balance threshold, and if the comparison result is that the white balance consistency evaluation value is larger than the white balance threshold, indicating that the white balance consistency of the N cameras is poor, and prompting a user to optimize the white balance consistency of the N cameras by adjusting parameters influencing the white balance in the N cameras; when the photographing consistency evaluation value is the color accuracy consistency evaluation value, the color accuracy consistency evaluation value is compared with the color accuracy threshold, if the comparison result is that the color accuracy consistency evaluation value is larger than the color accuracy threshold, the color accuracy consistency of the N cameras is poor, and a user can be prompted to optimize the color accuracy consistency of the N cameras by adjusting the parameters influencing the color accuracy in the N cameras.
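The per-metric threshold comparison described above can be sketched as follows. The threshold values and the measured evaluation values below are illustrative assumptions; this excerpt does not specify concrete numbers.

```python
def judge_consistency(evaluation_value, threshold):
    """Return a verdict for one consistency metric.

    Larger evaluation values mean worse consistency, so a value above the
    threshold indicates the related parameters should be optimized.
    """
    if evaluation_value > threshold:
        return "poor: optimize related parameters"
    return "good"

# Illustrative thresholds per metric type (assumed values).
thresholds = {"brightness": 2.0, "white_balance": 1.5, "color_accuracy": 3.0}

# Illustrative measured evaluation values for one device.
results = {
    "brightness": judge_consistency(2.4, thresholds["brightness"]),
    "white_balance": judge_consistency(0.8, thresholds["white_balance"]),
    "color_accuracy": judge_consistency(3.0, thresholds["color_accuracy"]),
}
```

Note that a value exactly equal to its threshold counts as "good", matching the "less than or equal to" branch in the text.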
According to the embodiment of the application, after the photographing consistency evaluation values of the N cameras are obtained, the photographing consistency degrees of the N cameras can be obtained by comparing the photographing consistency evaluation values with the corresponding evaluation threshold values, and therefore the photographing consistency evaluation of the N cameras is achieved.
Referring to fig. 5, a schematic structural diagram of a photographing consistency evaluation apparatus provided in the third embodiment of the present application is shown, and for convenience of description, only the parts related to the third embodiment of the present application are shown.
The photographing consistency evaluating device includes:
the image card photographing module 51 is configured to photograph the test image card through N cameras respectively to obtain N test image card images, where N is an integer greater than 1, and each of the N cameras corresponds to one test image card image;
an evaluation value obtaining module 52, configured to obtain, according to the N test card images, photographing consistency evaluation values of the N cameras;
and the consistency obtaining module 53 is configured to obtain the photographing consistency degrees of the N cameras according to the photographing consistency evaluation values of the N cameras.
Optionally, the photographing consistency evaluation values of the N cameras include brightness consistency evaluation values of the N cameras, the color modes of the N test chart images are Lab modes, and the evaluation value obtaining module includes:
the first acquisition unit is used for acquiring the brightness values of M gray-scale color blocks in the test chart in N test chart images, wherein M is an integer greater than 1;
the second acquisition unit is used for acquiring the standard deviation of the brightness value of each gray color block and the weight of each gray color block according to the brightness value of each gray color block in the M gray color blocks in the N test chart card images;
and the third acquisition unit is used for acquiring the brightness consistency evaluation values of the N cameras according to the standard deviation of the brightness value of each gray color block and the weight of each gray color block.
Optionally, the second obtaining unit is specifically configured to:
calculating the average brightness value of each gray color block according to the brightness value of each gray color block in the N test chart card images;
acquiring the standard deviation of the brightness value of each gray color block according to the average brightness value of each gray color block and the brightness value of each gray color block in the N test chart card images;
and acquiring the weight of each gray color block according to the average brightness value of each gray color block.
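The brightness branch above (per-gray-patch mean and standard deviation over the N images, combined with per-patch weights) can be sketched as follows. The patent derives each patch's weight from its average brightness, but the formula is not reproduced in this excerpt, so uniform weights are assumed by default; the sample L-values are illustrative.

```python
import statistics

def brightness_consistency(patch_luma, weights=None):
    """Brightness consistency evaluation sketch.

    patch_luma : list of M rows; row m holds the L-values of gray patch m
                 in each of the N test chart images.
    weights    : per-patch weights; derived from average brightness in the
                 patent, but that formula is not in this excerpt, so
                 uniform weights are assumed here.
    """
    if weights is None:
        weights = [1.0] * len(patch_luma)
    # Population standard deviation of each patch's brightness across cameras.
    stds = [statistics.pstdev(row) for row in patch_luma]
    # Weighted combination of the per-patch standard deviations.
    return sum(w * s for w, s in zip(weights, stds)) / sum(weights)

# Three gray patches photographed by N = 3 cameras (illustrative L-values).
value = brightness_consistency([
    [50.0, 52.0, 54.0],   # patch with visible spread across cameras
    [80.0, 80.0, 80.0],   # perfectly consistent patch
    [20.0, 21.0, 22.0],
])
```

A perfectly consistent patch contributes a standard deviation of zero, pulling the evaluation value down, which matches the "smaller is better" interpretation of the score.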
Optionally, the photographing consistency evaluation values of the N cameras include white balance consistency evaluation values of the N cameras, the color mode of the N test card images is an RGB mode, and the evaluation value acquisition module includes:
the fourth acquisition unit is used for acquiring the R component, the G component and the B component of the target gray-scale color block in the test chart in the N test chart images;
and the fifth acquisition unit is used for acquiring the white balance consistency evaluation values of the N cameras according to the R component, the G component and the B component of the target gray color block in the N test chart images.
Optionally, the fifth obtaining unit is specifically configured to:
acquiring a white balance difference value of each image group according to the R component, G component and B component of the target gray color block in each image group, wherein any two test chart images in the N test chart images form one image group, the N test chart images form C(N,2) image groups, and the C(N,2) image groups correspond to C(N,2) white balance difference values;

and acquiring the white balance consistency evaluation values of the N cameras according to the C(N,2) white balance difference values.
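The exact white-balance difference formula is not reproduced in this excerpt. One common approach, sketched here purely as an assumption, compares the R/G and B/G ratios of the target gray patch between the two images of each of the C(N,2) pairs and averages the per-pair differences.

```python
from itertools import combinations

def white_balance_consistency(gray_rgb):
    """White balance consistency sketch over C(N,2) camera pairs.

    gray_rgb : list of (R, G, B) values of the target gray patch, one per
               test chart image. The per-pair difference below (sum of
               absolute R/G and B/G ratio differences) is an assumed
               formula; the patent's exact expression is not in this excerpt.
    """
    diffs = []
    for (r1, g1, b1), (r2, g2, b2) in combinations(gray_rgb, 2):
        diffs.append(abs(r1 / g1 - r2 / g2) + abs(b1 / g1 - b2 / g2))
    # One difference per image group; aggregate by averaging.
    return sum(diffs) / len(diffs)

# Three cameras: two identical, one with a red cast (illustrative values).
value = white_balance_consistency(
    [(128, 128, 128), (128, 128, 128), (160, 128, 128)]
)
```

Using channel ratios rather than raw values makes the sketch insensitive to overall exposure differences, which the separate brightness metric already covers.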
Optionally, the photographing consistency evaluation values of the N cameras include color accuracy consistency evaluation values of the N cameras, the color modes of the N test chart images are Lab modes, and the evaluation value obtaining module includes:
a sixth acquiring unit, configured to acquire the a-component and b-component of each of the H color blocks in the test chart in C(N,2) image groups, wherein any two test card images in the N test card images form one image group, the N test card images form C(N,2) image groups, and the H color blocks refer to the color blocks in the test chart other than the gray color blocks;

a seventh acquiring unit, configured to acquire the color difference values of each of the H color blocks in the C(N,2) image groups according to the a-component and b-component of each of the H color blocks in each image group;

an eighth acquiring unit, configured to acquire the respective target color difference values of the H color blocks according to the color difference values of each of the H color blocks in the C(N,2) image groups;

a ninth acquiring unit, configured to acquire the weights of the H color blocks;

and a tenth acquiring unit, configured to acquire the color accuracy consistency evaluation values of the N cameras according to the respective target color difference values of the H color blocks and the weights of the H color blocks.
Optionally, the consistency obtaining module 53 is specifically configured to:
comparing the photographing consistency evaluation values of the N cameras with an evaluation threshold value to obtain a comparison result;
and according to the comparison result, acquiring the photographing consistency degrees of the N cameras.
The photographing consistency evaluation device provided in the embodiment of the present application can be applied to the first method embodiment and the second method embodiment, and for details, reference is made to the description of the first method embodiment and the second method embodiment, and details are not repeated here.
Fig. 6 is a schematic structural diagram of a mobile terminal according to a fourth embodiment of the present application. The mobile terminal as shown in the figure may include: one or more processors 601 (only one shown); one or more input devices 602 (only one shown), one or more output devices 603 (only one shown), and memory 604. The processor 601, the input device 602, the output device 603, and the memory 604 are connected by a bus 605. The memory 604 is used for storing instructions, and the processor 601 is used for implementing the steps in the above-mentioned each photographing consistency evaluation method embodiment when the instructions stored in the memory 604 are executed.
It should be understood that, in the embodiment of the present Application, the Processor 601 may be a Central Processing Unit (CPU), and the Processor may also be other general purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, and the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The input device 602 may include a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of the fingerprint), a microphone, a data receiving interface, and the like. The output device 603 may include a display (LCD, etc.), speakers, a data transmission interface, and the like.
The memory 604 may include both read-only memory and random access memory, and provides instructions and data to the processor 601. A portion of the memory 604 may also include non-volatile random access memory. For example, the memory 604 may also store device type information.
In a specific implementation, the processor 601, the input device 602, the output device 603, and the memory 604 described in this embodiment of the present application may execute the implementation described in the embodiment of the photographing consistency evaluation method provided in this embodiment of the present application, or may execute the implementation described in the photographing consistency evaluation apparatus provided in the third embodiment of the present application, which is not described herein again.
Fig. 7 is a schematic structural diagram of a mobile terminal according to a fifth embodiment of the present application. As shown in fig. 7, the mobile terminal 7 of this embodiment includes: one or more processors 70 (only one of which is shown), a memory 71, and a computer program 72 stored in the memory 71 and executable on the at least one processor 70. The steps in the above-described respective photographing consistency evaluation method embodiments are implemented when the processor 70 executes the computer program 72.
The mobile terminal 7 may be a desktop computer, a notebook, a palmtop computer, a cloud server, or other computing device. The mobile terminal may include, but is not limited to, a processor 70 and a memory 71. It will be appreciated by those skilled in the art that fig. 7 is only an example of the mobile terminal 7 and does not constitute a limitation of it; the mobile terminal may comprise more or fewer components than those shown, combine some components, or have different components. For example, the mobile terminal may further comprise input/output devices, network access devices, buses, etc.
The processor 70 may be a central processing unit CPU, but may also be other general purpose processors, digital signal processors DSP, application specific integrated circuits ASIC, off-the-shelf programmable gate arrays FPGA or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The storage 71 may be an internal storage unit of the mobile terminal 7, such as a hard disk or a memory of the mobile terminal 7. The memory 71 may also be an external storage device of the mobile terminal 7, such as a plug-in hard disk provided on the mobile terminal 7, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the memory 71 may also include both an internal storage unit of the mobile terminal 7 and an external storage device. The memory 71 is used for storing computer programs and other programs and data required by the mobile terminal. The memory 71 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/mobile terminal and method may be implemented in other ways. For example, the above-described apparatus/mobile terminal embodiments are merely illustrative, and for example, a division of modules or units is merely a logical division, and an actual implementation may have another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow of the methods in the above embodiments may be implemented by a computer program, which is stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunications signals.
When a computer program product runs on the mobile terminal, the mobile terminal implements the steps in the above method embodiments when executing the computer program product.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A photographing consistency evaluation method is characterized by comprising the following steps:
respectively photographing the test chart through N cameras to obtain N test chart images, wherein N is an integer greater than 1, and the N cameras respectively correspond to one test chart image;
acquiring photographing consistency evaluation values of the N cameras according to the N test chart images;
and acquiring the photographing consistency degree of the N cameras according to the photographing consistency evaluation values of the N cameras.
2. The photographing consistency evaluation method according to claim 1, wherein the photographing consistency evaluation values of the N cameras comprise brightness consistency evaluation values of the N cameras, the color mode of the N test card images is a Lab mode, and the obtaining the photographing consistency evaluation values of the N cameras according to the N test card images comprises:
acquiring brightness values of M gray color blocks in the test chart in the N test chart images respectively, wherein M is an integer larger than 1;
acquiring the standard deviation of the brightness value of each gray color block in the M gray color blocks and the weight of each gray color block according to the brightness value of each gray color block in the N test chart card images;
and acquiring the brightness consistency evaluation values of the N cameras according to the standard deviation of the brightness value of each gray color block and the weight of each gray color block.
3. The photographing consistency evaluation method according to claim 2, wherein the acquiring, according to the brightness value of each gray color block of the M gray color blocks in the N test chart card images, the standard deviation of the brightness value of each gray color block and the weight of each gray color block comprises:
calculating the average brightness value of each gray color block according to the brightness value of each gray color block in the N test chart card images;
acquiring a standard deviation of the brightness value of each gray color block according to the average brightness value of each gray color block and the brightness value of each gray color block in the N test chart card images;
and acquiring the weight of each gray color block according to the average brightness value of each gray color block.
4. The photographing consistency evaluation method according to claim 1, wherein the photographing consistency evaluation values of the N cameras comprise white balance consistency evaluation values of the N cameras, the color mode of the N test card images is an RGB mode, and the obtaining the photographing consistency evaluation values of the N cameras according to the N test card images comprises:
acquiring R components, G components and B components of target gray color blocks in the test chart in the N test chart images;
and acquiring the white balance consistency evaluation values of the N cameras according to the R component, the G component and the B component of the target gray color block in the N test chart images.
5. The photographing consistency evaluation method according to claim 4, wherein the obtaining the white balance consistency evaluation values of the N cameras according to the R component, the G component and the B component of the target gray-scale color block in the N test chart images comprises:
acquiring a white balance difference value of each image group according to the R component, the G component and the B component of the target gray color block in each image group, wherein any two test chart images in the N test chart images form one image group, the N test chart images form C(N,2) image groups, and the C(N,2) image groups correspond to C(N,2) white balance difference values;

and acquiring the white balance consistency evaluation values of the N cameras according to the C(N,2) white balance difference values.
6. The photographing consistency evaluation method according to claim 1, wherein the photographing consistency evaluation values of the N cameras comprise color accuracy consistency evaluation values of the N cameras, the color mode of the N test card images is a Lab mode, and the obtaining the photographing consistency evaluation values of the N cameras according to the N test card images comprises:
acquiring the a-component and the b-component of each of the H color blocks in the test chart in C(N,2) image groups, wherein any two test card images in the N test card images form one image group, the N test card images form C(N,2) image groups, and the H color blocks refer to the color blocks in the test chart other than the gray color blocks;

acquiring the color difference values of each of the H color blocks in the C(N,2) image groups according to the a-component and the b-component of each of the H color blocks in each image group;

acquiring the target color difference values of the H color blocks according to the color difference values of each of the H color blocks in the C(N,2) image groups;
acquiring the weights of the H color blocks;
and acquiring the color accuracy consistency evaluation values of the N cameras according to the respective target color difference values of the H color blocks and the weights of the H color blocks.
7. The photographing consistency evaluation method according to any one of claims 1 to 6, wherein the obtaining the photographing consistency degrees of the N cameras according to the photographing consistency evaluation values of the N cameras comprises:
comparing the photographing consistency evaluation values of the N cameras with an evaluation threshold value to obtain a comparison result;
and acquiring the photographing consistency degree of the N cameras according to the comparison result.
8. A photographing consistency evaluating apparatus, characterized by comprising:
the image card photographing module is used for photographing the test image card through N cameras respectively to obtain N test image card images, wherein N is an integer larger than 1, and the N cameras respectively correspond to one test image card image;
the evaluation value acquisition module is used for acquiring the photographing consistency evaluation values of the N cameras according to the N test card images;
and the consistency acquisition module is used for acquiring the photographing consistency degrees of the N cameras according to the photographing consistency evaluation values of the N cameras.
9. A mobile terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the photographing consistency evaluation method according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, carries out the steps of the photographing consistency evaluation method according to any one of claims 1 to 7.
CN202010802853.2A 2020-08-11 2020-08-11 Photographing consistency evaluation method and device, mobile terminal and storage medium Active CN111800626B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010802853.2A CN111800626B (en) 2020-08-11 2020-08-11 Photographing consistency evaluation method and device, mobile terminal and storage medium

Publications (2)

Publication Number Publication Date
CN111800626A true CN111800626A (en) 2020-10-20
CN111800626B CN111800626B (en) 2022-04-26

Family

ID=72833908

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010802853.2A Active CN111800626B (en) 2020-08-11 2020-08-11 Photographing consistency evaluation method and device, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111800626B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001292461A (en) * 2000-04-06 2001-10-19 Sony Corp Performance evaluation system for camera and performance evaluation method for camera
CN106303505A (en) * 2015-05-29 2017-01-04 鸿富锦精密工业(深圳)有限公司 Camera module color consistency detection device and detection method
CN108012142A (en) * 2017-10-20 2018-05-08 上海与德科技有限公司 Camera testing system, method, terminal and computer-readable recording medium
CN110490938A (en) * 2019-08-05 2019-11-22 Oppo广东移动通信有限公司 For verifying the method, apparatus and electronic equipment of camera calibration parameter

Also Published As

Publication number Publication date
CN111800626B (en) 2022-04-26

Similar Documents

Publication Publication Date Title
CN111654594B (en) Image capturing method, image capturing apparatus, mobile terminal, and storage medium
Mantiuk et al. High-dynamic range imaging pipeline: perception-motivated representation of visual content
CN104580922B (en) A kind of control method and device for shooting light filling
CN113192470B (en) Screen adjusting method and device, storage medium and electronic equipment
Chakrabarti et al. Modeling radiometric uncertainty for vision with tone-mapped color images
CN109040596B (en) Method for adjusting camera, mobile terminal and storage medium
CN108200351A (en) Image pickup method, terminal and computer-readable medium
CN112840636A (en) Image processing method and device
US20170193643A1 (en) Image processing apparatus, image processing method, program, and recording medium
TW202137133A (en) Image processing method, electronic device and computer readable storage medium
WO2019029573A1 (en) Image blurring method, computer-readable storage medium and computer device
WO2023071933A1 (en) Camera photographing parameter adjustment method and apparatus and electronic device
WO2023077981A1 (en) Display parameter adjusting method and apparatus, storage medium, and display device
CN110618852B (en) View processing method, view processing device and terminal equipment
CN108932703B (en) Picture processing method, picture processing device and terminal equipment
CN115065814A (en) Screen color accuracy detection method and device
CN114764771A (en) Image quality evaluation method, device, equipment, chip and storage medium
CN112489144A (en) Image processing method, image processing apparatus, terminal device, and storage medium
CN111800626B (en) Photographing consistency evaluation method and device, mobile terminal and storage medium
WO2020155072A1 (en) Mixed layer processing method and apparatus
CN111602390A (en) Terminal white balance processing method, terminal and computer readable storage medium
CN116843566A (en) Tone mapping method, tone mapping device, display device and storage medium
CN114820514A (en) Image processing method and electronic equipment
CN115439577A (en) Image rendering method and device, terminal equipment and storage medium
CN111479074A (en) Image acquisition method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant