CN115118963A - Image quality adjusting method, electronic device and storage medium - Google Patents


Info

Publication number
CN115118963A
CN115118963A (application CN202110293776.7A)
Authority
CN
China
Prior art keywords
image
picture
test
verification
image quality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110293776.7A
Other languages
Chinese (zh)
Inventor
李庄
王宇冬
夏永霖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority: CN202110293776.7A
Publication: CN115118963A
Legal status: Pending

Classifications

    • H04N 17/04: Diagnosis, testing or measuring for television systems or their details, for receivers
    • H04N 21/431: Generation of visual interfaces for content selection or interaction; content or additional data rendering
    • H04N 21/4854: End-user interface for client configuration for modifying image parameters, e.g. image brightness, contrast

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

An embodiment of the application provides an image quality adjustment method, an electronic device, and a storage medium, relating to the field of communication technology. The method includes: establishing a communication connection with a second device in response to a preset operation by a user; sending a display instruction to the second device, where the display instruction instructs the second device to display a test picture; in response to receiving a first shooting instruction sent by the second device, shooting the test picture displayed by the second device to obtain a test image; and calculating image quality parameters of the test image and sending them to the second device. The method can simplify user operation, improve the efficiency of image quality adjustment, and reduce its cost.

Description

Image quality adjusting method, electronic device and storage medium
Technical Field
Embodiments of the application relate to the field of communication technology, and in particular to an image quality adjustment method, an electronic device, and a storage medium.
Background
With the continuous development of the television industry and panel technology, more and more households own smart televisions. Generally, before a smart television leaves the factory, its image quality parameters are tuned by a professional image quality engineer and built into the device. In a home scenario, however, the ambient light in the user's home differs from the ambient light under which the factory calibration was performed, so the colors the user perceives differ from the optimally calibrated colors and the viewing experience is degraded.
Disclosure of Invention
Embodiments of the application provide an image quality adjustment method, an electronic device, and a storage medium, which aim to provide a way to adjust the image quality of a smart television with a mobile phone.
In a first aspect, an embodiment of the present application provides an image quality adjustment method applied to a first device, including:
establishing a communication connection with a second device in response to a preset operation by a user; specifically, the first device may be an electronic device with a shooting function, such as a mobile phone, and the communication connection may be a local connection (for example, a WIFI connection) or an internet connection;
sending a display instruction to the second device, where the display instruction instructs the second device to display a test picture; in particular, the display instruction may include the type of test picture, which tells the second device what type of picture to display. Test picture types may include a normal type and a special type: normal pictures can be used to correct color accuracy and brightness, while special pictures can be used to correct white balance;
in response to receiving a first shooting instruction sent by the second device, shooting the test picture displayed by the second device to obtain a test image; specifically, the test image corresponds to the test picture;
and calculating image quality parameters of the test image and sending them to the second device; specifically, the image quality parameters may include color coordinates and brightness.
In this embodiment, the mobile phone shoots the test picture displayed on the large screen, calculates the image quality parameters of the captured image, and sends those parameters to the large screen, which can then make a judgment based on the received parameters. Image quality correction is thereby achieved in a way that is convenient for the user, improves the efficiency of image quality adjustment, and reduces its cost.
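The first-device flow described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the message format, the simple metric in `compute_quality_params` (mean brightness plus normalized rg color coordinates), and the `send`/`capture` callbacks are all assumptions.

```python
# Illustrative sketch of the first-device (mobile phone) flow. All names and
# message formats are hypothetical; transport and camera are stubbed out.

def compute_quality_params(test_image):
    """Assumed metric: mean brightness and normalized rg color coordinates of
    an RGB image given as a list of (r, g, b) pixel tuples."""
    n = len(test_image)
    r = sum(p[0] for p in test_image) / n
    g = sum(p[1] for p in test_image) / n
    b = sum(p[2] for p in test_image) / n
    total = r + g + b
    return {"brightness": total / 3,
            "color_coords": (r / total, g / total)}

def run_first_device(send, capture):
    """send(msg) talks to the second device; capture() returns the test image
    shot in response to the second device's shooting instruction."""
    send({"type": "display_instruction", "picture_type": "normal"})
    image = capture()
    params = compute_quality_params(image)
    send({"type": "quality_params", **params})
    return params
```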
In one possible implementation manner, before sending the display instruction to the second device, the method further includes:
and detecting whether the ambient light and the test distance meet preset conditions, and determining whether to send a display instruction to the second equipment according to a detection result.
In the embodiment of the application, the mobile phone is subjected to ambient light and test distance detection before the test picture is shot, so that the shooting quality can be further improved, and the calculation accuracy of the picture quality parameters of the test picture can be further improved.
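A minimal sketch of this precondition check, assuming illustrative thresholds (the patent does not specify any values):

```python
# Precondition check before sending the display instruction.
# Both thresholds below are illustrative assumptions, not values from the patent.

AMBIENT_LUX_RANGE = (5.0, 300.0)   # assumed acceptable ambient light, in lux
MAX_TEST_DISTANCE_M = 3.0          # assumed maximum phone-to-screen distance

def preconditions_met(ambient_lux, distance_m):
    """Return True only if both ambient light and test distance are in range,
    i.e. the first device should proceed to send the display instruction."""
    lo, hi = AMBIENT_LUX_RANGE
    return lo <= ambient_lux <= hi and 0 < distance_m <= MAX_TEST_DISTANCE_M
```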
In one possible implementation manner, after sending the image quality parameter of the test image to the second device, the method further includes:
in response to receiving a second shooting instruction sent by the second device, shooting a verification picture displayed by the second device to obtain a verification image;
and calculating image quality parameters of the verification image and sending them to the second device.
In this embodiment, after the test picture has been evaluated, a verification picture is displayed to further check the picture after image quality adjustment, which improves the accuracy of the adjustment.
In one possible implementation manner, the method further includes:
and receiving an image quality adjustment completion notification sent by the second device, wherein the image quality adjustment completion notification is used for notifying the first device that the image quality adjustment is completed.
In one possible implementation, the image quality parameters include color coordinates and brightness.
An embodiment of the present application further provides an image quality adjustment method applied to a second device, where the second device has established a communication connection with a first device, including:
displaying a test picture in response to receiving a display instruction sent by the first device; sending a first shooting instruction to the first device; receiving image quality parameters of a test image sent by the first device, where the test image corresponds to the test picture; and adjusting the image quality of the test picture based on those image quality parameters. Specifically, the second device may be an electronic device with a display function, such as a large screen.
In one possible implementation manner, the method further includes:
displaying a verification picture, where the verification picture is obtained by adjusting the image quality of the test picture;
receiving image quality parameters of a verification image sent by the first device, where the verification image corresponds to the verification picture;
and adjusting the image quality of the verification picture based on the image quality parameters of the verification image.
In one possible implementation manner, adjusting the image quality of the verification picture based on the image quality parameters of the verification image includes:
comparing the image quality parameters of the verification image with preset image quality parameters, and determining from the comparison result whether the image quality needs further adjustment;
if adjustment is needed, displaying the test picture again to complete the image quality adjustment;
and if no adjustment is needed, sending an image quality adjustment completion notification to the first device.
To improve the flexibility of selecting verification pictures and the efficiency of verification, in one possible implementation manner the test picture includes a plurality of pictures, and the number of verification pictures is determined by the number of test pictures.
In one possible implementation manner, the test picture is a single special picture used to correct white balance, and displaying the test picture includes:
displaying the single special picture.
In one possible implementation manner, the test pictures are a plurality of normal pictures used to correct color accuracy and color gamut, and displaying the test pictures includes:
displaying the plurality of normal pictures in sequence.
In a second aspect, an embodiment of the present application provides an image quality adjustment apparatus, applied to a first device, including:
the communication connection module is used to establish a communication connection with a second device in response to a preset operation by a user;
the indication module is used to send a display instruction to the second device, the display instruction instructing the second device to display a test picture;
the first shooting module is used to shoot, in response to receiving a first shooting instruction sent by the second device, the test picture displayed by the second device to obtain a test image;
the first calculation module is used to calculate the image quality parameters of the test image;
and the first sending module is used to send the image quality parameters of the test image to the second device.
In one possible implementation manner, the apparatus further includes:
and the detection module is used for detecting whether the ambient light and the test distance meet preset conditions or not and determining whether to send a display instruction to the second equipment or not according to the detection result.
In one possible implementation manner, the apparatus further includes:
the second shooting module is used to shoot, in response to receiving a second shooting instruction sent by the second device, a verification picture displayed by the second device to obtain a verification image;
the second calculation module is used to calculate the image quality parameters of the verification image;
and the second sending module is used to send the image quality parameters of the verification image to the second device.
In one possible implementation manner, the apparatus further includes:
the receiving module is used to receive an image quality adjustment completion notification sent by the second device, where the notification informs the first device that the image quality adjustment is complete.
In one possible implementation, the image quality parameters include color coordinates and brightness.
An embodiment of the present application further provides an image quality adjustment apparatus applied to a second device, where the second device has established a communication connection with a first device, including:
the display module is used to display a test picture in response to receiving a display instruction sent by the first device;
the indication module is used to send a first shooting instruction to the first device;
the receiving module is used to receive image quality parameters of a test image sent by the first device, where the test image corresponds to the test picture;
and the image quality adjustment module is used to adjust the image quality of the test picture based on the image quality parameters of the test image.
In one possible implementation manner, the apparatus further includes:
the verification module is used to display a verification picture, where the verification picture is obtained by adjusting the image quality of the test picture; receive image quality parameters of a verification image sent by the first device, where the verification image corresponds to the verification picture; and adjust the image quality of the verification picture based on the image quality parameters of the verification image.
In one possible implementation manner, the image quality adjustment module is further used to compare the image quality parameters of the verification image with preset image quality parameters and determine from the comparison result whether the image quality needs further adjustment; if adjustment is needed, display the test picture again to complete the image quality adjustment; and if no adjustment is needed, send an image quality adjustment completion notification to the first device.
In one possible implementation manner, the test picture includes a plurality of pictures, and the number of the verification pictures is determined by the number of the test pictures.
In one possible implementation manner, the test picture is a single special picture used to correct white balance, and the display module is further used to display the single special picture.
In one possible implementation manner, the test pictures are a plurality of normal pictures used to correct color accuracy and color gamut, and the display module is further used to display the plurality of normal pictures in sequence.
In a third aspect, an embodiment of the present application provides a first device, including:
a memory for storing computer program code, the computer program code including instructions that, when read from the memory, cause the first device to perform the steps of:
establishing a communication connection with a second device in response to a preset operation by a user;
sending a display instruction to the second device, where the display instruction instructs the second device to display a test picture;
in response to receiving a first shooting instruction sent by the second device, shooting the test picture displayed by the second device to obtain a test image;
and calculating the image quality parameters of the test image and sending them to the second device.
In one possible implementation manner, when the instruction is executed by the first device, before the first device executes the step of sending the display instruction to the second device, the following steps are further executed:
and detecting whether the ambient light and the test distance meet preset conditions, and determining whether to send a display instruction to the second equipment according to a detection result.
In one possible implementation manner, when the instruction is executed by the first device, after the first device executes the step of sending the image quality parameter of the test image to the second device, the following steps are further executed:
in response to receiving a second shooting instruction sent by the second device, shooting a verification picture displayed by the second device to obtain a verification image;
and calculating the image quality parameters of the verification image and sending them to the second device.
In one possible implementation manner, when the instruction is executed by the first device, the first device further performs the following steps:
and receiving an image quality adjustment completion notification sent by the second device, wherein the image quality adjustment completion notification is used for notifying the first device that the image quality adjustment is completed.
In one possible implementation, the image quality parameters include color coordinates and brightness.
An embodiment of the present application further provides a second device, including:
a memory for storing computer program code, the computer program code including instructions that, when read from the memory by the second device, cause the second device to perform the steps of:
displaying a test picture in response to receiving a display instruction sent by the first device;
sending a first shooting instruction to the first device;
receiving image quality parameters of a test image sent by the first device, where the test image corresponds to the test picture;
and adjusting the image quality of the test picture based on the image quality parameters of the test image.
In one possible implementation manner, when the instruction is executed by the second device, the second device further performs the following steps:
displaying a verification picture, where the verification picture is obtained by adjusting the image quality of the test picture;
receiving image quality parameters of a verification image sent by the first device, where the verification image corresponds to the verification picture;
and adjusting the image quality of the verification picture based on the image quality parameters of the verification image.
In one possible implementation manner, when the instructions are executed by the second device, the step of adjusting the image quality of the verification picture based on the image quality parameters of the verification image includes:
comparing the image quality parameters of the verification image with preset image quality parameters, and determining from the comparison result whether the image quality needs further adjustment;
if adjustment is needed, displaying the test picture again to complete the image quality adjustment;
and if no adjustment is needed, sending an image quality adjustment completion notification to the first device.
In one possible implementation manner, the test picture includes a plurality of pictures, and the number of the verification pictures is determined by the number of the test pictures.
In one possible implementation manner, the test picture is a single special picture used to correct white balance, and when the instructions are executed by the second device, displaying the test picture includes:
displaying the single special picture.
In one possible implementation manner, the test pictures are a plurality of normal pictures used to correct color accuracy and color gamut, and when the instructions are executed by the second device, displaying the test pictures includes:
displaying the plurality of normal pictures in sequence.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having stored thereon a computer program, which, when run on a computer, causes the computer to perform the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program which, when executed by a computer, performs the method according to the first aspect.
In a possible design, the program in the fifth aspect may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
Drawings
Fig. 1 is a schematic diagram of a display effect provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of the colorimeter according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a color correction system according to an embodiment of the present application;
fig. 4 is a schematic diagram illustrating an effect of a filter configuration according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a CIE power response curve provided by an embodiment of the present application;
fig. 6 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 7 is a schematic flowchart illustrating an embodiment of a method for adjusting image quality according to the present application;
fig. 8 is a schematic view of a WIFI connection process provided in the embodiment of the present application;
fig. 9 is a schematic diagram of a chromaticity mapping provided in an embodiment of the present application;
FIG. 10 is a schematic representation of Moire patterns provided by embodiments of the present application;
fig. 11 is a schematic structural diagram of a CNN model provided in an embodiment of the present application;
FIG. 12 is a diagram illustrating the effect of logarithmic distribution of signals according to an embodiment of the present application;
FIG. 13 is a schematic diagram of training effects provided by an embodiment of the present application;
fig. 14 is a schematic flowchart illustrating another embodiment of a method for adjusting image quality according to the present application;
fig. 15 is a schematic structural diagram of an embodiment of an image quality adjustment apparatus according to the present application;
fig. 16 is a schematic structural diagram of another embodiment of an image quality adjustment apparatus according to the present application;
fig. 17 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the drawings. In the description of these embodiments, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" merely describes an association between objects and covers three cases; for example, "A and/or B" may mean: A alone, both A and B, or B alone.
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, the meaning of "a plurality" is two or more unless otherwise specified.
At present, smart television screens are generally of two types: LCD and OLED. An LCD screen substrate is made of inorganic glass, while an OLED substrate is made of organic polymer materials. An OLED screen gradually ages, and its self-luminous nature also brings the following disadvantages:
a. brightness decline;
b. screen color cast;
c. image retention (burn-in).
Because of these factors, the user's viewing experience may be poor; fig. 1 shows the resulting display effect.
In the prior art, one way to address these problems is to use a professional device (e.g., a colorimeter) to collect the RGB primary color coordinates and luminance value of each pixel, connect the device to the smart television through a computer, and have a professional perform manual adjustment.
A colorimeter imitates the way the human eye perceives color; fig. 2 shows its working principle. As shown in fig. 2, incident light emitted from the colorimeter's light source strikes the surface of the sample under measurement; after the light is partially absorbed by the sample and reflected back, it is filtered by three color filters (e.g., RGB red, green, and blue filters), which extract the three stimulus (RGB) values to match the color seen by the user's eye.
Fig. 3 shows a color correction system. As shown in fig. 3, the mode selection device transmits standard patterns (pure green, pure red, pure white, etc.) to the television, and the computer obtains the RGB values of the standard patterns from the pattern terminal, while the colorimeter measures the RGB values of the current screen. The color software takes the pattern's RGB values on the one hand and the colorimeter's RGB values on the other, calculates the RGB differences, and generates a configuration file containing information such as color characteristics and color gamut range. The computer then transmits the configuration file to the television, which generates a corresponding Picture Quality (PQ) file.
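The profile-generation step described above, taking the per-channel difference between each standard pattern's RGB values and the values measured on screen, can be sketched as follows. The dict-of-deltas structure is an assumption; the patent only says a configuration file is generated.

```python
# Hedged sketch of correction-profile generation: per-channel deltas
# (pattern - measured) for each standard test pattern.

def build_correction_profile(pattern_rgb, measured_rgb):
    """Return per-channel deltas (pattern - measured) for each test pattern."""
    return {name: tuple(p - m for p, m in zip(pattern_rgb[name], measured_rgb[name]))
            for name in pattern_rgb}

profile = build_correction_profile(
    {"pure_red": (255, 0, 0), "pure_white": (255, 255, 255)},
    {"pure_red": (248, 4, 2), "pure_white": (250, 252, 240)},
)
# e.g. profile["pure_red"] == (7, -4, -2): the screen's red channel reads low
```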
However, the colorimeter and the other devices used in this method are expensive, which imposes a large cost on enterprises or individuals, and the test requires professional color software and trained personnel, making the operation inconvenient.
Another way is to use an imaging spectral colorimeter. In its optical structure, a mobile phone camera, in order to stay light and thin, generally uses a Bayer array filter integrated directly onto the pixels of a Complementary Metal-Oxide-Semiconductor (CMOS) sensor and obtains a complete image through interpolation. The imaging spectral colorimeter, as a professional device, can instead use a rotating-disc filter that conforms to the CIE response curves, so its effective pixel count is higher and its chromaticity fidelity and accuracy are better. Fig. 4 illustrates the effect of the filter configuration.
The two also differ in how they post-process the captured image. A mobile phone's processing makes photos look better, generating visually more refined images, but it introduces a large amount of nonlinear transformation, so the real optical signal is heavily distorted and compressed into the narrow spectral band the human eye is sensitive to. The post-processing of the imaging spectral colorimeter, by contrast, preserves the real information of the object to the greatest extent, which benefits subsequent scientific analysis.
The basic principle of the imaging spectral colorimeter is that any color can be decomposed into three primary colors that cannot themselves be decomposed further; the primary colors used in image capture and display are the three optical primaries red (R), green (G), and blue (B). The CIE power response curves of the three primary colors are shown in fig. 5.
The tristimulus values can then be calculated by the following integral formula:
X = k ∫ S(λ) x̄(λ) dλ
Y = k ∫ S(λ) ȳ(λ) dλ
Z = k ∫ S(λ) z̄(λ) dλ
where S(λ) is the spectral power distribution of the measured light, x̄(λ), ȳ(λ), and z̄(λ) are the CIE color-matching functions, and k is a normalization constant.
the measurement of chromaticity can be converted into the measurement of tristimulus values, and from the view point of the CIE power response curve, the tristimulus curves are mixed, so that the measurement is needed after the separation of the tristimulus by the red light filter, the green light filter and the blue light filter.
However, the imaging spectral colorimeter and the other devices used in this method are likewise expensive, imposing a large cost on enterprises or individuals, and the test again requires professional color software and trained personnel, making the operation inconvenient.
Based on the above problems, an embodiment of the present application provides an image quality adjustment method.
The image quality adjustment method of the embodiments of the present application is now described with reference to fig. 6 to fig. 14. Fig. 6 shows an application scenario of an embodiment of the present application; referring to fig. 6, the scenario includes a first device 10 and a second device 20. The first device 10 may be a mobile terminal (e.g., a mobile phone) with a shooting function, and the second device 20 may be a display device with a large screen (e.g., a smart screen).
A mobile terminal may also be called a terminal device, user equipment (UE), an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a wireless communication device, or a user agent. The mobile terminal may be a cellular telephone, a cordless telephone, a Personal Digital Assistant (PDA) device, a handheld device, a computing or handheld communication device with wireless communication capability, a satellite radio, or another device for communicating over a wireless system or a next-generation communication system, e.g., a mobile terminal in a 5G network or in a future evolved Public Land Mobile Network (PLMN).
Fig. 7 is a schematic flowchart illustrating an embodiment of a method for adjusting image quality according to an embodiment of the present application, including:
in step 701, the first device 10 sends a connection request to the second device 20 to establish a connection between the first device 10 and the second device 20.
Specifically, the user may operate on the first device 10 to initiate an adjustment of the picture quality of the second device 20. Illustratively, the user may open a picture quality adjustment application (APP) in the first device 10. In response to the user's operation, the first device 10 may send a connection request to the second device 20, which may be used to establish a connection between the first device 10 and the second device 20. The connection may be an Internet connection or a local WIFI connection. It is to be understood that the above-described connection modes do not limit the embodiments of the present application.
Fig. 8 is a flowchart of establishing a WIFI connection. As shown in fig. 8, the first device 10 includes a communication module 11 and an interworking service module 12, the second device 20 includes a communication module 21 and an interworking service module 22, and the establishing of the WIFI connection may include the following sub-steps:
in response to the user's operation, the communication module 11 sends a scan request to the interworking service module 12, step 7011.
In step 7012, the interworking service module 12 receives the scan request from the communication module 11 and performs local area network discovery.
Step 7013, interworking service module 12 performs handshake communication with interworking service module 22.
At step 7014, the interworking service module 12 obtains the device ID of the second device 20.
In step 7015, the interworking service module 12 transmits the device ID of the second device 20 to the communication module 11.
In step 7016, the second device 20 to be connected is determined in response to the selection operation of the user.
At step 7017, the communication module 11 sends a connection request to the interworking service module 12.
At step 7018, interworking service module 12 establishes a connection with interworking service module 22 based on the connection request.
At step 7019, interworking service module 22 sends a connection confirmation message to communication module 21 to notify communication module 21 that the connection has been successful.
In step 701A, the interworking service module 12 sends a connection confirmation message to the communication module 11 to notify the communication module 11 that the connection is successful.
In step 702, the first device 10 sends a test picture playing request to the second device 20.
Specifically, after the first device 10 establishes a connection with the second device 20, a test picture playing request may be sent to the second device 20, where the test picture playing request is used to request the second device 20 to display a preset test picture. The test picture may be a standard picture taken by a PQ engineer in a laboratory.
It is understood that the test pictures may include normal pictures and special pictures. The special picture may be a white picture, and the white picture may be used for correcting white balance. The normal picture may include all pictures except the above-described white picture, and the normal picture may be used to correct color accuracy and color gamut. In the embodiment of the present application, the user requests the second device 20 to display a special picture as an example, so that the white balance can be corrected.
Optionally, before the first device 10 sends the test picture playing request to the second device 20, the first device 10 may also detect the current test environment. For example, the first device 10 may calculate the test distance through a Time of Flight (TOF) lens and measure the ambient light through an ambient light sensor, compare the calculated test distance with a preset test distance threshold and the measured ambient light with a preset ambient light threshold, and send the test picture playing request to the second device 20 only if the test distance meets the test distance threshold condition and the ambient light meets the ambient light threshold condition.
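As an illustrative sketch of this environment pre-check (the function name and the threshold values are assumptions, not specified in the present application):

```python
def environment_ok(test_distance_m, ambient_light_lux,
                   max_distance_m=3.0, max_lux=50.0):
    """Return True if the TOF-measured distance and the ambient light
    both satisfy their preset threshold conditions, so that the test
    picture playing request may be sent."""
    return test_distance_m <= max_distance_m and ambient_light_lux <= max_lux
```

If either condition fails, the first device 10 would prompt the user to adjust the distance or lighting instead of sending the request.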
In step 703, the second device 20 receives the test picture playing request sent by the first device 10, and displays the test picture P1.
In step 704, the second device 20 sends a shooting instruction to the first device 10.
Specifically, after the second device 20 displays the test picture P1, a photographing instruction for photographing the test picture P1 displayed in the second device 20 may be transmitted to the first device 10.
In step 705, the first device 10 takes a test picture displayed in the second device 20.
Specifically, the first device 10 may photograph the test picture P1 displayed in the second device 20 through the camera after receiving the photographing instruction transmitted from the second device 20, thereby obtaining the first image P1 ', wherein the first image P1' corresponds to the test picture P1 displayed in the second device 20.
In step 706, the first device 10 calculates the color coordinates and brightness of the first image P1'.
Specifically, after the first device 10 captures the first image P1 ', the first image P1 ' may be input to a preset image quality recognition model for image quality recognition, so that the color coordinates and the brightness of the first image P1 ' may be obtained.
The preset image quality recognition model can be obtained through training. In general, the camera in the first device 10 may include a photoreceptor (sensor) and a Color Filter Array (CFA) through which incident light is converted into an electrical signal (e.g., CFA three primary Color signal), and then subjected to a series of post-processing (e.g., CFA demosaicing, white balancing, gamma correction, and image enhancement steps) to form a picture that is actually seen by the user. Among them, the CFA three primary color signals depend on incident light. Ideally, there is a mapping between the three primary color signals of a unit in the CFA and the spectrum of light impinging on that unit. And the spectrum determines the perception of color by the human eye. Therefore, modeling can be performed according to the mapping relationship, and chromaticity measurement can be performed through the established model, that is, chromaticity sensed by the corresponding human eye can be predicted through the CFA three primary color signals.
In the field of colorimetry, the perception of color by the human eye is commonly modeled by the CIE XYZ color space, in which each color perceived by the human eye corresponds to a unique (X, Y, Z) coordinate. Thus, the colorimetric measurement problem is equivalent to mapping the CFA three primary color signals to XYZ.
Fig. 9 is a schematic diagram of chroma mapping. In the embodiment of the present application, since the second device 20 displays an image, its processing procedure is analogous to the data processing procedure of the camera in the first device 10, but in reverse: it converts an input electrical signal into an optical signal, which likewise requires complex signal processing and calibration procedures. Due to internal factors (component errors) and external factors (the lighting environment in which the user is located), the color signal received by the second device 20 often differs from the color perceived by the user, which affects the visual experience. The significance of the colorimetric measurement of the second device 20 is therefore to quantify this difference.
In a specific implementation, the second device 20 may be configured in a dark room, with the second device 20 as the sole light source. The second device 20 may then be caused to display a series of colors (e.g., may be a single color picture) in the sRGB color gamut. Then, a picture displayed by the second device 20 may be photographed using the first device 10, whereby corresponding RAW data (e.g., CFA three primary color signals) may be generated. By taking photographs of different colors, several data sets can be collected, where the relationship between sRGB coordinates and CIEXYZ coordinates for each color is known, from which pairs of CFA-CIEXYZ data can be derived and the model can be trained.
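The known relationship between a displayed sRGB color and its CIEXYZ coordinates, used above to label the CFA-CIEXYZ data pairs, follows the standard sRGB transform. A minimal sketch (D65 white, function name illustrative):

```python
def srgb_to_xyz(r, g, b):
    """Convert sRGB components in [0, 1] to CIE XYZ (D65) using the
    standard sRGB linearization and RGB-to-XYZ matrix."""
    def lin(c):
        # Undo the sRGB gamma encoding.
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = lin(r), lin(g), lin(b)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    return x, y, z
```

With this mapping, each single-color picture displayed by the second device 20 yields one CFA-CIEXYZ training pair.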
However, due to various limitations of the first device 10 and the environment, the RAW data output by the first device 10 is not a single color. A photographed single-color picture has the following limitations:
a. Wide-angle distortion: the color of the picture is uniform only in the central portion, and the peripheral brightness is darker.
b. Moire fringes: the light intensity of the picture varies periodically, forming roughly transverse fringes with a large period. FIG. 10 is a schematic diagram of the effect of Moire patterns.
c. LED arrangement: when the RAW data is magnified, a clear longitudinal striped texture can be seen; since the display elements of the second device 20 are composed of LED light sources of three primary colors, the spacing between the light sources is clearly visible at high resolution.
In consideration of the above problems of the RAW data, the 256 × 256 CFA units in the center of the whole RAW data map can be extracted as the basis for performing chromaticity calibration of the whole map. Here, each unit includes 4 sensors (for convenience of explanation, "4 sensors" will be simply referred to as "4 channels" hereinafter), whose colors are B, Y, Y and R, respectively.
Next, a training set and a validation set may be constructed. Illustratively, for each color, three frames of RAW data (e.g., A, B and C) may be captured; from each frame, the 256 × 256 × 4 region in the center may be cut out and divided into 8 × 8 = 64 non-overlapping blocks, so that each block has dimensions 32 × 32 × 4. Then, the blocks of A and B may be placed into the training set, and the blocks of C into the validation set, thereby yielding a validation set of 133 × 64 validation data and a training set of 133 × 64 training data.
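The partitioning of a center crop into 64 non-overlapping blocks described above can be sketched as follows (function name illustrative):

```python
import numpy as np

def split_into_blocks(raw_center):
    """Split a 256x256x4 center crop of the RAW data into the
    8x8 = 64 non-overlapping 32x32x4 blocks used as training samples."""
    assert raw_center.shape == (256, 256, 4)
    blocks = [raw_center[i * 32:(i + 1) * 32, j * 32:(j + 1) * 32, :]
              for i in range(8) for j in range(8)]
    return np.stack(blocks)  # shape (64, 32, 32, 4)
```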
After the training set and the validation set are constructed, the image quality recognition model can be built. The image quality recognition model may adopt a CNN model. Illustratively, the CNN model may include 3 × 3 convolutional layers, normalization (BatchNorm) layers, activation (ReLU) layers, and pooling layers, where the pooling layers may include max pooling (MaxPool) layers and average pooling (AveragePool) layers. Fig. 11 is a diagram of the structure of the CNN model. As shown in Fig. 11, the kernel size and stride of the max pooling layer are 2, and the kernel size and stride of the average pooling layer are 4. The input to the CNN model may be a 32 × 32 × 4 CFA signal, and the output of the CNN model may be the CIELab color coordinates L, a and b.
Further, since the distribution of the RAW signal values in each channel is skewed while the distribution of their logarithms is more symmetrical, the CFA signal may also be normalized before being input to the CNN. Illustratively, the logarithm of the CFA signal may be taken first, and the result then normalized. Fig. 12 is a diagram showing the distribution of the signal logarithm.
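A possible sketch of this preprocessing, assuming per-channel zero-mean, unit-variance normalization after the logarithm (the exact normalization constants are not specified in the present application):

```python
import numpy as np

def normalize_cfa(block, eps=1e-6):
    """Take the logarithm of a 32x32x4 CFA block, then normalize each
    channel to zero mean and unit variance before feeding it to the CNN."""
    logged = np.log(block + eps)           # symmetrize the value distribution
    mean = logged.mean(axis=(0, 1), keepdims=True)
    std = logged.std(axis=(0, 1), keepdims=True)
    return (logged - mean) / (std + eps)
```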
Since CIELab has the following characteristics:
1. Perceptual uniformity: the Euclidean distance between two colors in this space is approximately linear with the color difference perceived by the human eye, so the Euclidean distance between the prediction result and the ground truth can be used as the loss function of the network.
2. Symmetry: of the three Lab parameters, L takes values in [0, 100] while a and b take values in [-128, 128], which lends itself well to normalization, and a = b = 0 exactly represents gray.
Therefore, the parameters can be updated directly by using the CIELab coordinates during training, and the CIELab coordinates output by the CNN can be converted into CIEXYZ by formula during prediction.
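The conversion from the CIELab coordinates output by the CNN back to CIEXYZ at prediction time uses the standard formula; a minimal sketch (D65 white point assumed):

```python
def lab_to_xyz(L, a, b, white=(0.9505, 1.0, 1.089)):
    """Standard CIELab -> CIEXYZ conversion, relative to the given
    reference white (D65 by default)."""
    fy = (L + 16.0) / 116.0
    fx = fy + a / 500.0
    fz = fy - b / 200.0
    def f_inv(t):
        # Inverse of the CIELab compression function.
        return t ** 3 if t > 6.0 / 29.0 else 3.0 * (6.0 / 29.0) ** 2 * (t - 4.0 / 29.0)
    xn, yn, zn = white
    return xn * f_inv(fx), yn * f_inv(fy), zn * f_inv(fz)
```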
Further, in actual prediction, the 256 × 256 × 4 region at the center may be cut out and divided into 64 blocks, XYZ coordinates may be predicted for each of the 64 blocks, and the predictions then averaged, so as to enhance robustness.
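The block-wise prediction and averaging described above can be sketched as follows, with `predict_block` standing in for the trained model's inference function:

```python
import numpy as np

def predict_xyz(raw_center, predict_block):
    """Predict XYZ coordinates for each of the 64 center blocks and
    average the results to enhance robustness."""
    blocks = [raw_center[i * 32:(i + 1) * 32, j * 32:(j + 1) * 32, :]
              for i in range(8) for j in range(8)]
    preds = np.array([predict_block(b) for b in blocks])  # shape (64, 3)
    return preds.mean(axis=0)
```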
Fig. 13 is a training effect diagram. As shown in Fig. 13, after 500 training epochs, the average L2 loss on the validation set is < 4e-4; that is, the error of the output CIELab coordinates (the Euclidean distance from the ground truth) is less than 2 on average. Given that the just-noticeable color difference in CIELab space is about 2.3, this is a good result on this data set.
In step 707, the first device 10 sends the color coordinates and the brightness of the first image P1' to the second device 20.
In step 708, the second device 20 receives and stores the color coordinates and brightness of the first image P1'.
In step 709, image quality adjustment is performed based on the color coordinates and luminance of the first image P1'.
Specifically, the image quality may include color coordinates and brightness. Having received the color coordinates and brightness of the first image P1' sent by the first device 10, the second device 20 may match them against preset color coordinates and brightness to adjust the image quality. That is, the color coordinates and brightness of the test picture P1 can be adjusted, thereby obtaining the adjusted test picture T1. The preset standard color coordinates may be obtained from the standard color coordinates of the test picture P1, and the preset standard brightness may be obtained from the standard brightness of the test picture P1. In a specific implementation, the preset color coordinates and brightness may be reference values obtained in a laboratory; in addition, deviation values may also be obtained in the laboratory. The reference values may be a color coordinate reference value and a brightness reference value acquired in the laboratory. A deviation value represents the deviation between the original value of a color coordinate and its actually displayed value, or between the original value of the brightness and its actually displayed value. For example, taking a color coordinate as an example, suppose a color coordinate with an original value of 128 is to be displayed; due to factors such as the surrounding environment, the second device 20 must actually display the value 131 to achieve the effect of displaying 128. That is, the second device 20 actually displays the color coordinate 131, and the user perceives 128. The deviation value is then 131 - 128 = 3.
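The deviation-value correction in this example amounts to subtracting the known deviation from the measured value before comparing with the reference (function name illustrative):

```python
def corrected_value(measured, deviation):
    """Remove the laboratory-measured display deviation from a measured
    color coordinate or brightness value before comparing it with the
    reference value."""
    return measured - deviation
```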
When the offset value is acquired, the second device 20 may calculate a first corrected color coordinate of the first image P1 'based on the offset value, and a first corrected luminance of the first image P1' based on the offset value. Next, the second device 20 may compare the first corrected color coordinates with the reference color coordinates and the first corrected luminance with the reference luminance, thereby determining whether the current image quality needs to be adjusted. And if the first correction color coordinate is not matched with the reference color coordinate or the first correction brightness is not matched with the reference brightness, determining that the current image quality needs to be adjusted. If the first correction color coordinate matches the reference color coordinate and the first correction brightness matches the reference brightness, no image quality adjustment is required.
If the image quality needs to be adjusted, the color coordinate of the test picture P1 can be adjusted based on the color coordinate comparison result; and the brightness of the test picture P1 may be adjusted based on the brightness comparison result. For example, if the first corrected color coordinate is smaller than the reference color coordinate, the value of the color coordinate of the test picture may be increased based on a preset step size, and if the first corrected color coordinate is larger than the reference color coordinate, the value of the color coordinate of the test picture may be decreased based on a preset step size; similarly, if the first correction luminance is smaller than the reference luminance, the value of the luminance of the test picture may be increased based on a preset step size, and if the first correction luminance is larger than the reference luminance, the value of the luminance of the test picture may be decreased based on a preset step size. When the color coordinates and brightness of the test picture P1 are adjusted, the verification picture T1 is obtained, and the verification picture T1 is displayed.
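The step-wise adjustment rule above can be sketched as follows (the function name and the default step size are illustrative):

```python
def adjust_toward(reference, corrected, current_setting, step=1):
    """Nudge a picture setting (a color coordinate or the brightness)
    one preset step toward the reference value."""
    if corrected < reference:
        return current_setting + step  # corrected value too low: increase
    if corrected > reference:
        return current_setting - step  # corrected value too high: decrease
    return current_setting             # already matched: no adjustment
```

Repeating this per capture round converges the displayed picture toward the reference values.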
In step 710, the second device 20 takes the adjusted test picture T1 as the verification picture T1.
In step 711, the second device 20 displays the verification picture T1.
In step 712, the second device 20 sends a shooting instruction to the first device 10.
In step 713, the first device 10 takes the verification picture T1 displayed in the second device 20.
Specifically, the first device 10 takes the authentication picture T1 displayed in the second device 20, whereby a second image T1' may be obtained.
In step 714, the first device 10 calculates the color coordinates and brightness of the second image T1'.
In step 715, the first device 10 sends the color coordinates and brightness of the second image T1' to the second device 20.
In step 716, the second device 20 receives and stores the color coordinates and brightness of the second image T1'.
In step 717, the image quality is adjusted based on the color coordinates and the brightness of the second image T1'.
Specifically, the second device 20 may perform image quality adjustment based on the color coordinates and the brightness of the second image T1'. That is, the second device 20 may determine whether quality adjustment of the verification picture T1 is required based on the color coordinates and the brightness of the second image T1'. It should be noted that for the specific process of adjusting the image quality of the verification picture T1 in this step, reference may be made to step 709, which is not described again here.
In step 718, the second device 20 determines the result of adjusting the image quality of the verification picture T1.
Specifically, after the second device 20 completes the image quality adjustment of the verification picture T1, the result of the image quality adjustment of the verification picture T1 can be obtained. The image quality adjustment result may be either that adjustment is required or that no adjustment is required. Next, the second device 20 may determine, according to the result of the image quality adjustment of the verification picture T1:
if the adjustment of the image quality of the verification picture T1 is unnecessary, step 719 may be further executed.
If the result of the image quality adjustment of the verification picture T1 is that adjustment is required, steps 703-717 can be repeatedly executed until an image quality adjustment result indicates that no adjustment is required, or the image quality adjustment process is stopped once steps 703-717 have been repeated a preset maximum number of times.
In step 719, the second device 20 sends a notification of completion of image quality adjustment to the first device 10.
In the embodiment of the application, a user shoots a picture displayed on a large screen with a mobile phone; the mobile phone calculates the image quality parameters and sends them to the large screen, so that the large screen performs image quality adjustment based on these parameters. This facilitates user operation, realizes white balance correction, and saves the cost of image quality adjustment.
Fig. 14 is a schematic flowchart of another embodiment of the image quality adjustment method provided in the present application, including:
in step 1401, the first device 10 sends a connection request to the second device 20 to establish a connection between the first device 10 and the second device 20.
In step 1402, the first device 10 sends a test picture playing request to the second device 20.
In particular, the test picture may be a normal picture. That is, the embodiment of the present application takes an example in which the user requests the second device 20 to display a normal picture, and thus correction of color accuracy and color gamut can be achieved.
In step 1403, the second device 20 displays the test picture.
Specifically, after receiving the test picture playing request sent by the first device 10, the second device 20 may obtain a plurality of preset test pictures. Preferably, there may be 128 normal pictures.
Then, the second device 20 can select one test picture from the plurality of test pictures (e.g., from the 128 test pictures).
In step 1404, the second device 20 sends a shooting instruction to the first device 10.
Specifically, the shooting instruction is used to take a test picture displayed in the second device 20. Illustratively, the test picture may be a first test picture P1.
In step 1405 the first device 10 takes a first test picture P1 displayed in the second device 20.
Specifically, after the first device 10 takes the first test picture P1 displayed in the second device 20, a first image P1' may be obtained.
In step 1406, the first device 10 calculates the color coordinates and brightness of the first image P1'.
In step 1407, the first device 10 transmits the color coordinates and the brightness of the first image P1' to the second device 20.
In step 1408, the second device 20 receives and stores the color coordinates and brightness of the first image P1', and displays the remaining test pictures in sequence, so as to complete the storage of the color coordinates and brightness of all the test pictures.
Specifically, after the second device 20 receives the color coordinates and the luminance of the first image P1 ', the color coordinates and the luminance of the first image P1' may be stored. Then, the remaining test pictures (for example, the remaining 127 test pictures) may be sequentially displayed, and the first device 10 may take a picture based on the sequentially displayed test pictures, sequentially calculate the color coordinates and the brightness of the remaining displayed test pictures, and sequentially transmit the color coordinates and the brightness to the second device 20, so that the second device 20 may store the color coordinates and the brightness of all the test pictures. In a specific implementation, steps 1403 to 1407 may be repeatedly performed, so that the second device 20 may acquire the color coordinates and the brightness of all the test pictures.
Step 1409, adjusting the image quality based on the color coordinates and brightness of all the test pictures.
Specifically, the second device 20 may perform image quality adjustment based on the color coordinates and brightness of all the test pictures. For example, the second device 20 may combine the color coordinates and the brightness of all the stored test pictures into a color coordinate sequence and a brightness sequence, compare the color coordinate sequence and the brightness sequence with a preset standard color coordinate sequence and a preset standard brightness sequence to determine the current picture quality, and adjust the picture quality of all the test pictures according to the determination result. The preset standard color coordinate sequence can be obtained from the standard color coordinates of the test picture, and the preset standard brightness sequence can be obtained from the standard brightness of the test picture.
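The sequence comparison above can be sketched as follows (the function name and the tolerance value are assumptions, not given in the present application):

```python
def needs_adjustment(measured_seq, reference_seq, tolerance=2.0):
    """Compare a measured color-coordinate or brightness sequence with
    the preset standard sequence; return, per picture, whether its
    deviation exceeds the tolerance and thus requires adjustment."""
    return [abs(m - r) > tolerance for m, r in zip(measured_seq, reference_seq)]
```

The second device 20 would then adjust only those test pictures flagged as exceeding the tolerance.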
In step 1410, the second device 20 selects multiple verification pictures from the adjusted multiple test pictures.
Specifically, after the second device 20 completes the image quality adjustment of all the test pictures, a plurality of adjusted test pictures can be obtained, and a part of the adjusted test pictures can be selected as verification pictures. For example, 32 test pictures may be selected from the 128 adjusted test pictures as the verification pictures. It is understood that the above example only shows an exemplary scenario of selecting 32 verification pictures, and does not constitute a limitation on the embodiments of the present application, and in some embodiments, other numbers of verification pictures may be selected.
In step 1411, the second device 20 displays the verification picture.
Specifically, the second device 20 may display any one of the 32 verification pictures. Illustratively, the verification picture may be the first verification picture Q1.
In step 1412, the second device 20 sends a shooting instruction to the first device 10.
In step 1413, the first device 10 takes a first verification picture Q1 displayed in the second device 20.
Specifically, the first device 10 takes the first verification picture Q1 displayed in the second device 20, whereby the second image Q1' can be obtained.
At step 1414, the first device 10 calculates the color coordinates and brightness of the second image Q1'.
In step 1415, the first device 10 transmits the color coordinates and the brightness of the second image Q1' to the second device 20.
In step 1416, the second device 20 receives and stores the color coordinates and luminance of the second image Q1' and displays the remaining verification pictures in sequence to complete the storage of the color coordinates and luminance of all the verification pictures.
Specifically, after the second device 20 receives the color coordinates and the luminance of the second image Q1 ', the color coordinates and the luminance of the second image Q1' may be stored. Then, the remaining verification pictures (for example, the remaining 31 verification test pictures) may be sequentially displayed, and the first device 10 may perform photographing based on the sequentially displayed verification pictures, sequentially calculate the color coordinates and the brightness of the remaining verification pictures, and sequentially transmit the color coordinates and the brightness to the second device 20, thereby enabling the second device 20 to store the color coordinates and the brightness of all the verification pictures. In a specific implementation, steps 1411 to 1415 may be repeatedly executed, so that the second device 20 may acquire the color coordinates and the brightness of all the verification pictures.
In step 1417, the image quality is adjusted based on the color coordinates and brightness of all the verification pictures.
Specifically, the second device 20 may perform image quality adjustment based on the color coordinates and brightness of all the verification pictures. For example, the second device 20 may combine the color coordinates and the luminances of all the stored verification pictures into a color coordinate sequence and a luminance sequence, compare the color coordinate sequence and the luminance sequence with a preset standard color coordinate sequence and a preset standard luminance sequence to determine the current picture quality, and adjust the picture quality of all the verification pictures according to the determination result.
In step 1418, the second device 20 determines the result of adjusting the image quality of the verification picture.
Specifically, after the second device 20 completes the image quality adjustment of all the verification pictures (for example, 32 verification pictures), the image quality adjustment results of all the verification pictures can be obtained. Each image quality adjustment result may be either that adjustment is required or that no adjustment is required. Then, the second device 20 may determine by integrating the image quality adjustment results of the 32 verification pictures:
if the adjustment result of the image quality of all the verification pictures is that no adjustment is needed, step 1419 may be further performed.
If the result of the image quality adjustment of at least one verification picture is that adjustment is required, steps 1403-1417 may be repeatedly executed until the image quality adjustment results of all the verification pictures indicate that no adjustment is required, or the image quality adjustment process is stopped once steps 1403-1417 have been repeated a preset maximum number of times.
In step 1419, the second device 20 sends a notification of completion of the image quality adjustment to the first device 10.
In the embodiment of the application, the mobile phone sequentially shoots a plurality of pictures displayed on the large screen, calculates the image quality parameters, and sends them to the large screen, so that the large screen performs image quality adjustment based on these parameters. The large screen then selects verification pictures from the adjusted pictures, and the mobile phone shoots the verification pictures again to obtain their image quality parameters, so that the image quality of the large screen can be further adjusted. This facilitates user operation, realizes the correction of color accuracy and color gamut, and saves the cost of image quality adjustment.
Fig. 15 is a schematic structural diagram of an embodiment of the image quality adjusting apparatus according to the present application, and as shown in fig. 15, the image quality adjusting apparatus 1500 may include: a communication connection module 1510, an indication module 1520, a first photographing module 1530, a first calculation module 1540, and a first transmission module 1550; wherein,
a communication connection module 1510, configured to establish a communication connection with a second device in response to a preset operation by a user;
an indicating module 1520, configured to send a display instruction to the second device, where the display instruction is used to instruct the second device to display the test picture;
the first shooting module 1530 is configured to, in response to a received first shooting instruction sent by the second device, shoot a test picture displayed by the second device to obtain a test image;
a first calculating module 1540 for calculating the image quality parameter of the test image;
the first sending module 1550 is configured to send the quality parameter of the test image to the second device.
In one possible implementation manner, the apparatus 1500 further includes: a detection module 1560; wherein,
the detecting module 1560 is configured to detect whether the ambient light and the test distance meet a preset condition, and determine whether to send a display instruction to the second device according to a detection result.
In one possible implementation manner, the apparatus 1500 further includes: a second shooting module 1570, a second calculation module 1580, and a second sending module 1590; wherein,
a second shooting module 1570, configured to, in response to a received second shooting instruction sent by the second device, shoot a verification picture displayed by the second device, so as to obtain a verification image;
the second calculation module 1580 is configured to calculate image quality parameters of the verification image;
a second sending module 1590, configured to send the image quality parameter of the verification image to the second device.
In one possible implementation manner, the apparatus 1500 further includes: a receiving module 15a0; wherein:
the receiving module 15a0 is configured to receive an image quality adjustment completion notification sent by the second device, where the image quality adjustment completion notification is used to notify the first device that image quality adjustment is completed.
In one possible implementation, the image quality parameters include color coordinates and brightness.
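As one way to make the two parameters concrete, color coordinates and brightness can be derived from the mean color of a captured test image via the standard sRGB-to-XYZ conversion. This is an illustrative sketch, not the embodiment's actual computation: treating the camera output as ideal sRGB and reducing the image to a single mean color are simplifying assumptions.

```python
# Illustrative computation of the image quality parameters named above:
# CIE 1931 xy color coordinates and brightness (relative luminance Y),
# from a mean sRGB color. The matrix is the standard sRGB (D65) one.

def srgb_to_linear(c: float) -> float:
    """Undo the sRGB transfer curve for one channel value in [0, 1]."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def image_quality_params(r: float, g: float, b: float):
    """Return ((x, y), Y) for a mean sRGB color with channels in [0, 1]."""
    rl, gl, bl = (srgb_to_linear(v) for v in (r, g, b))
    # Standard sRGB linear-RGB -> CIE XYZ matrix (D65 white point).
    X = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    Y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    Z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    s = X + Y + Z
    return (X / s, Y / s), Y

(x, y), Y = image_quality_params(1.0, 1.0, 1.0)  # sRGB white
print(round(x, 4), round(y, 4))  # close to the D65 white point (0.3127, 0.3290)
```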
The image quality adjusting apparatus provided in the embodiment shown in fig. 15 can be used to implement the technical solutions of the method embodiments shown in fig. 6 to 14 of the present application, and the implementation principles and technical effects thereof can be further referred to the related descriptions in the method embodiments.
Fig. 16 is a schematic structural diagram of another embodiment of the image quality adjusting apparatus according to the present application. As shown in fig. 16, the image quality adjusting apparatus 1600 may include: a display module 1610, an indication module 1620, a receiving module 1630, and an image quality adjusting module 1640; wherein:
a display module 1610, configured to display a test picture in response to a received display instruction sent by the first device;
an indication module 1620, configured to send a first shooting instruction to the first device;
a receiving module 1630, configured to receive an image quality parameter of the test image sent by the first device; wherein the test image corresponds to the test picture;
the image quality adjusting module 1640 is configured to adjust the image quality of the picture based on the image quality parameter of the test image.
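One simple form the adjustment could take is deriving per-channel display gains from measured versus target values. This sketch is a hypothetical illustration only: the gain model, the clamp range, and the example measured values are assumptions, not details from the embodiments.

```python
# Hypothetical sketch of the image quality adjusting step: given measured
# per-channel brightness from the test images versus target values, derive
# gains the second device could apply to its display pipeline.

def channel_gains(measured: dict, target: dict) -> dict:
    """Gain = target / measured per channel, clamped to a safe range."""
    gains = {}
    for ch in ("r", "g", "b"):
        g = target[ch] / measured[ch]
        gains[ch] = min(max(g, 0.5), 2.0)  # avoid extreme corrections
    return gains

measured = {"r": 0.92, "g": 1.05, "b": 0.88}  # assumed measured luminances
target = {"r": 1.0, "g": 1.0, "b": 1.0}       # assumed calibration target
print(channel_gains(measured, target))
```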
In one possible implementation manner, the apparatus 1600 further includes: a verification module 1650; wherein:
a verification module 1650 for displaying a verification picture, where the verification picture is obtained by adjusting the image quality of the test picture; receiving image quality parameters of a verification image sent by the first device, where the verification image corresponds to the verification picture; and adjusting the image quality of the verification picture based on the image quality parameters of the verification image.
In one possible implementation, the image quality adjusting module 1640 is further configured to perform the following operations:
Comparing the image quality parameters of the verification image with preset image quality parameters, and judging whether the image quality of the verification image needs to be adjusted according to the comparison result;
if the adjustment is needed, displaying the test picture again to finish the image quality adjustment;
and if the adjustment is not needed, sending an image quality adjustment completion notification to the first equipment.
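The comparison-and-decide step above can be sketched as follows. The tolerance values and the D65 target are illustrative assumptions; the embodiments do not specify the preset image quality parameters or how deviation is measured.

```python
# Illustrative version of the comparison described above: measured image
# quality parameters of the verification image are compared with preset
# targets, and the result decides whether another adjustment round runs.

XY_TOLERANCE = 0.005        # assumed allowed chromaticity deviation
LUMINANCE_TOLERANCE = 0.05  # assumed allowed relative luminance deviation

def needs_adjustment(measured, preset) -> bool:
    """measured/preset are (x, y, Y) tuples: color coordinates + brightness."""
    dx = abs(measured[0] - preset[0])
    dy = abs(measured[1] - preset[1])
    dY = abs(measured[2] - preset[2])
    return dx > XY_TOLERANCE or dy > XY_TOLERANCE or dY > LUMINANCE_TOLERANCE

preset = (0.3127, 0.3290, 1.00)  # D65 white point as an example target
if needs_adjustment((0.320, 0.331, 0.97), preset):
    print("display the test picture again")  # re-run the adjustment
else:
    print("send image quality adjustment completion notification")
```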
In one possible implementation manner, the test picture includes a plurality of pictures, and the number of the verification pictures is determined by the number of the test pictures.
In one possible implementation manner, the test picture is a single special picture, the single special picture is used for correcting white balance, and the display module 1610 is further used for displaying the single special picture.
In one possible implementation manner, the test pictures are multiple normal pictures, the normal pictures are used for correcting color precision and color gamut, and the display module 1610 is further used for sequentially displaying the multiple normal pictures.
The image quality adjusting apparatus provided in the embodiment shown in fig. 16 may be used to implement the technical solutions of the method embodiments shown in fig. 6 to 14 of the present application, and the implementation principles and technical effects thereof may further refer to the related descriptions in the method embodiments.
It should be understood that the division of the modules of the image quality adjusting apparatus shown in fig. 15 and 16 is merely a logical division; in actual implementation, all or part of the modules may be integrated into one physical entity or may be physically separated. These modules may all be implemented in the form of software invoked by a processing element; or all implemented in the form of hardware; or some modules may be implemented in the form of software invoked by a processing element and others in the form of hardware. For example, the detection module may be a separate processing element, or may be integrated into a chip of the electronic device. The other modules are implemented similarly. In addition, all or part of the modules may be integrated together or implemented independently. In implementation, each step of the above method or each of the above modules may be completed by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as: one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs), among others. For another example, these modules may be integrated together and implemented in the form of a System-On-a-Chip (SOC).
Fig. 17 exemplarily shows a schematic structure of the electronic device 100, and the electronic device 100 may be the first device 10 or the second device 20 shown in fig. 6.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. The controller may be a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or cycled by the processor 110. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
Execution of the image quality adjusting method provided in the embodiments of the present application may be controlled by the processor 110 or completed by calling other components, for example, by calling a processing program of the embodiments of the present application stored in the internal memory 121, or by calling, through the external memory interface 120, a processing program of the embodiments of the present application stored in a third-party device, so as to control the wireless communication module 160 to perform data communication with another electronic device, thereby implementing image quality adjustment among multiple electronic devices and improving user experience.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, charger, flash, camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, audio module 170 and wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It may also be used to connect an earphone and play audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou satellite navigation system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
In this embodiment, the electronic device 100 may display the user interface through the display screen 194.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
In this embodiment, the electronic device 100 may receive an operation of the user, for example, a single click, a double click, or a slide operation, through the touch sensor 180K.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive a key input, and generate a key signal input related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. Touch operations applied to different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic device 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with an external memory card. The electronic device 100 interacts with the network through the SIM card to implement functions such as conversation and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
Embodiments of the present application further provide a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is caused to execute the method provided by the embodiments of the present application.
Embodiments of the present application also provide a computer program product, which includes a computer program, when the computer program runs on a computer, the computer is caused to execute the method provided by the embodiments of the present application.
The electronic device, the computer storage medium, or the computer program product provided in the embodiments of the present application are all configured to execute the corresponding method provided above, and therefore, the beneficial effects achieved by the electronic device, the computer storage medium, or the computer program product may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
It is understood that the electronic device 100 includes hardware structures and/or software modules for performing the functions in order to implement the functions. Those of skill in the art will readily appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
In the embodiments of the present application, the electronic device 100 may be divided into functional modules according to the foregoing method examples; for example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in hardware or as a software functional module. It should be noted that the division of modules in the embodiments of the present application is schematic and is merely a logical function division; other division manners are possible in actual implementation.
Through the above description of the embodiments, it is clear to those skilled in the art that the foregoing division of functional modules is merely an example used for convenience and simplicity of description. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. For the specific working processes of the system, apparatus, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not repeated here.
The functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, or the part that contributes to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes any medium that can store program code, such as a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto; any change or substitution within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (24)

1. An image quality adjustment method, applied to a first device, characterized by comprising:
establishing a communication connection with a second device in response to a preset operation of a user;
sending a display instruction to the second device, wherein the display instruction instructs the second device to display a test picture;
in response to receiving a first shooting instruction sent by the second device, photographing the test picture displayed by the second device to obtain a test image; and
calculating image quality parameters of the test image, and sending the image quality parameters of the test image to the second device.
2. The method of claim 1, wherein before sending the display instruction to the second device, the method further comprises:
detecting whether ambient light and a test distance meet preset conditions, and determining, according to a detection result, whether to send the display instruction to the second device.
3. The method of claim 1, wherein after sending the image quality parameters of the test image to the second device, the method further comprises:
in response to receiving a second shooting instruction sent by the second device, photographing a verification picture displayed by the second device to obtain a verification image; and
calculating image quality parameters of the verification image, and sending the image quality parameters of the verification image to the second device.
4. The method of any one of claims 1-3, further comprising:
receiving an image quality adjustment completion notification sent by the second device, wherein the image quality adjustment completion notification notifies the first device that image quality adjustment is completed.
5. The method according to any one of claims 1-4, wherein the image quality parameters comprise color coordinates and brightness.
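Claims 1-5 cover the first-device (camera-side) role: photograph the test picture shown on the second device and report image quality parameters, which claim 5 says comprise color coordinates and brightness. The patent does not specify how these are computed; the sketch below assumes, purely for illustration, that they are the CIE 1931 xy chromaticity and relative luminance of the mean color of the captured test image, derived from 8-bit sRGB pixels.

```python
# Illustrative sketch only: the patent does not define "color coordinates and
# brightness" (claim 5). This assumes CIE 1931 xy chromaticity and relative
# luminance Y computed from the average sRGB color of the captured test image.

def srgb_to_linear(c: float) -> float:
    """Invert the sRGB transfer function (c in [0, 1])."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def image_quality_parameters(pixels):
    """pixels: iterable of (R, G, B) tuples with 8-bit values 0-255.
    Returns {'color_coords': (x, y), 'brightness': Y} for the mean color."""
    n = 0
    r_sum = g_sum = b_sum = 0.0
    for r, g, b in pixels:
        r_sum += srgb_to_linear(r / 255.0)
        g_sum += srgb_to_linear(g / 255.0)
        b_sum += srgb_to_linear(b / 255.0)
        n += 1
    r, g, b = r_sum / n, g_sum / n, b_sum / n
    # sRGB (D65 white point) to CIE XYZ
    X = 0.4124 * r + 0.3576 * g + 0.1805 * b
    Y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    Z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    s = X + Y + Z
    return {"color_coords": (X / s, Y / s), "brightness": Y}
```

With this assumption, a pure-white capture yields approximately the D65 white point (x ≈ 0.313, y ≈ 0.329) at brightness 1.0, which is the kind of reference value the second device could compare against.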
6. An image quality adjustment method, applied to a second device that has established a communication connection with a first device, characterized by comprising:
displaying a test picture in response to receiving a display instruction sent by the first device;
sending a first shooting instruction to the first device;
receiving image quality parameters of a test image sent by the first device, wherein the test image corresponds to the test picture; and
adjusting the quality of the test picture based on the image quality parameters of the test image.
7. The method of claim 6, further comprising:
displaying a verification picture, wherein the verification picture is obtained by adjusting the image quality of the test picture;
receiving image quality parameters of a verification image sent by the first device, wherein the verification image corresponds to the verification picture; and
adjusting the quality of the verification picture based on the image quality parameters of the verification image.
8. The method of claim 7, wherein adjusting the quality of the verification picture based on the image quality parameters of the verification image comprises:
comparing the image quality parameters of the verification image with preset image quality parameters, and judging, according to a comparison result, whether the image quality of the verification image needs to be adjusted;
if adjustment is needed, displaying the test picture again to complete the image quality adjustment; and
if adjustment is not needed, sending an image quality adjustment completion notification to the first device.
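The decision in claim 8 is a compare-and-branch: measured parameters of the verification image are checked against preset parameters, and the second device either redisplays the test picture (another adjustment round) or notifies the first device that adjustment is complete. A minimal sketch, in which the preset target and tolerances are hypothetical example values, not figures from the patent:

```python
# Sketch of the claim-8 decision. The preset target (here the D65 white point)
# and the tolerances are hypothetical examples, not values from the patent.

PRESET = {"color_coords": (0.3127, 0.3290), "brightness": 1.0}
TOLERANCE = {"color_coords": 0.005, "brightness": 0.05}

def needs_adjustment(measured: dict) -> bool:
    """True if any parameter of the verification image is out of tolerance."""
    mx, my = measured["color_coords"]
    tx, ty = PRESET["color_coords"]
    if abs(mx - tx) > TOLERANCE["color_coords"]:
        return True
    if abs(my - ty) > TOLERANCE["color_coords"]:
        return True
    return abs(measured["brightness"] - PRESET["brightness"]) > TOLERANCE["brightness"]

def on_verification_result(measured: dict) -> str:
    """Claim 8: redisplay the test picture if adjustment is needed,
    otherwise notify the first device that adjustment is complete."""
    if needs_adjustment(measured):
        return "display_test_picture_again"
    return "send_image_quality_adjustment_complete"
```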
9. The method of claim 7, wherein the test picture comprises a plurality of pictures, and wherein the number of verification pictures is determined by the number of test pictures.
10. The method of claim 6, wherein the test picture is a single special picture used for correcting white balance, and wherein displaying the test picture comprises:
displaying the single special picture.
11. The method of claim 6, wherein the test pictures are a plurality of normal pictures used for correcting color accuracy and color gamut, and wherein displaying the test pictures comprises:
sequentially displaying the plurality of normal pictures.
12. A first device, comprising: a memory for storing computer program code, the computer program code comprising instructions that, when read from the memory and executed by the first device, cause the first device to perform the following steps:
establishing a communication connection with a second device in response to a preset operation of a user;
sending a display instruction to the second device, wherein the display instruction instructs the second device to display a test picture;
in response to receiving a first shooting instruction sent by the second device, photographing the test picture displayed by the second device to obtain a test image; and
calculating image quality parameters of the test image, and sending the image quality parameters of the test image to the second device.
13. The first device of claim 12, wherein the instructions, when executed by the first device, cause the first device to perform the following step before sending the display instruction to the second device:
detecting whether ambient light and a test distance meet preset conditions, and determining, according to a detection result, whether to send the display instruction to the second device.
14. The first device of claim 12, wherein the instructions, when executed by the first device, cause the first device to perform the following steps after sending the image quality parameters of the test image to the second device:
in response to receiving a second shooting instruction sent by the second device, photographing a verification picture displayed by the second device to obtain a verification image; and
calculating image quality parameters of the verification image, and sending the image quality parameters of the verification image to the second device.
15. The first device of any one of claims 12-14, wherein the instructions, when executed by the first device, cause the first device to further perform the following step:
receiving an image quality adjustment completion notification sent by the second device, wherein the image quality adjustment completion notification notifies the first device that image quality adjustment is completed.
16. The first device of any one of claims 12-15, wherein the image quality parameters comprise color coordinates and brightness.
17. A second device that has established a communication connection with a first device, comprising: a memory for storing computer program code, the computer program code comprising instructions that, when read from the memory and executed by the second device, cause the second device to perform the following steps:
displaying a test picture in response to receiving a display instruction sent by the first device;
sending a first shooting instruction to the first device;
receiving image quality parameters of a test image sent by the first device, wherein the test image corresponds to the test picture; and
adjusting the quality of the test picture based on the image quality parameters of the test image.
18. The second device of claim 17, wherein the instructions, when executed by the second device, cause the second device to further perform the following steps:
displaying a verification picture, wherein the verification picture is obtained by adjusting the image quality of the test picture;
receiving image quality parameters of a verification image sent by the first device, wherein the verification image corresponds to the verification picture; and
adjusting the quality of the verification picture based on the image quality parameters of the verification image.
19. The second device of claim 18, wherein the instructions, when executed by the second device, cause the second device to perform adjusting the quality of the verification picture based on the image quality parameters of the verification image by:
comparing the image quality parameters of the verification image with preset image quality parameters, and judging, according to a comparison result, whether the image quality of the verification image needs to be adjusted;
if adjustment is needed, displaying the test picture again to complete the image quality adjustment; and
if adjustment is not needed, sending an image quality adjustment completion notification to the first device.
20. The second device of claim 18, wherein the test picture comprises a plurality of pictures, and wherein the number of verification pictures is determined by the number of test pictures.
21. The second device of claim 17, wherein the test picture is a single special picture used for correcting white balance, and wherein the instructions, when executed by the second device, cause the second device to perform displaying the test picture by:
displaying the single special picture.
22. The second device of claim 17, wherein the test pictures are a plurality of normal pictures used for correcting color accuracy and color gamut, and wherein the instructions, when executed by the second device, cause the second device to perform displaying the test pictures by:
sequentially displaying the plurality of normal pictures.
23. A computer-readable storage medium comprising computer instructions which, when run on a first device, cause the first device to perform the method of any one of claims 1-5, or which, when run on a second device, cause the second device to perform the method of any one of claims 6-11.
24. A computer program product which, when run on a computer, causes the computer to perform the method of any one of claims 1-11.
CN202110293776.7A 2021-03-19 2021-03-19 Image quality adjusting method, electronic device and storage medium Pending CN115118963A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110293776.7A CN115118963A (en) 2021-03-19 2021-03-19 Image quality adjusting method, electronic device and storage medium


Publications (1)

Publication Number Publication Date
CN115118963A 2022-09-27

Family

ID=83323087

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110293776.7A Pending CN115118963A (en) 2021-03-19 2021-03-19 Image quality adjusting method, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN115118963A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117135268A (en) * 2023-02-23 2023-11-28 荣耀终端有限公司 Shooting method and electronic equipment


Similar Documents

Publication Publication Date Title
CN107438163B (en) Photographing method, terminal and computer readable storage medium
WO2011137731A2 (en) Method for controlling light-emitting device in terminal equipment, apparatus thereof and terminal equipment
CN113810600A (en) Terminal image processing method and device and terminal equipment
WO2022116930A1 (en) Content sharing method, electronic device, and storage medium
CN116052568B (en) Display screen calibration method and related equipment
CN113709464A (en) Video coding method and related device
CN114205336A (en) Cross-device audio playing method, mobile terminal, electronic device and storage medium
JP2015177308A (en) Portable terminal device, image correction method, and image correction program
CN113727085B (en) White balance processing method, electronic equipment, chip system and storage medium
CN112099741B (en) Display screen position identification method, electronic device and computer readable storage medium
CN115118963A (en) Image quality adjusting method, electronic device and storage medium
CN111918047A (en) Photographing control method and device, storage medium and electronic equipment
CN115412678B (en) Exposure processing method and device and electronic equipment
CN116091392B (en) Image processing method, system and storage medium
WO2023005706A1 (en) Device control method, electronic device, and storage medium
CN115631250B (en) Image processing method and electronic equipment
WO2022068598A1 (en) Imaging method and apparatus
CN111885768B (en) Method, electronic device and system for adjusting light source
CN114915359A (en) Method, device, electronic equipment and readable storage medium for selecting channel
CN115706869A (en) Terminal image processing method and device and terminal equipment
CN117119314B (en) Image processing method and related electronic equipment
CN115705663B (en) Image processing method and electronic equipment
CN115442536B (en) Method and device for determining exposure parameters, image system and electronic equipment
US20240155254A1 (en) Image Processing Method and Related Electronic Device
CN116437060B (en) Image processing method and related electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination