CN112581903A - Pixel compensation method and electronic equipment - Google Patents


Info

Publication number
CN112581903A
CN112581903A
Authority
CN
China
Prior art keywords
screen
display area
user
picture
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910945742.4A
Other languages
Chinese (zh)
Other versions
CN112581903B (en)
Inventor
龚铮
吴蕾
文锦松
艾金钦
张亦扬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Device Co Ltd
Original Assignee
Huawei Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Device Co Ltd
Priority to CN201910945742.4A
Publication of CN112581903A
Application granted
Publication of CN112581903B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00: Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20: … for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/22: … using controlled light sources
    • G09G3/30: … using electroluminescent panels
    • G09G3/32: … semiconductive, e.g. using light-emitting diodes [LED]
    • G09G3/3208: … organic, e.g. using organic light-emitting diodes [OLED]

Abstract

Embodiments of this application provide a pixel compensation method and an electronic device, relate to the field of terminals, and can solve the problems of under-compensation and over-compensation in online automatic compensation schemes. The electronic device includes a display screen comprising a first display area and a second display area, and the method includes: displaying a first interface in the first display area and the second display area, where the first interface includes a first picture and a display boundary exists at the junction between the first display area and the second display area; in response to a first operation for performing screen calibration, displaying a first color component of the first picture in the first display area and the second display area; receiving a second operation of the user, where the second operation is used to adjust the gray scale of the first color component displayed in the second display area; determining a compensation value of a first sub-pixel in the second display area according to a first difference between the gray scales of the first color component before and after the adjustment; and compensating the first sub-pixel in the second display area according to the compensation value.

Description

Pixel compensation method and electronic equipment
Technical Field
The present application relates to the field of terminals, and in particular, to a pixel compensation method and an electronic device.
Background
With the development of organic light-emitting diode (OLED) technology, end products using flexible OLED screens have become a current hot spot and a popular trend. A flexible OLED screen can be folded and can meet a user's display requirements for a single screen or multiple screens in a variety of scenarios. When multiple display forms are realized on the same flexible screen, different display areas are used for different lengths of time and therefore decay to different degrees. As this decay difference grows over time, the user perceives a difference in display effect between the display areas, which degrades user experience. Therefore, display-effect compensation for the different display areas is required.
At present, a screen attenuation model can be established according to the user's usage habits, and the different display areas of the folding screen can be automatically compensated online by an algorithm model based on information such as the usage duration, pixel values, and backlight of each display area.
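A minimal sketch of such an online scheme follows. The exponential decay model, the parameter names, and the time constant are all hypothetical illustrations, not taken from the patent:

```python
import math

def online_compensation_gain(usage_hours: float, avg_gray: float, backlight: float,
                             tau_hours: float = 20000.0) -> float:
    """Estimate a luminance gain for one display area.

    Hypothetical model: OLED luminance decays exponentially with an
    effective stress that grows with usage time, drive level and backlight.
    The decay constant tau_hours is illustrative only.
    """
    stress = usage_hours * (avg_gray / 255.0) * (backlight / 255.0)
    remaining = math.exp(-stress / tau_hours)  # fraction of original luminance
    return 1.0 / remaining                     # gain that restores nominal luminance

# A heavily used area needs a larger gain than a lightly used one, which is
# exactly why a one-size-fits-all model can under- or over-compensate a
# panel whose real decay rate differs from the model's.
```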
However, owing to limitations of the manufacturing process, each OLED panel actually decays at a different rate, while the model used in the online automatic compensation scheme is uniform. As a result, under-compensation or even over-compensation may occur, and user requirements cannot be met.
Disclosure of Invention
Embodiments of this application provide a pixel compensation method and an electronic device, which can avoid the problems of under-compensation and over-compensation in online automatic compensation schemes, thereby improving user experience.
In a first aspect, an embodiment of the present application provides a pixel compensation method applied to an electronic device including a display screen, where the display screen includes a first display area and a second display area, and the method includes: displaying a first interface in a first display area and a second display area, wherein the first interface comprises a first picture, and a display boundary exists at the boundary of the first display area and the second display area; receiving a first operation of a user, wherein the first operation is used for determining screen calibration according to a first picture; in response to a first operation, displaying a first color component of a first picture in a first display area and a second display area; receiving a second operation of the user, wherein the second operation is used for adjusting the gray scale of the first color component displayed on the second display area according to the gray scale of the first color component displayed on the first display area; the gray scale of the first color component displayed on the first display area is unchanged; and determining a compensation value of a first sub-pixel on the second display area according to a first difference value between the gray scale of the first color component displayed on the adjusted second display area and the gray scale of the first color component displayed on the second display area before adjustment, wherein the first sub-pixel is a sub-pixel corresponding to the first color component, and compensating the first sub-pixel on the second display area according to the compensation value of the first sub-pixel on the second display area.
Based on the method provided in this embodiment of the application, if the user finds that the display screen shows a relatively obvious display boundary when displaying the first picture, the user can adjust the first color component of the first picture, and the sub-pixels of the second display area are compensated based on the difference before and after the adjustment, thereby meeting the screen-consistency requirements of users with different sensitivities and different usage habits.
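The core arithmetic of the first aspect can be sketched as follows. This is an illustration, not the patent's actual implementation; the clamping range and the per-channel pixel layout are assumptions:

```python
def compensation_value(gray_before: int, gray_after: int) -> int:
    """First difference: adjusted gray scale minus gray scale before adjustment."""
    return gray_after - gray_before

def compensate_area(pixels, channel: int, offset: int):
    """Apply the compensation value to the first sub-pixel (one color channel)
    of every pixel in the second display area. `pixels` is a list of [R, G, B]
    values in the 0..255 range; the result is clamped to that range."""
    for px in pixels:
        px[channel] = max(0, min(255, px[channel] + offset))
    return pixels

# Example: the user dims the red component of the second display area
# from gray scale 200 to 180 to match the first display area.
offset = compensation_value(200, 180)   # -20
area = compensate_area([[200, 90, 40], [10, 0, 0]], channel=0, offset=offset)
```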
In one possible implementation, the first picture is a pure-color picture including one color, or a patterned picture including at least two colors. A pure-color picture causes less visual interference and can help the user better judge whether a display boundary exists between different display areas, so the user can make a corresponding adjustment. A patterned picture better reflects real interface display and can better meet the adjustment requirements of users with higher perception.
In one possible implementation, the first picture is one of a preset group of pure-color pictures with different gray scales. The user can thus select a picture that does not satisfy screen consistency (i.e., one that shows a boundary) from the preset pictures and calibrate the screen against it, meeting the screen-consistency requirements of users with different sensitivities and usage habits. Alternatively, the first picture is a pure-color picture selected by the user from a preset gray-scale range; the user can browse all pure-color pictures in that range and, when checking screen consistency, select the non-conforming picture more flexibly and more accurately before calibrating against it. Alternatively, the first picture is a first user screenshot, or is obtained by preset processing of the first user screenshot, where a display boundary exists at the junction between the first display area and the second display area when the first user screenshot is displayed in them. The user can thereby calibrate the screen in a more targeted manner according to the first user screenshot or the picture derived from it.
In one possible implementation, the first color component is a red component, a green component, a blue component, a yellow component, a cyan component, or a magenta component.
In a possible implementation manner, after receiving the second operation of the user, the method further includes: in response to the second operation, displaying the adjustment effect on the gray scale of the first color component in the second display area in real time, so that the user can watch the adjustment in real time and judge whether it satisfies screen consistency; and receiving a third operation of the user, where the third operation is used to confirm that the current adjustment effect satisfies screen consistency.
In one possible implementation, receiving the first operation of the user includes: receiving click operation of a user on a calibration button displayed on a display screen; or receiving a voice operation instruction of a user, wherein the voice operation instruction is used for indicating to calibrate the first picture.
In one possible implementation, receiving the second operation of the user includes: receiving a drag operation by the user on the slider of a slide bar, displayed in the second display area, for adjusting the gray scale; or receiving an input operation by the user in a numeric selection box, displayed in the second display area, for adjusting the gray scale.
In a possible implementation manner, before receiving the second operation of the user, the method further includes receiving a fourth operation of the user, where the fourth operation is used to adjust the backlight brightness of the display screen. By adjusting the backlight brightness, each picture can be checked under each brightness level, and any picture that does not satisfy screen consistency can be adjusted, meeting the user's requirement for screen consistency.
In a possible implementation manner, the adjusted gray scale of the first color component displayed in the second display area is smaller or larger than the gray scale of the first color component displayed in the first display area. Illustratively, the first display area may be a primary screen and the second display area a secondary screen; that is, the gray scale of the secondary screen's color component is adjusted against that of the primary screen. The primary screen is likely darker from frequent use, while the secondary screen, used less often, remains brighter, so the secondary screen's brightness can be lowered to match the primary screen and keep the screens consistent. In that case the adjusted gray scale of the first color component on the secondary screen is smaller than that on the primary screen. Alternatively, the first display area may be the secondary screen and the second display area the primary screen: the gray scale of the primary screen's color component is adjusted against the secondary screen's, and the primary screen's brightness can be increased to match the secondary screen and keep the screens consistent. In that case the adjusted gray scale of the first color component on the primary screen is larger than that on the secondary screen.
In one possible implementation, the method further includes: displaying a second color component of the first picture in the first display area and the second display area; receiving a fifth operation of the user, wherein the fifth operation is used for adjusting the gray scale of the second color component displayed on the second display area according to the gray scale of the second color component displayed on the first display area; the gray scale of the second color component displayed on the first display region is unchanged.
In a possible implementation manner, before determining the compensation value of the first sub-pixel on the second display area according to the first difference value, the method further includes: displaying a second interface in the first display area and the second display area, wherein the second interface comprises a second picture, the second picture is different from the first picture, and a display boundary exists at the boundary of the first display area and the second display area; receiving a sixth operation of the user, wherein the sixth operation is used for determining screen calibration according to the second picture; in response to a sixth operation, displaying a first color component of a second picture in the first display area and the second display area; receiving a seventh operation of the user, wherein the seventh operation is used for adjusting the gray scale of the first color component displayed on the second display area according to the gray scale of the first color component displayed on the first display area; determining a compensation value of a first sub-pixel on the second display area according to a first difference value between the gray scale of the first color component displayed on the adjusted second display area and the gray scale of the first color component displayed on the second display area before adjustment, including: and determining a compensation value of the first sub-pixel on the second display area according to the first difference value and a second difference value between the gray scale of the first color component after the second picture is adjusted and the gray scale of the first color component before the second picture is adjusted. Since there may be a large error in performing the screen calibration according to only one picture, the user can adjust multiple pictures (the first picture and the second picture) to improve the accuracy of the screen consistency calibration.
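The patent states only that the compensation value is determined from the first and second differences, without specifying how they are combined. One plausible sketch, purely an assumption, simply averages the per-picture differences:

```python
def combined_compensation(diffs):
    """Combine gray-scale differences measured on several calibration pictures.

    Averaging is an assumption for illustration; each entry in `diffs` is
    one picture's (adjusted - original) gray-scale difference for the same
    color component on the second display area.
    """
    if not diffs:
        raise ValueError("at least one calibration difference is required")
    return round(sum(diffs) / len(diffs))

# First picture adjusted by -20, second picture by -10: the combined
# compensation value lies between the two single-picture estimates,
# reducing the error of calibrating from a single picture.
value = combined_compensation([-20, -10])
```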
In one possible implementation, the second picture is one of a preset group of pure color pictures with different gray scales; or the second picture is selected from the pure color pictures in the preset gray scale range by the user; or the second picture is a second user screenshot; or the second picture is obtained by performing preset processing on the second user screenshot, and when the second user screenshot is displayed in the first display area and the second display area, a display boundary exists at the boundary of the first display area and the second display area.
In one possible implementation manner, the displaying the first interface in the first display area and the second display area further includes: displaying a first interface in a first display area, a second display area and a third display area, wherein the first interface comprises a first picture, and a display boundary exists at the boundary of the first display area and the third display area; displaying a first color component of a first picture in a first display area and a second display area, comprising: displaying a first color component of a first picture in a first display area, a second display area and a third display area; after receiving the first operation of the user, the method further comprises: receiving an eighth operation of the user, wherein the eighth operation is used for adjusting the gray scale of the first color component displayed on the third display area according to the gray scale of the first color component displayed on the first display area; and determining a compensation value of a first sub-pixel on the third display area according to a third difference value between the gray scale of the first color component displayed on the adjusted third display area and the gray scale of the first color component displayed on the third display area before adjustment, wherein the first sub-pixel is a sub-pixel corresponding to the first color component, and compensating the first sub-pixel on the third display area according to the compensation value of the first sub-pixel on the third display area.
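For a display screen with more than two display areas, the same per-area bookkeeping applies: the first display area serves as the unchanged reference, and each other area gets its own difference. The dictionary layout and area names below are illustrative only:

```python
def per_area_compensation(adjusted: dict, reference_area: str = "area1") -> dict:
    """Compute one compensation value per non-reference display area.

    `adjusted` maps an area name to (gray_before, gray_after) for the first
    color component; the reference area (the first display area) keeps its
    gray scale unchanged, so it gets no compensation entry.
    """
    comp = {}
    for area, (before, after) in adjusted.items():
        if area != reference_area:
            comp[area] = after - before   # first/third difference for that area
    return comp

comp = per_area_compensation({
    "area1": (128, 128),   # first display area: unchanged reference
    "area2": (128, 110),   # second display area: first difference
    "area3": (128, 120),   # third display area: third difference
})
```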
In one possible implementation, the display screen is a folding screen. A folding screen may be folded to form a plurality of screens, each of which may correspond to a display area. For example, the folding screen may include two screens: it may be folded along a folding edge (folding axis) to form a first screen and a second screen, where the first screen corresponds to the first display area and the second screen to the second display area.
In a second aspect, an embodiment of the present application provides an electronic device, which includes a display screen, where the display screen includes a first display area and a second display area, and includes: the display unit is used for displaying a first interface in a first display area and a second display area, the first interface comprises a first picture, and a display boundary exists at the boundary of the first display area and the second display area; the receiving unit is used for receiving a first operation of a user, and the first operation is used for determining screen calibration according to a first picture; a display unit further configured to display a first color component of the first picture in the first display area and the second display area in response to a first operation; the receiving unit is further used for receiving a second operation of the user, and the second operation is used for adjusting the gray scale of the first color component displayed on the second display area according to the gray scale of the first color component displayed on the first display area; the gray scale of the first color component displayed on the first display area is unchanged; and the processing unit is used for determining a compensation value of a first sub-pixel on the second display area according to a first difference value between the gray scale of the first color component displayed on the adjusted second display area and the gray scale of the first color component displayed on the second display area before adjustment, wherein the first sub-pixel is a sub-pixel corresponding to the first color component, and the first sub-pixel on the second display area is compensated according to the compensation value of the first sub-pixel on the second display area.
In one possible implementation, the first picture is a pure color picture including one color; or the first picture is a picture with patterns comprising at least two colors.
In one possible implementation, the first picture is one of a preset group of pure color pictures with different gray scales; or the first picture is selected from a pure color picture in a preset gray scale range by a user; or the first picture is a first user screenshot; or the first picture is obtained by performing preset processing on the first user screenshot, and when the first user screenshot is displayed in the first display area and the second display area, a display boundary exists at the boundary of the first display area and the second display area.
In one possible implementation, the first color component is a red component, a green component, a blue component, a yellow component, a cyan component, or a magenta component.
In one possible implementation, the display unit is further configured to: responding to the second operation, and displaying the adjustment effect of the gray scale of the first color component in the second display area in real time; the receiving unit is further used for receiving a third operation of the user, and the third operation is used for determining that the current adjustment effect meets the screen consistency.
In one possible implementation, the receiving unit is configured to: receiving click operation of a user on a calibration button displayed on a display screen; or receiving a voice operation instruction of a user, wherein the voice operation instruction is used for indicating to calibrate the first picture.
In one possible implementation, the receiving unit is configured to: receive a drag operation by the user on the slider of a slide bar, displayed in the second display area, for adjusting the gray scale; or receive an input operation by the user in a numeric selection box, displayed in the second display area, for adjusting the gray scale.
In one possible implementation, the receiving unit is further configured to: and receiving a fourth operation of the user, wherein the fourth operation is used for adjusting the backlight brightness of the display screen.
In a possible implementation manner, the adjusted gray scale of the first color component displayed on the second display area is smaller than or larger than the gray scale of the first color component displayed on the first display area.
In one possible implementation, the display unit is further configured to: displaying a second color component of the first picture in the first display area and the second display area; the receiving unit is further configured to: receiving a fifth operation of the user, wherein the fifth operation is used for adjusting the gray scale of the second color component displayed on the second display area according to the gray scale of the second color component displayed on the first display area; the gray scale of the second color component displayed on the first display region is unchanged.
In one possible implementation, the display unit is further configured to: displaying a second interface in the first display area and the second display area, wherein the second interface comprises a second picture, the second picture is different from the first picture, and a display boundary exists at the boundary of the first display area and the second display area; the receiving unit is further configured to: receiving a sixth operation of the user, wherein the sixth operation is used for determining screen calibration according to the second picture; the display unit is further configured to: in response to a sixth operation, displaying a first color component of a second picture in the first display area and the second display area; the receiving unit is further configured to: receiving a seventh operation of the user, wherein the seventh operation is used for adjusting the gray scale of the first color component displayed on the second display area according to the gray scale of the first color component displayed on the first display area; the processing unit is used for: and determining a compensation value of the first sub-pixel on the second display area according to the first difference value and a second difference value between the gray scale of the first color component after the second picture is adjusted and the gray scale of the first color component before the second picture is adjusted.
In one possible implementation, the second picture is one of a preset group of pure color pictures with different gray scales; or the second picture is selected from the pure color pictures in the preset gray scale range by the user; or the second picture is a second user screenshot; or the second picture is obtained by performing preset processing on the second user screenshot, and when the second user screenshot is displayed in the first display area and the second display area, a display boundary exists at the boundary of the first display area and the second display area.
In one possible implementation manner, the display screen further includes a third display area, and the display unit is configured to: displaying a first interface in a first display area, a second display area and a third display area, wherein the first interface comprises a first picture, and a display boundary exists at the boundary of the first display area and the third display area; displaying a first color component of a first picture in a first display area, a second display area and a third display area; the receiving unit is further configured to: receiving an eighth operation of the user, wherein the eighth operation is used for adjusting the gray scale of the first color component displayed on the third display area according to the gray scale of the first color component displayed on the first display area; the processing unit is further configured to: and determining a compensation value of a first sub-pixel on the third display area according to a third difference value between the gray scale of the first color component displayed on the adjusted third display area and the gray scale of the first color component displayed on the third display area before adjustment, wherein the first sub-pixel is a sub-pixel corresponding to the first color component, and compensating the first sub-pixel on the third display area according to the compensation value of the first sub-pixel on the third display area.
In one possible implementation, the display screen is a folding screen.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium, which includes instructions that, when executed on a computer, cause the computer to perform any one of the methods provided in the first aspect.
In a fourth aspect, embodiments of the present application provide a computer program product containing instructions, which when run on a computer, cause the computer to perform any one of the methods provided in the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip system. The chip system includes a processor, may further include a memory, and is configured to implement any one of the methods provided in the first aspect. The chip system may consist of a chip, or may include a chip together with other discrete devices.
In a sixth aspect, embodiments of the present application further provide a pixel compensation device, which may be a processing device, an electronic device, or a chip. The apparatus comprises a processor configured to implement any one of the methods provided by the first aspect. The apparatus may also include a memory for storing program instructions and data, which may be memory integrated within the apparatus or off-chip memory disposed external to the apparatus. The memory is coupled to the processor, and the processor can call and execute the program instructions stored in the memory, so as to implement any one of the methods provided by the first aspect. The apparatus may also include a communication interface for the apparatus to communicate with other devices.
Drawings
FIG. 1A is a schematic diagram of a multi-window display according to an embodiment of this application;
FIG. 1B is a schematic diagram of a video window display according to an embodiment of this application;
FIG. 2A is a schematic product-form view of an external folding screen according to an embodiment of this application;
FIG. 2B is a schematic product-form view of another folding screen according to an embodiment of this application;
FIG. 2C is a schematic product-form view of a foldable screen according to an embodiment of this application;
FIG. 3A is a schematic product-form view of another folding screen according to an embodiment of this application;
FIG. 3B is a schematic product-form view of another foldable screen according to an embodiment of this application;
FIG. 4 is a schematic structural diagram of an electronic device according to an embodiment of this application;
FIG. 5 is a schematic flowchart of a pixel compensation method according to an embodiment of this application;
FIG. 6 is a schematic display diagram of an electronic device according to an embodiment of this application;
FIG. 7 is a schematic display diagram of another electronic device according to an embodiment of this application;
FIG. 8 is a schematic display diagram of another electronic device according to an embodiment of this application;
FIG. 9 is a schematic display diagram of another electronic device according to an embodiment of this application;
FIG. 10A is a schematic display diagram of another electronic device according to an embodiment of this application;
FIG. 10B is a schematic display diagram of another electronic device according to an embodiment of this application;
FIG. 11 is a schematic display diagram of another electronic device according to an embodiment of this application;
FIG. 12A is a schematic display diagram of another electronic device according to an embodiment of this application;
FIG. 12B is a schematic display diagram of another electronic device according to an embodiment of this application;
FIG. 13 is a schematic flowchart of another pixel compensation method according to an embodiment of this application;
FIG. 14 is a schematic display diagram of another electronic device according to an embodiment of this application;
FIG. 15 is a schematic display diagram of another electronic device according to an embodiment of this application;
FIG. 16 is a schematic display diagram of another electronic device according to an embodiment of this application;
FIG. 17 is a schematic software-architecture diagram of an electronic device according to an embodiment of this application;
FIG. 18 is a schematic structural diagram of another electronic device according to an embodiment of this application;
FIG. 19 is a schematic diagram of a chip system according to an embodiment of this application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the present application, unless otherwise specified, "at least one" means one or more, and "a plurality" means two or more. In addition, to facilitate clear description of the technical solutions, the terms "first" and "second" are used in the embodiments of the present application to distinguish between identical or similar items having substantially the same functions and effects. Those skilled in the art will appreciate that the terms "first", "second", and the like do not limit quantity or execution order, nor do they indicate any order of importance.
The embodiment of the application provides a pixel compensation method, which can be applied to electronic equipment with a display screen, wherein the display screen can comprise a plurality of display areas, and the plurality of display areas can have different display states in different use scenes.
For example, in a multi-window scenario, multiple windows may be displayed on the mobile terminal. As shown in fig. 1A, the windows displayed on the tablet computer include a video playing window 01 and a window 02 of a settings application. The video playing window 01 is displayed in the first display area, and the window 02 of the settings application is displayed in the second display area. Because the two areas display different content, over time the first display area and the second display area are attenuated to different degrees (that is, different display areas age at different rates). As the attenuation difference grows, the user can perceive the difference in display effect between the areas, for example a "yin-yang screen" effect in which the two parts of the screen show visibly different tones, which degrades the user experience. Display-effect compensation therefore needs to be performed on the different display areas, so that the user does not perceive display boundaries between different areas of the screen.
For another example, in a video playing scenario, as shown in fig. 1B, a video playing window is displayed in a display area 202 (a first display area), while a display area 201 (a second display area) and a display area 203 (a third display area) are in a black-screen state. Because the areas are lit for different durations, over time they are attenuated to different degrees. As the attenuation difference grows, the user can perceive the difference in display effect between the areas, which degrades the user experience. Display-effect compensation therefore needs to be performed on the different display areas, so that the user does not perceive display boundaries between different areas of the screen.
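The per-area aging described in these scenarios can be approximated in software by tracking, for each display area, its accumulated lit time weighted by average drive level. The sketch below is purely illustrative: the exponential decay model, the decay constant, and the region names are assumptions for demonstration, not parameters of the patented method.

```python
# Hypothetical sketch: track per-area wear and estimate an OLED-style
# luminance decay, so that a compensation gain can equalize the areas.
# DECAY_PER_HOUR and the region names are illustrative assumptions.
import math

DECAY_PER_HOUR = 0.0001  # assumed relative luminance loss per lit hour at full drive


class RegionWear:
    def __init__(self, name):
        self.name = name
        self.weighted_hours = 0.0  # lit hours weighted by drive level (0..1)

    def accumulate(self, hours, drive_level):
        """Record `hours` of display time at the given average drive level."""
        self.weighted_hours += hours * drive_level

    def remaining_luminance(self):
        """Estimated fraction of original peak luminance still available."""
        return math.exp(-DECAY_PER_HOUR * self.weighted_hours)


# Example: a video window lit at high drive vs. a mostly black side area.
video_area, side_area = RegionWear("video"), RegionWear("side")
video_area.accumulate(hours=1000, drive_level=0.8)
side_area.accumulate(hours=1000, drive_level=0.05)

# Gain that dims the fresher area down to match the most-aged one,
# so no boundary is visible between them.
target = min(video_area.remaining_luminance(), side_area.remaining_luminance())
gains = {r.name: target / r.remaining_luminance() for r in (video_area, side_area)}
```

Under this model the most-aged area keeps a gain of 1.0 and fresher areas are attenuated toward it; a real implementation would persist the wear statistics and refresh the gains periodically.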
In some embodiments, the display screen of the electronic device may be a foldable screen that can be folded to form at least two screens, each of which may correspond to a display area. For example, the foldable screen may include two screens: it may be folded along a folding edge (folding axis) to form a first screen and a second screen, where the first screen may be regarded as a first display area and the second screen as a second display area. The first screen and the second screen may be connected by a first folding edge, which may be regarded as the boundary between the first screen and the second screen. For another example, the foldable screen may include three screens: it may be folded along two folding edges to form a first screen, a second screen, and a third screen (a third display area). The first screen and the second screen may be connected by a first folding edge, and the second screen and the third screen by a second folding edge; the first folding edge may be regarded as the boundary between the first screen and the second screen, and the second folding edge as the boundary between the second screen and the third screen. For another example, the foldable screen may include four, five, six, or more screens, which are not described here again.
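The screen-to-display-area mapping above can be represented with a simple data model in which fold edge i is the boundary between adjacent screens i and i+1. The class and field names below are assumptions for illustration, not the patent's API.

```python
# Illustrative data model for the screen / display-area / boundary
# relationship described above. Names are hypothetical.
from dataclasses import dataclass


@dataclass
class DisplayArea:
    name: str             # e.g. "first screen"
    lit_seconds: int = 0  # tracked per area, since areas age independently


@dataclass
class FoldableScreen:
    areas: list  # ordered display areas; fold edge i sits between areas[i] and areas[i+1]

    def boundaries(self):
        """Each folding edge is the boundary between two adjacent screens."""
        return [(self.areas[i].name, self.areas[i + 1].name)
                for i in range(len(self.areas) - 1)]


# A three-screen foldable has two folding edges, hence two boundaries.
tri_fold = FoldableScreen([DisplayArea("A"), DisplayArea("B"), DisplayArea("C")])
```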
It should be noted that the folding screen in the embodiments of the present application may be a flexible folding screen, in which the folding edge is made of a flexible material, and part or all of the screen may be made of flexible material. The at least two screens formed by folding a flexible folding screen are parts of a single screen with an integral structure; folding merely divides that one screen into at least two portions.
In one possible design, the sizes of the at least two screens included in the folding screen may be the same or different. For example, assuming the folding screen includes two screens, a first screen and a second screen, the first screen may be larger than the second screen, or the two screens may be the same size.
In one possible design, the at least two screens included in the folding screen may include one or more main screens and one or more secondary screens. For example, assume the folding screen includes two screens: one (the first screen) may be the main screen, and the other (the second screen) may be the secondary screen. When the folding screen is in the folded state, the user may first wake up the main screen through a corresponding operation (e.g., pressing a power key, double-tapping the screen, or performing fingerprint recognition), and then wake up the secondary screen (e.g., by unfolding the folding screen, or by performing an operation on the main screen for waking up the secondary screen).
The types of folding screens may include outward-folding screens (hereinafter, fold-out screens) and inward-folding screens (hereinafter, fold-in screens). Suppose the folding screen includes two screens, i.e., it can be folded to form a first screen and a second screen. For a fold-out screen, the first screen and the second screen face away from each other (back to back) after folding, and the included angle α between the first screen and the second screen ranges over [0°, 180°]. For a fold-in screen, the first screen and the second screen face each other after folding, and the included angle α between them likewise ranges over [0°, 180°]. A folding screen may also be foldable both inward and outward, in which case the included angle α between the first screen and the second screen ranges over [0°, 360°].
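The angle ranges above suggest how a device might classify its posture from the hinge angle α. The following sketch assumes small angles mean "folded" and uses illustrative thresholds; real devices would use hinge-sensor-specific values and conventions.

```python
# Hypothetical sketch: classify a folding screen's posture from the
# included angle alpha between the first and second screens.
# The 30-degree / 150-degree thresholds are assumptions for illustration.
def fold_state(alpha_degrees, fold_type="outward"):
    """Return 'folded', 'half-folded', or 'unfolded' for angle alpha.

    For inward-only or outward-only folding screens, alpha lies in
    [0, 180]; a screen foldable both ways spans [0, 360].
    """
    limit = 360 if fold_type == "both" else 180
    if not 0 <= alpha_degrees <= limit:
        raise ValueError(f"alpha must be in [0, {limit}]")
    if alpha_degrees <= 30:    # nearly closed
        return "folded"
    if alpha_degrees >= 150:   # nearly flat (or folded the other way)
        return "unfolded"
    return "half-folded"
```

A classification like this could drive which display areas are lit (e.g., only the main screen in the folded state).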
Fig. 2A is a schematic product form diagram of an electronic device 100 with a fold-out folding screen according to an embodiment of the present application. Fig. 2A (a) is a schematic view of the fold-out screen in a fully unfolded state. The fold-out screen may be folded along the first folding edge in the directions 101a and 101b shown in fig. 2A (a) to form an A screen (i.e., a first screen) and a B screen (i.e., a second screen) as shown in fig. 2A (b). The fold-out screen may be folded further along the first folding edge in the directions 102a and 102b shown in fig. 2A (b) to form the fully folded fold-out screen shown in fig. 2A (c). As shown in fig. 2A (c), when the folding screen of the electronic device 100 is completely folded, the A screen and the B screen face away from each other, and both are visible to the user.
In different usage scenarios, the display area corresponding to the A screen and the display area corresponding to the B screen may have different display states. For example, as shown in fig. 2A (a), when the folding screen is in the unfolded state, the A screen and the B screen are both in the bright-screen state. For another example, as shown in fig. 2A (c), when the user uses only the A screen in the folded state, the B screen may be in the black-screen state.
In one possible design, as shown in fig. 2B, the fold-out folding screen may include an A screen, a B screen, and a bending region, where the bending region may be regarded as the part of the screen other than the A screen and the B screen. Fig. 2B (a) is a schematic view of the fold-out screen in a fully unfolded state. The fold-out screen may be folded in the directions 101a and 101b shown in fig. 2B (a), forming the A screen, the B screen, and the bending region shown in fig. 2B (b). The fold-out screen may be folded further in the directions 102a and 102b shown in fig. 2B (b) to form the fully folded fold-out screen shown in fig. 2B (c). As shown in fig. 2B (c), when the folding screen of the electronic device 100 is completely folded, the A screen and the B screen face away from each other, and the bending region occupies the side region and is visible to the user.
In different usage scenarios, the display area corresponding to the A screen, the display area corresponding to the B screen, and the display area corresponding to the bending region may have different display states. For example, as shown in fig. 2B (a), when the folding screen is in the unfolded state, the A screen, the bending region, and the B screen are all in the bright-screen state. For another example, as shown in fig. 2B (c), when the user uses only the A screen in the folded state, the bending region and the B screen may be in the black-screen state. Alternatively, the bending region may be lit independently; for example, when the folding screen is in the folded state and a new incoming call or message is received, the bending region may be lit to prompt the user.
Fig. 2C is a schematic product form diagram of an electronic device 100 with a fold-in folding screen according to an embodiment of the present application. Fig. 2C (a) is a schematic view of the fold-in screen in a fully unfolded state. The fold-in screen may be folded along the first folding edge in the directions 201a and 201b shown in fig. 2C (a) to form an A screen (i.e., a first screen) and a B screen (i.e., a second screen) as shown in fig. 2C (b). The fold-in screen may be folded further along the first folding edge in the directions 202a and 202b shown in fig. 2C (b) to form the fully folded fold-in screen shown in fig. 2C (c). As shown in fig. 2C (c), when the folding screen of the electronic device 100 is completely folded, the A screen and the B screen face each other and are invisible to the user; the black line 203 shown in fig. 2C (c) indicates the contact surface between the A screen and the B screen.
Optionally, a further display screen, which may be referred to as a third screen, may be disposed on the back of the first screen or the back of the second screen of the fold-in folding screen provided in this embodiment of the present application. For example, as shown in fig. 2C (b), a C screen (i.e., the third screen) may be disposed on the back of the A screen (i.e., the first screen). As shown in fig. 2C (c), after the fold-in screen is completely folded, the C screen faces outward and is visible to the user. It can be understood that, for an electronic device having such a fold-in folding screen, an interface may be displayed on the third screen when the folding screen is in the folded state, and on the first screen and the second screen when the folding screen is in the unfolded state.
If the folding screen includes three or more screens, each pair of screens connected by a folding edge may be folded inward or outward. For example, fig. 3A is a schematic product form diagram of an electronic device 100 with a triple-folding screen according to an embodiment of the present application. Fig. 3A (a) is a schematic view of the triple-folding screen when completely unfolded. The A screen may be folded along the first folding edge in the direction 301a shown in fig. 3A (a), and the C screen may be folded along the second folding edge in the direction 301b, forming the A screen, B screen, and C screen shown in fig. 3A (b). The A screen may continue to be folded along the first folding edge in the direction 301a, and the C screen along the second folding edge in the direction 301b, to form the folded triple-folding screen shown in fig. 3A (c); in the folded state its size is close to that of an ordinary tablet or mobile phone. As shown in fig. 3A (c), when the folding screen of the electronic device 100 is completely folded, the A screen and the C screen are each opposite the B screen and are visible to the user. Optionally, a bending region may exist between the A screen and the B screen, and another between the B screen and the C screen.
For another example, fig. 3B is a schematic product form diagram of an electronic device 100 with another triple-folding screen according to an embodiment of the present application. Fig. 3B (a) is a schematic view of the triple-folding screen when completely unfolded. The first screen (e.g., the A screen) may be folded along the first folding edge in the direction 401a shown in fig. 3B (a), and the third screen (e.g., the C screen) may be folded along the second folding edge in the direction 401b, forming the A screen, B screen, and C screen shown in fig. 3B (b). The A screen may continue to be folded along the first folding edge in the direction 401a, and the C screen along the second folding edge in the direction 401b, to form the folded triple-folding screen shown in fig. 3B (c); in the folded state its size is similar to that of an ordinary tablet or mobile phone. As shown in fig. 3B (c), when the folding screen of the electronic device 100 is completely folded, the A screen and the C screen are each opposite the B screen and are invisible to the user.
It can be understood that, because different display areas of the display screen differ in lit duration or display content, they may be attenuated to different degrees (i.e., age at different rates). As the attenuation difference grows over time, the user perceives the difference in display effect between the areas, which degrades the user experience. Display-effect compensation therefore needs to be performed on the different display areas, so that the user does not perceive display boundaries between different areas of the screen.
To solve the above problem, embodiments of the present application provide a pixel compensation method in which the user manually adjusts the deviation between different display areas (different screens) based on direct visual feedback. This allows compensation for screen consistency to be completed simply and accurately, eliminates display boundaries between display areas, and accommodates users with different usage habits and different sensitivities to display effects.
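The manual adjustment flow can be sketched as a small gain controller: each user tap nudges the brightness gain of one display area until it visually matches its neighbor, and the chosen gain is then applied to every frame of that area. The step size, clamp range, and class name below are assumptions for illustration, not values from the patent.

```python
# Hypothetical sketch of the manual compensation flow described above.
# STEP and the gain clamp range are illustrative assumptions.
class ManualCompensator:
    STEP = 0.01                  # gain change per user tap
    MIN_GAIN, MAX_GAIN = 0.5, 1.5

    def __init__(self):
        self.gain = 1.0          # 1.0 means no compensation

    def adjust(self, direction):
        """direction: +1 (brighter) or -1 (dimmer) for one user tap."""
        new_gain = self.gain + direction * self.STEP
        self.gain = min(self.MAX_GAIN, max(self.MIN_GAIN, new_gain))

    def apply(self, pixel):
        """Scale an 8-bit pixel value of this area by the current gain."""
        return max(0, min(255, round(pixel * self.gain)))


comp = ManualCompensator()
for _ in range(5):               # user taps "dimmer" five times
    comp.adjust(-1)
# gain is now about 0.95, and is applied to every pixel of the area
```

Because the user judges the match visually, this loop directly equalizes perceived brightness across the boundary, regardless of the underlying aging model.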
For example, the electronic device in the embodiments of the present application may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), or an augmented reality (AR)/virtual reality (VR) device that includes the above folding screen; the embodiments of the present application do not particularly limit the specific form of the electronic device.
As shown in fig. 4, the electronic device may be a mobile phone 100. The mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a USB interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a radio frequency module 150, a communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display 194, a SIM card interface 195, and the like. The sensor module may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor, and the like.
The structure illustrated in this embodiment of the present invention does not constitute a specific limitation on the mobile phone 100. The mobile phone 100 may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a Neural-Network Processing Unit (NPU), etc. The different processing units may be independent devices or may be integrated in the same processor.
The controller may be a decision maker that directs the components of the mobile phone 100 to work in concert according to instructions; it is the neural center and command center of the mobile phone 100. The controller generates an operation control signal according to the instruction operation code and a timing signal, to control instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor is a cache, which may hold instructions or data that the processor has just used or cycled through. If the processor needs that instruction or data again, it can be fetched directly from this memory, which avoids repeated accesses, reduces the processor's waiting time, and thus improves system efficiency.
In some embodiments, the processor 110 may include an interface. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, the processor may include multiple sets of I2C buses. The processor may be coupled to the touch sensor, charger, flash, camera, etc. via different I2C bus interfaces. For example: the processor may be coupled to the touch sensor via an I2C interface, such that the processor and the touch sensor communicate via an I2C bus interface to implement the touch functionality of the cell phone 100.
The I2S interface may be used for audio communication. In some embodiments, the processor may include multiple sets of I2S buses. The processor may be coupled to the audio module via an I2S bus to enable communication between the processor and the audio module. In some embodiments, the audio module can transmit audio signals to the communication module through the I2S interface, so as to realize the function of answering the call through the bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module and the communication module may be coupled by a PCM bus interface. In some embodiments, the audio module may also transmit the audio signal to the communication module through the PCM interface, so as to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication, with different sampling rates for the two interfaces.
The UART interface is a universal serial data bus used for asynchronous communications. The bus is a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor with the communication module 160. For example: the processor communicates with the Bluetooth module through the UART interface to realize the Bluetooth function. In some embodiments, the audio module may transmit the audio signal to the communication module through the UART interface, so as to realize the function of playing music through the bluetooth headset.
The MIPI interface can be used to connect a processor with peripheral devices such as a display screen and a camera. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, the processor and the camera communicate through a CSI interface to implement the camera function of the handset 100. The processor and the display screen communicate through a DSI interface to implement the display function of the mobile phone 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, the GPIO interface may be used to connect the processor with a camera, display screen, communication module, audio module, sensor, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 may be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface may be used to connect a charger to charge the mobile phone 100, or to transfer data between the mobile phone 100 and a peripheral device. It may also be used to connect an earphone and play audio through the earphone, or to connect other electronic devices such as an AR device.
The interface connection relationships between the modules illustrated in this embodiment of the present invention are merely schematic and do not limit the structure of the mobile phone 100. The mobile phone 100 may adopt an interface connection manner different from that in this embodiment, or a combination of multiple interface connection manners.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module may receive charging input from a wired charger via a USB interface. In some wireless charging embodiments, the charging management module may receive a wireless charging input through a wireless charging coil of the cell phone 100. The charging management module can also supply power to the terminal device through the power management module 141 while charging the battery.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module receives the input of the battery and/or the charging management module and supplies power to the processor, the internal memory, the external memory, the display screen, the camera, the communication module and the like. The power management module may also be used to monitor parameters such as battery capacity, battery cycle number, battery state of health (leakage, impedance), etc. In some embodiments, the power management module 141 may also be disposed in the processor 110. In some embodiments, the power management module 141 and the charging management module may also be disposed in the same device.
The wireless communication function of the mobile phone 100 can be implemented by the antenna module 1, the antenna module 2, the rf module 150, the communication module 160, a modem, and a baseband processor.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the cellular network antenna may be multiplexed into a wireless local area network diversity antenna. In some embodiments, the antenna may be used in conjunction with a tuning switch.
The radio frequency module 150 may provide wireless communication solutions applied to the mobile phone 100, including second generation (2G), third generation (3G), fourth generation (4G), and fifth generation (5G) communications. The module may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like. The radio frequency module receives electromagnetic waves through the antenna 1, performs processing such as filtering and amplification on the received electromagnetic waves, and transmits them to the modem for demodulation. The radio frequency module may also amplify a signal modulated by the modem and convert it into electromagnetic waves radiated via the antenna 1. In some embodiments, at least some of the functional modules of the radio frequency module 150 may be disposed in the processor 110. In some embodiments, at least some functional modules of the radio frequency module 150 and at least some modules of the processor 110 may be disposed in the same device.
The modem may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to a speaker, a receiver, etc.) or displays an image or video through a display screen. In some embodiments, the modem may be a stand-alone device. In some embodiments, the modem may be separate from the processor, in the same device as the rf module or other functional module.
The communication module 160 may provide a communication processing module including a solution for wireless communication, such as Wireless Local Area Network (WLAN) (e.g., WiFi), bluetooth, Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like, which is applied to the mobile phone 100. The communication module 160 may be one or more devices integrating at least one communication processing module. The communication module receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor. The communication module 160 may also receive a signal to be transmitted from the processor, frequency-modulate and amplify the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the mobile phone 100 is coupled to the radio frequency module, and the antenna 2 is coupled to the communication module, so that the mobile phone 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), LTE, 5G new radio (NR), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The mobile phone 100 implements the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing and is connected with a display screen and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, videos, and the like. The display screen includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini-LED, a micro-LED, a micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the mobile phone 100 may include 1 or N display screens, where N is a positive integer greater than 1.
As also shown in fig. 4, the cell phone 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen, an application processor, and the like.
The ISP is used to process data fed back by the camera. For example, when a photo is taken, the shutter opens, light is transmitted through the lens to the camera's photosensitive element, the optical signal is converted into an electrical signal, and the photosensitive element passes the electrical signal to the ISP for processing and conversion into an image visible to the naked eye. The ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be disposed in the camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then passed to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the mobile phone 100 may include 1 or N cameras, where N is a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the mobile phone 100 performs frequency point selection, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. Handset 100 may support one or more codecs. Thus, the handset 100 can play or record video in a variety of encoding formats, such as: MPEG1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by drawing on the structure of biological neural networks, for example, the transfer mode between neurons of the human brain, and can also continuously self-learn. The NPU enables intelligent-recognition applications of the mobile phone 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 100. The external memory card communicates with the processor through the external memory interface to realize the data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 implements various functional applications and data processing of the mobile phone 100 by executing the instructions stored in the internal memory 121. The memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (e.g., audio data, a phonebook, etc.) created during use of the mobile phone 100. Further, the memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), or other nonvolatile solid-state storage devices.
The mobile phone 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module is used for converting digital audio information into analog audio signals to be output and converting the analog audio input into digital audio signals. The audio module may also be used to encode and decode audio signals. In some embodiments, the audio module may be disposed in the processor 110, or some functional modules of the audio module may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert an audio electrical signal into a sound signal. The mobile phone 100 can play music or take a hands-free call through the speaker.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the handset 100 receives a call or voice information, it can receive voice by placing the receiver close to the ear.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with his or her mouth close to the microphone to input a sound signal. The mobile phone 100 may be provided with at least one microphone. In some embodiments, the mobile phone 100 may be provided with two microphones to implement a noise reduction function in addition to collecting sound signals. In some embodiments, the mobile phone 100 may further include three, four, or more microphones to collect sound signals and reduce noise, and may further identify sound sources to implement a directional recording function.
The headphone interface 170D is used to connect wired headphones. The headphone interface may be a USB interface, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor may be disposed on the display screen. There are many types of pressure sensors, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor, the capacitance between the electrodes changes, and the mobile phone 100 determines the pressure intensity from the change in capacitance. When a touch operation acts on the display screen, the mobile phone 100 detects the intensity of the touch operation via the pressure sensor, and can also calculate the touched position from the pressure sensor's detection signal. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
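The threshold-based mapping from touch intensity to instruction can be sketched as follows. This is only an illustrative sketch: the threshold value, function name, and instruction strings are hypothetical and not taken from the patent.

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized intensity threshold

def dispatch_sms_icon_touch(intensity: float) -> str:
    """Map a touch on the short message icon to an instruction:
    a light press (below the threshold) views messages, while a firm
    press (at or above the threshold) creates a new message."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_message"
    return "new_message"

# A light tap and a firm press on the same icon yield different instructions.
light = dispatch_sms_icon_touch(0.2)   # "view_message"
firm = dispatch_sms_icon_touch(0.8)    # "new_message"
```

The same pattern generalizes to any icon: the touch position selects the target application, and the intensity relative to the threshold selects which of its instructions runs.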
The gyroscope sensor 180B may be used to determine the motion attitude of the mobile phone 100. In some embodiments, the angular velocities of the mobile phone 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyroscope sensor. The gyroscope sensor may be used for image stabilization during photographing. Illustratively, when the shutter is pressed, the gyroscope sensor detects the shake angle of the mobile phone 100 and calculates the compensation distance for the lens module, so that the lens counteracts the shake of the mobile phone 100 through reverse movement, thereby achieving stabilization. The gyroscope sensor can also be used for navigation and motion-sensing game scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the mobile phone 100 calculates altitude from the barometric pressure value measured by the air pressure sensor, to aid positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The mobile phone 100 may detect the opening and closing of a flip holster using the magnetic sensor. In some embodiments, when the mobile phone 100 is a flip phone, the mobile phone 100 may detect the opening and closing of the flip cover based on the magnetic sensor, and then set features such as automatic unlocking upon flipping open according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 180E can detect the magnitude of acceleration of the mobile phone 100 in various directions (typically along three axes), and can detect the magnitude and direction of gravity when the mobile phone 100 is stationary. It can also be used to recognize the terminal's posture, and is applied to landscape/portrait switching, pedometers, and other applications.
The distance sensor 180F is used to measure distance. The mobile phone 100 may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the mobile phone 100 may use the distance sensor to measure distance and achieve fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode that emits infrared light outward, and the photodiode detects infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the mobile phone 100; when insufficient reflected light is detected, it can be determined that there is no object near the mobile phone 100. The mobile phone 100 can use the proximity light sensor to detect that the user is holding the mobile phone 100 close to the ear, so as to automatically turn off the screen and save power. The proximity light sensor can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The mobile phone 100 may adaptively adjust the display screen brightness according to the perceived ambient light level. The ambient light sensor can also be used to automatically adjust the white balance when taking a picture, and may cooperate with the proximity light sensor to detect whether the mobile phone 100 is in a pocket, so as to prevent accidental touches.
The fingerprint sensor 180H is used to collect fingerprints. The mobile phone 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint-based photographing, fingerprint-based call answering, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the mobile phone 100 implements a temperature processing strategy using the temperature detected by the temperature sensor. For example, when the temperature reported by the temperature sensor exceeds a threshold, the mobile phone 100 reduces the performance of a processor located near the temperature sensor, so as to reduce power consumption and implement thermal protection.
The touch sensor 180K is also referred to as a "touch panel". It may be disposed on the display screen and is used to detect a touch operation acting on or near it. The detected touch operation may be passed to the application processor to determine the touch event type, and a corresponding visual output is provided via the display screen.
The bone conduction sensor 180M may acquire vibration signals. In some embodiments, the bone conduction sensor may acquire the vibration signal of bone vibrating with the human voice. The bone conduction sensor can also contact the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor may also be disposed in the earpiece. The audio module 170 may parse a voice signal based on the bone-vibration signal acquired by the bone conduction sensor, so as to implement a voice function. The application processor may parse heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor, so as to implement a heart rate detection function.
The keys 190 include a power key, volume keys, and the like. The keys may be mechanical keys or touch keys. The mobile phone 100 receives key inputs and generates key signal inputs related to user settings and function control of the mobile phone 100.
The motor 191 may generate a vibration cue. The motor can be used for incoming call vibration prompt and can also be used for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The touch operation on different areas of the display screen can also correspond to different vibration feedback effects. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a subscriber identity module (SIM) card. A SIM card can be attached to or detached from the mobile phone 100 by inserting it into or pulling it out of the SIM card interface. The mobile phone 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface can support a Nano-SIM card, a Micro-SIM card, a SIM card, and the like. Multiple cards can be inserted into the same SIM card interface at the same time; the types of the cards may be the same or different. The SIM card interface may also be compatible with different types of SIM cards, as well as with external memory cards. The mobile phone 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the mobile phone 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the mobile phone 100 and cannot be separated from it.
In this embodiment, the mobile phone 100 may further include a gamma unit and a display driver integrated circuit (DDIC) unit, and the gamma unit and the DDIC unit may compensate pixels according to the electrical or optical characteristics of the pixels of the display screen. The gamma unit or the DDIC unit may be integrated in the processor or stored in the memory.
In the following description, the electronic device is a mobile phone, the display screen is a foldable screen, the foldable screen is an outward-folding foldable screen that folds into a first screen and a second screen, and there is no bending area between the first screen and the second screen. As shown in fig. 5, the pixel compensation method provided in the embodiment of the present application includes:
501. The electronic device displays a first interface on the first screen and the second screen, where the first interface includes a first picture.
After a user uses a folding-screen mobile phone for a long time (e.g., half a year), because the screens of the mobile phone are used for different amounts of time (e.g., the first screen is used more and the second screen is used less), the screens decay at different speeds. As a result, the mobile phone may exhibit a "yin-yang screen" effect (i.e., the brightness and/or color tone of the first screen and the second screen are inconsistent; for example, the first screen is darker than the second screen, or the first screen is yellower than the second screen), so a display boundary perceivable by the user (hereinafter referred to simply as a boundary) may exist at the junction of the first screen and the second screen, which affects user experience. At this point, the user can adjust the screen consistency of the folding screen through the settings APP, so as to eliminate the yin-yang screen problem as much as possible. The user may be the owner of the mobile phone or an after-sales service person, which is not limited herein.
For example, as shown in fig. 6, the user may click on an icon 602 of the setting APP of the mobile phone on a desktop 601 of the mobile phone. When the mobile phone detects that the user clicks an icon 602 of the setting APP on the desktop 601, the setting APP may be started, and a Graphical User Interface (GUI) shown in (a) in fig. 7 is displayed, where the GUI may be referred to as a setting interface 603. The settings interface 603 may display entries for relevant settings for networks and connections, for example, the relevant settings for networks and connections may include entries for mobile networks, WLAN, bluetooth, personal hotspots, and more (network and connection related) settings. The settings interface 603 may also display the relevant settings items for the individual, such as display, sound, wallpaper, and personality theme. Of course, the setting interface 603 may also display other setting items, which is not limited in this application.
As shown in fig. 7 (a), the user may click on the display control 604 on the setting interface 603, and when the mobile phone detects an operation of the user clicking on the display control 604 on the setting interface 603, a GUI as shown in fig. 7 (b) may be displayed, which may be referred to as a display interface 605. The user may click the screen consistency control 606 on the display interface 605, and when the mobile phone detects that the user clicks the screen consistency control 606 on the display interface 605, a GUI as shown in fig. 8 may be displayed, which may be referred to as a screen consistency interface 801.
Alternatively, as shown in fig. 9 (a), the user may click on the more setting control 607 on the setting interface 603, and when the cell phone detects an operation of clicking on the more setting control 607 by the user on the setting interface 603, a GUI as shown in fig. 9 (b) may be displayed, which may be referred to as a more setting interface 608. The user may click on the screen consistency control 609 on the more settings interface 608, and when the cell phone detects that the user has clicked on the screen consistency control 609 on the more settings interface 608, the screen consistency interface 801 as shown in fig. 8 may be displayed. That is, the screen consistency control may be located in different interfaces, and the application is not particularly limited.
As shown in fig. 8, the user may select different ways to adjust the screen consistency of the folded screen on the screen consistency interface 801. The user needs to adjust the screen consistency while the folding screen is in the unfolded state. If the user selects mode 1, the user can view a preset set of pure-color pictures with different gray scales provided by the system. When each pure-color picture is displayed, if a display boundary exists at the junction of the first screen and the second screen (for the user, the pure-color picture can be considered to have a display boundary), the user can adjust that pure-color picture; the more pictures that are adjusted, the better the screen consistency adjustment effect. If the user selects mode 2, the user can view pure-color pictures within a preset gray scale range; when a pure-color picture of a certain gray scale value is displayed, if a display boundary exists at the junction of the first screen and the second screen, the user can adjust screen consistency for that picture, and again, the more pictures that are adjusted, the better the adjustment effect. If the user selects mode 3, the user can select one or more screenshot pictures that showed a display boundary when displayed, and perform screen consistency adjustment according to those screenshots. As shown in fig. 8, if the user selects mode 3, the user may select one or more screenshot pictures in the system album through the select picture control 803, where the one or more screenshot pictures were captured when the user found that a display boundary existed at the junction of the first screen and the second screen during use.
Alternatively, through the go-to-screenshot control 804, the user can jump directly to the corresponding application and enter the corresponding interface (an interface on which the user found a display boundary at the junction of the first screen and the second screen during use) to take a screenshot, or jump to the desktop to take a screenshot; after the screenshot is taken, the user can quickly return to the screen calibration interface through the corresponding control.
After the user selects a screen consistency adjustment mode, the mobile phone can display a first interface on the first screen and the second screen, where the first interface includes a first picture. It should be understood that the first picture may include a plurality of pixels, and each pixel may be composed of three red-green-blue (RGB) sub-pixels. The red (R), green (G), and blue (B) sub-pixels may each have 256 gray levels (from 0 to 255); the gray level of a sub-pixel represents its brightness gradation from darkest to brightest. Red, green, and blue sub-pixels of different brightness levels combine to form pixels of different colors. If the gray level of every sub-pixel of every pixel is 255, that is, all sub-pixels are driven to maximum brightness, the screen is white; the screen can then be considered to display a white picture whose gray level is 255. If the gray levels of the sub-pixels of all pixels are reduced uniformly, the screen displays gray of different gray levels, and the screen can be considered to display pictures of different gray levels. Different sub-pixels correspond to different color components: the R sub-pixel corresponds to the red component and its gray level is the gray level of the red component, the G sub-pixel corresponds to the green component and its gray level is the gray level of the green component, and the B sub-pixel corresponds to the blue component and its gray level is the gray level of the blue component.
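The gray-level model above can be sketched in code. This is only an illustrative sketch of the 8-bit RGB description, with hypothetical helper names not taken from the patent.

```python
def make_solid_picture(width, height, rgb):
    """Build a picture in which every pixel carries the same
    (R, G, B) triple of 8-bit gray levels (0-255)."""
    r, g, b = rgb
    assert all(0 <= c <= 255 for c in (r, g, b)), "8-bit gray levels only"
    return [[(r, g, b) for _ in range(width)] for _ in range(height)]

# All sub-pixels at gray level 255 -> a white picture with gray level 255.
white = make_solid_picture(4, 2, (255, 255, 255))
# Equal R, G, B gray levels below 255 -> a gray picture (here gray level 60).
gray60 = make_solid_picture(4, 2, (60, 60, 60))
```

Lowering all three components uniformly, as in `gray60`, corresponds to the "pictures of different gray levels" the paragraph describes.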
The first picture is a pure-color picture including one color, or a picture with a pattern including at least two colors. All pixels included in a pure-color picture have the same gray-scale values, while the gray-scale values of the sub-pixels within each pixel may be the same or different. For example, assuming a pure-color picture includes 100 pixels, the R, G, B sub-pixels of each of the 100 pixels may have gray-scale values of 100, 100, 100; alternatively, the R, G, B sub-pixels of each of the 100 pixels may have gray-scale values of 50, 100, 150. The gray-scale values of the pixels included in a picture with a pattern are not all the same, and the gray-scale values of the sub-pixels of each pixel may be the same or different.
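Under this definition, checking whether a picture is pure-color reduces to checking that all pixels share one (R, G, B) triple. A minimal sketch (the helper name is hypothetical):

```python
def is_solid_color(picture):
    """True if every pixel of the picture (a 2-D list of (R, G, B)
    triples) has the same gray-scale values, i.e., a pure-color picture."""
    first = picture[0][0]
    return all(px == first for row in picture for px in row)

# Pure-color: one color; sub-pixel gray values may still differ within a pixel.
pure = [[(50, 100, 150), (50, 100, 150)]]
# Patterned: at least two colors, so pixel gray values are not all the same.
patterned = [[(50, 100, 150), (60, 100, 150)]]
```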
If the user selects mode 1, the first picture may be one of a preset set of pure-color pictures with different gray scales. For example, 3 pure-color pictures may be preset. The preset pure-color pictures may be selected from the full gray scale range (e.g., 0-255); for example, the 3 preset pictures may have gray-scale values of 60, 180, and 255, respectively. Alternatively, the preset pure-color pictures may be selected from a low gray scale range (e.g., 0-100); for example, the 3 preset pictures may have gray-scale values of 30, 70, and 100. The R (red), G (green), and B (blue) components of each preset pure-color picture may have the same gray-scale value; for example, the R, G, B components of a pure-color picture with a gray-scale value of 60 are all 60. A pure-color picture whose R, G, B components have the same gray-scale value may also be referred to as a gray-scale picture.
For example, as shown in fig. 8, if the user selects mode 1, that is, the user clicks the control 802 on the screen consistency interface 801, then after the mobile phone detects that the user has clicked the control 802, it may display the interface 1001 shown in fig. 10A, where the interface 1001 is the first interface. The interface 1001 includes a first picture, which may be, for example, a pure-color picture with a gray-scale value of 60 (R, G, B components each having a gray-scale value of 60). The first picture may fill the first screen and the second screen. Some user prompt information and function buttons may be displayed on the first picture, for example, a prompt message 1002, an OK button 1003, a calibration button 1004, and dotted lines 1005 and 1006. The area between the dotted line 1005 and the dotted line 1006 may be referred to as a dotted-line area 1007 (the folding area, i.e., the area where the junction of the first screen and the second screen is located), and the user may look within the dotted-line area 1007 to determine whether a faint boundary exists there. The boundary may be caused by a difference in decay rates between the first and second screens: the screen that decays faster may be darker (or yellower), and the screen that decays more slowly may be brighter, so the yin-yang screen situation may occur and a faint boundary may appear in the dotted-line area 1007, which affects user experience.
If the user determines that a boundary exists in the dotted-line area 1007, the user can click the calibration button 1004 to calibrate the screen. If the user determines that there is no obvious boundary in the dotted-line area 1007, the user may click the OK button to enter the next step; for example, a second interface may be entered, where the second interface includes a second picture different from the first picture, and the user can check whether a boundary exists between the first screen and the second screen when the second picture is displayed.
If the user selects mode 2, the first picture may be selected by the user from pure-color pictures within a preset gray scale range. The gray-scale values of the R, G, and B components of each pure-color picture within the preset gray scale range may be the same or different. Taking pictures whose R, G, and B components have equal gray-scale values as an example, as shown in fig. 10B, a slider bar 1008 may be provided on the first interface 1001, and the user may select a pure-color picture by dragging the slider 1009 along the slider bar 1008. The gray scale range corresponding to the slider bar may be 0-255, and the user can check whether the pure-color picture at each gray scale satisfies screen consistency (i.e., whether a boundary exists) and select, from the pure-color pictures with gray-scale values of 0-255, one pure-color picture with a boundary (that is, when the pure-color picture is displayed, a display boundary exists at the junction of the first screen and the second screen). Alternatively, a numerical selection box may be provided on the first interface; the gray scale range corresponding to the selection box may be 0-255, and the user can select a pure-color picture with a boundary by modifying the value in the selection box. In this way, the user can check all pure-color pictures in the preset gray scale range, select the pure-color pictures that do not satisfy screen consistency (have a boundary) more flexibly and accurately, and perform screen calibration according to those pictures.
If the user selects mode 3, the first picture may be the first user screenshot, or may be obtained by performing preset processing on the first user screenshot. For example, the user may select from the system album a user screenshot with a display boundary (that is, when the user screenshot is displayed, a display boundary exists at the junction of the first screen and the second screen). The mobile phone may use the user screenshot selected by the user (the first user screenshot) as the first picture, or may perform preset processing on it to obtain the first picture. The preset processing may be graying of the first user screenshot: the R, G, and B components of each pixel of the first picture obtained after graying are equal, respectively, to the average values of the R, G, and B components of the first user screenshot. The average value of the R components of the first user screenshot is the quotient of the sum of the gray-scale values (R components) of all R sub-pixels included in the first user screenshot and the number of all R sub-pixels included in the first user screenshot. Similarly, the average value of the G components is the quotient of the sum of the gray-scale values (G components) of all G sub-pixels and the number of all G sub-pixels included in the first user screenshot, and the average value of the B components is the quotient of the sum of the gray-scale values (B components) of all B sub-pixels and the number of all B sub-pixels included in the first user screenshot.
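The graying step described above (replacing each pixel's components with the per-channel means over the whole screenshot) can be sketched as follows. This is an illustrative sketch of the stated definition only; the function name is hypothetical.

```python
def gray_out(picture):
    """Graying processing: replace the R, G, B components of every pixel
    with the screenshot-wide mean of that channel (sum of a channel's
    gray values divided by the number of that channel's sub-pixels)."""
    pixels = [px for row in picture for px in row]
    n = len(pixels)
    mean = tuple(round(sum(px[c] for px in pixels) / n) for c in range(3))
    return [[mean for _ in row] for row in picture]

shot = [[(10, 20, 30), (30, 40, 50)],
        [(50, 60, 70), (70, 80, 90)]]
grayed = gray_out(shot)  # every pixel becomes (40, 50, 60)
```

The result is a single uniform color, which is why the grayed screenshot can then be treated like a pure-color picture for calibration.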
In this way, the user can perform screen calibration in a more targeted manner according to the first user screenshot or the first picture corresponding to it.
In addition, in some embodiments, after the mobile phone detects the user's click on the screen consistency control 606 on the display interface 605, or on the screen consistency control 609 on the more settings interface 608, the mobile phone may, as shown in fig. 10A, directly display the first interface 1001, and the first picture in the first interface 1001 may be one of a preset set of pure-color pictures with different gray scales. Alternatively, as shown in fig. 10B, the user may drag the slider bar on the first interface to select a pure-color picture. Alternatively, after the mobile phone detects either of the above click operations, a prompt box may pop up to prompt the user to select one or more screenshots with display boundaries from the system album for screen consistency calibration.
502. Receive a first operation of the user, where the first operation is used to determine to perform screen calibration according to the first picture.
When the first screen and the second screen display the first picture, if the user determines that a display boundary exists at the boundary of the first screen and the second screen, the user can perform a first operation on the first interface and determine to perform screen calibration. For example, as shown in fig. 10A, if the user determines that a boundary exists in the folding area 1007, the user may click the calibration button 1004, and the mobile phone receives a click operation of the user on the calibration button 1004 displayed on the folding screen, and determines that the user needs to perform screen calibration according to the first picture.
Alternatively, the user may input a voice operation instruction indicating that screen calibration is required, and the mobile phone may receive the voice instruction and determine that the user needs to perform screen calibration. For example, the user may say "calibrate" or "I want to calibrate" near the microphone; the mobile phone receives the voice information entered by the user and determines that the user needs to perform screen calibration.
503. In response to the first operation, a first color component of the first picture is displayed on the first screen and the second screen.
After the mobile phone determines that screen calibration needs to be performed according to the first picture, the first picture can be decomposed into at least one color component, and each of the at least one color component is displayed on the first screen and the second screen respectively, so that the user can calibrate each color component separately. Of course, if the first picture is a solid-color picture with only one color component, no further decomposition is needed. For example, if the gray scales of the R, G, and B components of each pixel of the first picture are 100, 0, and 0 respectively, the first picture is a pure red picture and does not need to be decomposed.
In some embodiments, the cell phone may decompose the first picture into one or more of a red component, a green component, a blue component, a yellow component, a cyan component, or a magenta component. For example, the handset may decompose a first picture into a red component, a green component, and a blue component; alternatively, the handset may decompose the first picture into a blue component and a yellow component, the yellow component being a mixture of the red and green components.
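The decomposition into color components can be sketched as zeroing the sub-pixels that do not belong to the component; a mixed component such as yellow keeps two channels. The nested-list picture representation and the mask table are assumptions for illustration only.

```python
def decompose(picture, components=("red", "green", "blue")):
    """Split an RGB picture into single-component pictures.

    The red component keeps only R sub-pixels (G and B are zeroed), and so
    on; yellow, a mixture of red and green, keeps both R and G.
    """
    masks = {
        "red":     (1, 0, 0),
        "green":   (0, 1, 0),
        "blue":    (0, 0, 1),
        "yellow":  (1, 1, 0),  # red + green
        "cyan":    (0, 1, 1),  # green + blue
        "magenta": (1, 0, 1),  # red + blue
    }
    return {
        name: [[(r * masks[name][0], g * masks[name][1], b * masks[name][2])
                for (r, g, b) in row] for row in picture]
        for name in components
    }
```

Each resulting component picture can then be shown on both screens in turn, so the user calibrates one sub-pixel color at a time.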
The mobile phone may display each of the color components according to a preset sequence, for example, if the mobile phone decomposes the first picture into a red component, a green component, and a blue component, the mobile phone may display each color component according to the sequence of the red component, the green component, and the blue component, that is, the first color component is the red component; or, the mobile phone may display each color component according to the order of the blue component, the green component, and the red component, that is, the first color component is the blue component, which is not limited in this application.
For example, assuming that the mobile phone decomposes the first picture into a red component, a green component, and a blue component and displays them in that order: as shown in fig. 10A, in response to the operation of the user clicking the calibration button 1004, the mobile phone may display the red component on the interface 1101 as shown in (a) in fig. 11, and may also display some user prompt information and function buttons on the interface 1101, such as a slider 1102, a save button 1103, prompt information 1104, and a dotted-line area 1105. The user can observe the dotted-line area 1105 to determine whether a faint boundary exists in it; if a boundary exists, the user can drag the slider 1102 to adjust, and after the adjustment is completed, click the save button 1103 to save the adjustment information and continue to adjust the next color component. When the mobile phone detects the operation of the user clicking the save button 1103 on the interface 1101, the interface 1106 as shown in (b) in fig. 11 may be displayed; the mobile phone may display the green component on the interface 1106, along with some user prompt information and function buttons, for example, a slider 1107, a save button 1109, prompt information 1111, and a dotted-line area 1110. The user can observe the dotted-line area 1110 to determine whether a faint boundary exists in it; if so, the user can drag the slider 1107 to adjust, and after the adjustment is completed, click the save button 1109 to save the adjustment information and continue to adjust the next color component. When the mobile phone detects that the user clicks the save button 1109 on the interface 1106, the interface 1112 as shown in (c) of fig. 11 may be displayed; the mobile phone may display the blue component on the interface 1112, along with some user prompt information and function buttons, for example, a slider 1113, a save button 1114, prompt information 1116, and a dotted-line area 1115. The user can observe the dotted-line area 1115 to determine whether a faint boundary exists in it, and if so, drag the slider 1113 to adjust.
504. Receive a second operation of the user, where the second operation is used to adjust the gray scale of the first color component displayed on the second screen according to the gray scale of the first color component displayed on the first screen, while the gray scale of the first color component displayed on the first screen remains unchanged.
Illustratively, the first screen may be the primary screen and the second screen the secondary screen. That is, the gray scale of the color component on the secondary screen is adjusted according to the gray scale of the color component on the primary screen. The primary screen is likely darker because of frequent use, while the secondary screen, used less frequently, is likely brighter; the brightness of the secondary screen can therefore be reduced to match that of the primary screen, ensuring screen consistency. In this case, the adjusted gray scale of the first color component displayed on the secondary screen is smaller than that displayed on the primary screen. Alternatively, the first screen may be the secondary screen and the second screen the primary screen: the gray scale of the color component on the primary screen is adjusted according to the gray scale of the color component on the secondary screen, and the brightness of the primary screen can be increased to match the secondary screen, ensuring screen consistency. In this case, the adjusted gray scale of the first color component displayed on the primary screen is larger than that displayed on the secondary screen.
For example, as shown in fig. 11 (a), the user may adjust the gray scale of the first color component (the red component) displayed on the second screen by dragging the slider 1102. During the adjustment, the gray scale of the first color component displayed on the first screen does not change, while the gray scale displayed on the second screen changes with the position of the slider 1102. Alternatively, the user may adjust the gray scale of the first color component displayed on the second screen by modifying the value in a value selection box used to adjust that gray scale.
505. In response to the second operation, display the adjustment effect of the gray scale of the first color component on the second screen in real time.
As shown in fig. 11 (a), the user may drag the slider 1102, the position of the slider 1102 is changed, and the gray scale of the first color component displayed on the second screen is changed according to the change of the position of the slider 1102.
506. Receive a third operation of the user, where the third operation is used to determine that the current adjustment effect satisfies screen consistency.
As shown in (a) and (b) of fig. 12A, on the interface 1101, after the user drags the slider 1102 from the first position to the second position, if the user determines that the faint boundary in the dashed-line area 1105 has disappeared (that is, with the slider at the second position, no faint boundary exists in the dashed-line area 1105 and screen consistency is satisfied), the user may click the save button to save the current adjustment result.
After the mobile phone determines that the user finishes adjusting the first color component, the second color component can be displayed on the first screen and the second screen; the second color component is different from the first color component. For example, as shown in (a) in fig. 11, when the cellular phone detects an operation of the user clicking the save button 1103 on the interface 1101, an interface 1106 as shown in (b) in fig. 11 may be displayed, and the cellular phone may display a green component on the interface 1106.
Further, the mobile phone may receive a fifth operation of the user, where the fifth operation is used to adjust the gray scale of the second color component displayed on the second screen according to the gray scale of the second color component displayed on the first screen, and the process of the user adjusting the gray scale of the second color component displayed on the second screen may refer to the process of the user adjusting the gray scale of the first color component displayed on the second screen, which is not described herein again.
After the mobile phone determines that the user has finished adjusting the second color component, the third color component can be displayed on the first screen and the second screen; the third color component is different from the first color component and the second color component. For example, as shown in (b) in fig. 11, when the mobile phone detects the operation of the user clicking the save button 1109 on the interface 1106, the interface 1112 as shown in (c) in fig. 11 may be displayed, and the mobile phone may display the blue component on the interface 1112. Further, the mobile phone may receive an operation of the user adjusting the gray scale of the blue component displayed on the second screen.
It should be noted that the user does not have to adjust the gray scale of every color component on its corresponding interface. The user adjusts a color component's gray scale only when the user perceives a boundary line on the interface corresponding to that color component; if no boundary line is perceived, the user can directly click the save button to enter the next step (i.e., the adjustment interface of the next color component).
It can be understood that there may be a large error in performing the screen calibration according to only one picture, so that the user can adjust multiple pictures to improve the accuracy of the screen consistency calibration. For example, after the user has adjusted each color component of the first picture, the mobile phone may further display a second interface on the first screen and the second screen, where the second interface includes a second picture that is different from the first picture.
The second picture may be one of a preset set of solid-color pictures with different gray scales; for example, the second picture may be a solid-color picture with a gray-scale value of 180. Alternatively, the second picture is obtained by adding or subtracting a preset gray-scale value to or from the first picture; for example, if the gray-scale value of the second picture exceeds that of the first picture by 30 and the gray-scale value of the first picture is 60, the gray-scale value of the second picture is 60 + 30 = 90. Alternatively, the second picture may be obtained by performing preset processing on a second user screenshot, where the second user screenshot is different from the first user screenshot. Alternatively, the second picture is a solid-color picture in the preset gray-scale range selected by the user; for example, after the user has adjusted each color component of the first picture, the interface shown in fig. 10B may be displayed again, and the user may drag the slider 1009 on the slider bar 1008 to reselect a solid-color picture with a boundary (i.e., the second picture).
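Deriving the second picture's gray scale by stepping the first picture's gray scale could be sketched as follows; the step of 30 and the clamping to the 0-255 range are assumptions for illustration.

```python
def next_calibration_gray(first_gray, step=30, max_gray=255):
    """Gray scale of the next calibration picture: the previous picture's
    gray scale plus (or minus, for a negative step) a preset value,
    clamped to the valid gray-scale range."""
    return min(max_gray, max(0, first_gray + step))
```

For example, a first picture with gray-scale value 60 yields a second picture with gray-scale value 90.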
Further, the mobile phone may receive a sixth operation of the user, where the sixth operation is used to determine to perform screen calibration according to the second picture, and in response to the sixth operation, the mobile phone may display the first color component of the second picture on the first screen and the second screen; then, the mobile phone may receive a seventh operation of the user, where the seventh operation is used to adjust the gray scale of the first color component displayed on the second screen according to the gray scale of the first color component displayed on the first screen. The process of adjusting each color component of the second picture by the user may refer to the process of adjusting each color component of the first picture by the user, which is not described herein again.
Finally, after the user finishes adjusting each color component of each solid-color picture, the user waits for the screen calibration to take effect; the process by which the calibration takes effect may be as described in step 507:
507. Determine a compensation value of a first sub-pixel on the second screen according to a first difference value between the gray scale of the first color component displayed on the second screen after adjustment and before adjustment, where the first sub-pixel is the sub-pixel corresponding to the first color component, and compensate the first sub-pixel on the second screen according to that compensation value.
If the user only adjusts the first color component of a pure color picture (e.g., the first picture), the mobile phone may determine the compensation value of the first sub-pixel on the second screen according to a first difference between the adjusted gray scale of the first color component displayed on the second screen and the gray scale before the adjustment. As shown in (a) and (b) of fig. 12A, the first difference value may be a difference value (e.g., a gain value) between a gray level of the first color component when the slider 1102 is at the first position and a gray level of the first color component when the slider 1102 is at the second position.
For example, the mobile phone may determine a gain value of the first sub-pixel on the second screen according to the first difference value (for example, if the gray scale of the first sub-pixel on the second screen before adjustment is 100 and after adjustment is 92, the first difference value is 100 - 92 = 8, and the gain value may be 8/100 = 0.08), simulate a gamma adjustment curve for the first sub-pixel on the second screen according to the gain value, and determine the compensation value of the first sub-pixel on the second screen at all gray scales (e.g., 0-255) according to that gamma adjustment curve.
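The patent does not fix how the gamma adjustment curve is simulated from the single calibration point; one plausible sketch, assumed here, is to fit an exponent k so that the curve g' = 255·(g/255)^k passes through the user's adjustment (e.g. 100 → 92), then tabulate the signed compensation value at every gray level.

```python
import math

def gamma_compensation(gray_before, gray_after, max_gray=255):
    """Spread a one-point adjustment (e.g. 100 -> 92) over all gray levels.

    The exponent k is chosen so that g' = max_gray * (g / max_gray) ** k
    maps gray_before exactly to gray_after; the returned table holds the
    signed compensation value for each gray level 0..max_gray.
    """
    k = math.log(gray_after / max_gray) / math.log(gray_before / max_gray)
    return [round(max_gray * (g / max_gray) ** k) - g
            for g in range(max_gray + 1)]
```

With gray_before = 100 and gray_after = 92, the table entry at gray level 100 is -8, matching the user's adjustment, while gray levels 0 and 255 need no compensation since the curve is anchored at black and full white.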
Similarly, the mobile phone may determine a gain value of a second sub-pixel on the second screen according to the difference value before and after adjustment of the second color component of the first picture, simulate a gamma adjustment curve for the second sub-pixel on the second screen according to the gain value, and then determine the compensation value of the second sub-pixel on the second screen at all gray scales according to that curve, where the second color component corresponds to the second sub-pixel. The mobile phone can likewise determine a gain value of a third sub-pixel on the second screen according to the difference value before and after adjustment of the third color component of the first picture, simulate a gamma adjustment curve for the third sub-pixel according to the gain value, and determine the compensation value of the third sub-pixel on the second screen at all gray scales according to that curve, where the third color component corresponds to the third sub-pixel. Then, the mobile phone can compensate each sub-pixel on the second screen according to the compensation value of each sub-pixel on the second screen.
If the user adjusts a plurality of pure color pictures, for example, two pictures, which are respectively a first picture and a second picture, the compensation value of the first sub-pixel on the second screen may be determined according to a first difference value between the gray scale of the first color component after the adjustment of the first picture and the gray scale of the first color component before the adjustment, and a second difference value between the gray scale of the first color component after the adjustment of the second picture and the gray scale of the first color component before the adjustment.
For example, the mobile phone may determine a gain value of the first sub-pixel on the second screen according to the first difference value and the second difference value, simulate a gamma adjustment curve for the first sub-pixel according to the gain value, and then determine a compensation value of the first sub-pixel on the second screen at all gray levels according to the gamma adjustment curve of the first sub-pixel. The compensation value of the first sub-pixel on the second screen determined from the first difference and the second difference is more accurate than the compensation value of the first sub-pixel on the second screen determined from only the first difference.
Similarly, the mobile phone may determine a gain value of a second sub-pixel on the second screen according to a difference value before and after adjustment of the second color component of the first picture and a difference value before and after adjustment of the second color component of the second picture, simulate a gamma adjustment curve for the second sub-pixel according to the gain value, and then determine a compensation value of the second sub-pixel on the second screen at all gray levels according to the gamma adjustment curve of the second sub-pixel. The mobile phone can determine a gain value of a third sub-pixel on the second screen according to a difference value before and after adjustment of the third color component of the first picture and a difference value before and after adjustment of the third color component of the second picture, simulate a gamma adjustment curve for the third sub-pixel according to the gain value, and further determine a compensation value of the third sub-pixel on the second screen in all gray scales according to the gamma adjustment curve of the third sub-pixel. Then, the mobile phone can compensate each sub-pixel on the second screen according to the compensation value of each sub-pixel on the second screen.
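When two pictures give two calibration points for the same sub-pixel, the two differences must be merged into one adjustment curve. The patent leaves the combination method open; one simple assumed strategy is to fit, for each calibration point (gray_before, gray_after), the exponent of a curve g' = 255·(g/255)^k that passes through it, then average the per-point exponents.

```python
import math

def combined_exponent(points, max_gray=255):
    """Merge several calibration points into a single gamma-curve exponent.

    For each point (gray_before, gray_after), the exponent k of the curve
    g' = max_gray * (g / max_gray) ** k passing through that point is
    computed; averaging the per-point exponents is an assumed strategy,
    not the one fixed by the patent.
    """
    ks = [math.log(after / max_gray) / math.log(before / max_gray)
          for before, after in points]
    return sum(ks) / len(ks)
```

A curve fitted to both points tracks the sub-pixel's aging at two different luminances, which is one way to see why two pictures give a more accurate compensation than one.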
Optionally, before step 504, a step 503a may be further included:
503a. Receive a fourth operation of the user, where the fourth operation is used to adjust the backlight brightness of the folding screen.
Since each pixel of an OLED screen emits light independently, screen aging can give users different degrees of perceived screen-consistency difference at different backlight brightness levels. Therefore, the user can check each picture at each backlight brightness by adjusting the backlight brightness, and adjust any picture that does not satisfy screen consistency until it meets the user's screen-consistency requirement.
For example, as shown in (a) or (b) of fig. 12B, a slider bar 1010 may be further provided on the interface 1001, and the user may select the corresponding backlight brightness by dragging the slider 1011 on the slider bar 1010. The brightness range corresponding to the slider bar 1010 may be 0-255. Alternatively, a value selection box may be provided on the first interface; for example, the value selection box may correspond to a brightness range of 0-255, and the user may select the corresponding backlight brightness by modifying its value.
Optionally, a color-space selection interface can be provided for the user, allowing the user to customize the gray scales and color components to be adjusted, so as to provide a better screen display experience.
Based on the method provided in this embodiment of this application, when the user determines that a display boundary exists between the first screen and the second screen while the first picture is displayed, the user can adjust each color component of the first picture, and each sub-pixel on the second screen is compensated based on the difference value of each color component before and after adjustment. This avoids the yin-yang screen problem and can meet the screen-consistency requirements of users with different sensitivities and different usage habits.
Further, if the user determines that display boundaries exist when the first screen and the second screen display a plurality of different pictures (such as the first picture and the second picture), the user can adjust each color component of each of those pictures as displayed on the second screen, and each sub-pixel on the second screen is compensated based on the difference values of each color component of the first picture and the second picture before and after adjustment. This likewise avoids the yin-yang screen problem and can meet the screen-consistency requirements of users with different sensitivities and different usage habits.
It can be understood that, if the electronic device includes one display screen, that is, the electronic device is a single-screen display, the user may respectively determine whether a boundary exists between every two display areas on the display screen, and if a boundary exists, the user may perform corresponding adjustment, and the adjustment process may refer to the adjustment process of the folding screen including the two screens, which is not described herein again.
As shown in fig. 13, an embodiment of this application provides a pixel compensation method applied to an electronic device including a display screen. The following description takes as an example a display screen that is an outward-folding foldable screen, which can be folded to form a first screen, a second screen, and a third screen, where the first screen is the bending region, the second screen is the main screen, and the third screen is the sub-screen. The method includes:
1301. The electronic device displays a first interface on the first screen, the second screen, and the third screen, where the first interface includes a first picture.
Since the frequency with which a user uses each screen differs across scenarios (for example, the main screen is used most frequently, the sub-screen next, and the bending region least), the screens of the mobile phone also decay at different speeds, so the mobile phone may develop a yin-yang screen (that is, the brightness and/or hue of the main screen, the sub-screen, and the bending region are inconsistent). As a result, a boundary perceivable by the user may appear in the folding areas of the folding screen (the areas containing the boundary between the main screen and the bending region and the boundary between the sub-screen and the bending region), which affects the user experience. The user can adjust the screen consistency of the folding screen through the settings APP to eliminate the yin-yang screen problem to the greatest extent.
As shown in fig. 6, the user may click an icon 602 of the setting APP of the mobile phone on a desktop 601 of the mobile phone to enter a setting interface 603 as shown in (a) in fig. 7, may click a display control 604 on the setting interface 603 to enter a display interface 605 as shown in (b) in fig. 7, and may click a screen consistency control 606 on the display interface 605 to enter a screen consistency interface 801 as shown in fig. 8.
Alternatively, as shown in (a) of fig. 9, the user may click on the more settings control 607 on the settings interface 603 to enter the settings interface 608 shown in (b) of fig. 9, and the user may click on the screen consistency control 609 on the more settings interface 608 to enter the screen consistency interface 801 shown in fig. 8.
The user may select different ways to adjust the screen consistency of the folded screen at the screen consistency interface 801. The specific process may refer to the related description of step 501, which is not described herein.
After the user selects the screen consistency adjusting mode, the mobile phone can display a first interface on the first screen, the second screen and the third screen, wherein the first interface comprises a first picture. For the description of the first picture, reference may be made to step 501, which is not described herein.
1302. Receive a first operation of the user, where the first operation is used to determine to perform screen calibration according to the first picture.
When the mobile phone displays the first picture, a display boundary can exist at one or more positions, for example, the display boundary exists at the boundary of the first screen and the second screen, and/or the display boundary exists at the boundary of the first screen and the third screen.
If the user determines that the display boundary exists on the screen, the user can perform a first operation to determine to perform screen calibration according to the first picture. For example, the user may instruct the cell phone to calibrate the first picture by clicking a calibration button or by voice operation instructions.
For example, as shown in fig. 8, if the user selects mode 1, that is, the user clicks the control 802 on the screen consistency interface 801, the mobile phone may, after detecting this click, display the GUI shown in fig. 14, i.e., the first interface. The first interface 1401 includes a first picture, which may be, for example, a solid-color picture with a gray-scale value of 60. The first picture can fill the main screen, the sub-screen, and the bending region. On the first picture, some user prompt information and function buttons may be displayed, for example, prompt information 1402, an OK button 1403, a calibration button 1404, a dotted-line area 1405, and a dotted-line area 1406. The boundary between the sub-screen and the bending region is located in the dotted-line area 1405, and the boundary between the main screen and the bending region is located in the dotted-line area 1406; the user can observe the two areas respectively to determine whether a faint boundary exists in either of them. If the user determines that a boundary exists in the dotted-line area 1405 and/or the dotted-line area 1406, the user may click the calibration button 1404 to adjust; if the user determines that no obvious boundary exists in the dotted-line areas 1405 and 1406, the user can click the OK button to enter the next step, for example, a second interface including a second picture whose gray scale differs from that of the first picture, so that the user can check whether a display boundary exists when the screen displays the second picture.
1303. In response to the first operation, a first color component is displayed on the first screen, the second screen, and the third screen.
After the mobile phone determines that the first picture needs to be calibrated, the first picture can be decomposed into at least one color component, and each of the at least one color component is displayed on the first screen, the second screen, and the third screen respectively, so that the user can calibrate each color component separately. For the specific process, refer to step 503.
For example, assuming that the cellular phone decomposes the first picture into red, green, and blue components and displays the color components in accordance with the order of the red, green, and blue components, as shown in fig. 14, in response to an operation of the user clicking on the calibration button 1404, as shown in (a) in fig. 15, the cellular phone may display the red component on the interface 1501, and may also display some user prompt information and function buttons on the interface 1501, such as a slider 1502, a save button 1503, prompt information 1504, a dotted line region 1505, a dotted line region 1506, and a slider 1507. The user can observe the dotted line region 1505 and the dotted line region 1506 respectively to determine whether a light boundary exists in the dotted line region 1505 and the dotted line region 1506, if a boundary exists in the dotted line region 1505, the slider 1502 can be dragged to adjust, if a boundary exists in the dotted line region 1506, the slider 1507 can be dragged to adjust, and after the adjustment is completed, the user can click the save button 1503 to save the adjustment information and can continue to adjust the next color component.
When the mobile phone detects that the user clicks the save button 1503 on the interface 1501, the interface 1508 as shown in (b) in fig. 15 may be displayed, the mobile phone may display a green component on the interface 1508, the user may observe in two dotted line regions respectively, determine whether a light boundary exists in the dotted line region, if so, drag a corresponding slider for adjustment, and after adjustment, click the save button 1509 to save the adjustment information and may continue to adjust the next color component. When the mobile phone detects that the user clicks the save button 1509 on the interface 1508, the interface 1510 shown in (c) in fig. 15 may be displayed, the mobile phone may display the blue component on the interface 1510, and the user may perform corresponding adjustment in the interface 1510, and the specific process may refer to the above description, which is not described herein again.
1304. Receive a second operation of the user, where the second operation is used to adjust the gray scale of the first color component displayed on the second screen according to the gray scale of the first color component displayed on the first screen.
When the user adjusts the gray scale of the first color component displayed on the second screen according to the gray scale of the first color component displayed on the first screen, the user can shield the boundary between the first screen and the third screen, and the sight line of the user is prevented from being interfered. As shown in fig. 16 (a), when the user adjusts the gray scale of the first color component displayed on the main screen through the slider 1502, a black bar may be placed in the dotted area 1506 for shielding, so as to prevent interference with the user's sight.
1305. And responding to the second operation, and displaying the adjustment effect of the gray scale of the first color component on the second screen in real time.
1306. And receiving a third operation of the user, wherein the third operation is used for determining that the current adjusting effect meets the screen consistency.
After the user finishes adjusting the gray scale of the first color component displayed on the second screen, the user can continue to adjust the gray scale of the first color component displayed on the third screen. That is, the mobile phone may receive an eighth operation of the user, where the eighth operation is used to adjust the gray scale of the first color component displayed on the third screen according to the gray scale of the first color component displayed on the first screen, and in response to the eighth operation, display the adjustment effect of the gray scale of the first color component on the third screen in real time.
Optionally, when the user adjusts the gray scale of the first color component displayed on the third screen according to the gray scale of the first color component displayed on the first screen, the user may shield the boundary between the first screen and the second screen to prevent it from interfering with the user's line of sight. As shown in fig. 16 (b), when the user adjusts the gray scale of the first color component displayed on the sub-screen through the slider 1507, a black bar may be placed over the dotted line region 1505 for shielding, thereby avoiding interference with the user's line of sight.
1307. And determining a compensation value of a first sub-pixel on the second screen according to a first difference value between the gray scale of the first color component displayed on the second screen after adjustment and the gray scale of the first color component displayed on the second screen before adjustment, wherein the first sub-pixel is a sub-pixel corresponding to the first color component, and compensating the first sub-pixel on the second screen according to the compensation value of the first sub-pixel on the second screen.
The specific process may refer to step 507, which is not described herein again.
And the mobile phone may determine a compensation value of a first sub-pixel on the third screen according to a third difference between the adjusted gray scale of the first color component displayed on the third screen and the gray scale of the first color component displayed on the third screen before adjustment, where the first sub-pixel is a sub-pixel corresponding to the first color component, and compensate the first sub-pixel on the third screen according to the compensation value of the first sub-pixel on the third screen. The specific process may refer to step 507, which is not described herein again.
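The computation in step 1307 and its third-screen counterpart can be sketched as follows. The difference is taken directly from the source; the clamp to an 8-bit gray-scale range is an assumption for illustration, since the text does not spell out overflow handling.

```python
def compensation_value(adjusted_gray, original_gray):
    """Per step 1307: the compensation value for the sub-pixel of a color
    component is the difference between the gray scale the user settled on
    and the gray scale displayed before adjustment."""
    return adjusted_gray - original_gray

def compensate_subpixel(gray, compensation):
    """Apply a compensation value to one sub-pixel value. Clamping to the
    8-bit range 0..255 is an assumption; the patent does not specify
    overflow handling."""
    return max(0, min(255, gray + compensation))

# Example: the user lowered the red component on the second screen from
# gray scale 128 to 125, so every red sub-pixel on that screen is shifted
# by the same difference.
delta = compensation_value(125, 128)
```

The same two functions would apply unchanged to the third screen, using the third difference described above.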
Optionally, before step 1304, a step 1303a may be further included:
1303a, receiving a fourth operation of the user, wherein the fourth operation is used for adjusting the backlight brightness of the folding screen.
The specific process may refer to step 503 a.
Based on the method provided by the embodiment of the application, the user can adjust each color component of the first picture, and compensate each sub-pixel on the second screen and the third screen based on the difference value before and after each color component is adjusted, so that the requirements of users with different sensitivities and different use habits on the consistency of the screens can be met.
Furthermore, the user can adjust each color component of a plurality of pictures (such as the first picture and the second picture), and compensate each sub-pixel on the second screen and the third screen based on the difference value before and after adjustment of each color component of the first picture and the second picture, so that the requirements of users with different sensitivities and different use habits on screen consistency can be met.
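When several pictures are calibrated (such as the first picture and the second picture above), the per-picture differences must be merged into one compensation value. The text only says the compensation is determined "based on" both differences; plain averaging below is an illustrative assumption, not the claimed formula.

```python
def combined_compensation(differences):
    """Combine the before/after gray-scale differences measured on several
    calibration pictures into a single compensation value for a sub-pixel.
    Averaging is an assumption made for illustration; the source does not
    state how the first and second differences are combined."""
    return round(sum(differences) / len(differences))

# Example: the first picture gave a difference of -3, the second picture -5.
combined = combined_compensation([-3, -5])
```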
It can be understood that, if the foldable screen can be folded to form more than three screens (for example, four screens or five screens), the boundary (dividing line) between every two adjacent screens can be marked (outlined) on the display interface, and the user can judge whether a light boundary exists at each of them; if a boundary exists, the adjustment process can refer to that of the foldable screen including two screens or three screens, which is not repeated here.
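Generalizing to more than three screens, the set of dividing lines to mark grows with the panel count. A minimal sketch, assuming the panels fold side by side so that only adjacent panels share a dividing line:

```python
def adjacent_boundaries(num_screens):
    """For a folding screen with `num_screens` panels folded side by side,
    list the dividing line between every two adjacent panels, as the
    interface would mark them for the user to inspect one by one."""
    return [(i, i + 1) for i in range(1, num_screens)]

# A four-panel folding screen has three dividing lines to check.
boundaries = adjacent_boundaries(4)
```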
Fig. 17 is a block diagram of a software configuration of an electronic device according to an embodiment of the present application. The software system of the electronic device may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the invention takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of an electronic device.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. And the layers communicate with each other through an interface. In some embodiments, the Android system is divided into four layers, which are an application layer, an application framework layer, an Android runtime and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 17, the application package may include applications such as camera, gallery, calendar, phone call, map, settings, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 17, the application framework layer may include an activity manager, a window manager, a content provider, a resource manager, a notification manager, and the like, which is not limited in this embodiment.
Activity Manager (ActivityManager): used for managing the lifecycle of each application. Applications typically run in the operating system in the form of an Activity. For each Activity, there is an application record (ActivityRecord) in the Activity manager corresponding to it, which records the state of that Activity. The Activity manager can use this ActivityRecord as an identifier to schedule the Activity processes of the application.
Window manager (WindowManagerService): used for managing the graphical user interface (GUI) resources used on the screen, and may specifically be used to: obtain the size of the display screen, create and destroy windows, display and hide windows, lay out windows, manage focus, manage the input method, manage wallpaper, and the like.
The content provider is used to store and retrieve data and make the data accessible to applications. The data may include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, and the like.

The view system includes visual controls, such as controls for displaying text and controls for displaying pictures.

The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files.

The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present a notification in the form of a chart or scroll-bar text in the status bar at the top of the system, such as a notification of an application running in the background, or present a notification on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is played, the terminal vibrates, or an indicator light flashes.
In the embodiment of the application, the application framework layer may further include a screen consistency adjustment service, and the screen consistency adjustment service may present different display effects on the display screen according to user operations.
For example, as shown in fig. 8, after the user selects the screen consistency adjustment mode, the mobile phone performs screen calibration initialization, adjusts the gamma unit and/or the demura unit to an initial state, and optionally switches the screen backlight to a manual mode. In the first picture adjustment scene shown in fig. 10A, after detecting that the user instructs to calibrate the first picture (i.e., the operation of clicking the calibration button 1004), the electronic device displays an adjustment interface for the first color component as shown in (a) of fig. 11. After the touch sensor detects the operation of the user dragging the slider 1102, the screen consistency adjustment service may call the gamma adjustment unit in the AP driver, or call the demura unit in the DDIC driver, to compensate each sub-pixel on the second screen. If the user confirms that the screen consistency meets the standard, the user can click the save button; after the touch sensor detects that the user clicks the save button, the screen consistency adjustment service can write the adjustment result into the display terminal chip and save it into the file system, so that subsequent use of the product can meet the user's requirement.
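The event flow of the screen consistency adjustment service, as described above, can be sketched as follows. All class and method names here are illustrative assumptions; the actual AP/DDIC driver interfaces are not named in the text.

```python
# Hypothetical sketch: slider drags trigger a real-time compensation path
# (gamma or demura unit), and the save button persists the result so later
# sessions can reuse it.

class _RecordingUnit:
    """Stand-in for a driver-side compensation unit; records compensate() calls."""
    def __init__(self):
        self.calls = []

    def compensate(self, component, delta):
        self.calls.append((component, delta))

class ScreenConsistencyService:
    def __init__(self, gamma_unit, demura_unit, storage):
        self.gamma_unit = gamma_unit    # e.g. gamma adjustment unit in the AP driver
        self.demura_unit = demura_unit  # e.g. demura unit in the DDIC driver
        self.storage = storage          # file-system persistence (a dict here)
        self.pending = {}

    def on_slider_drag(self, component, delta, use_gamma=True):
        # Compensate in real time so the user sees the effect immediately.
        unit = self.gamma_unit if use_gamma else self.demura_unit
        unit.compensate(component, delta)
        self.pending[component] = delta

    def on_save(self):
        # Write the adjustment result for reuse by subsequent sessions.
        self.storage.update(self.pending)
        saved = dict(self.pending)
        self.pending.clear()
        return saved

service = ScreenConsistencyService(_RecordingUnit(), _RecordingUnit(), {})
service.on_slider_drag("red", -3)
```

Separating the real-time path (`on_slider_drag`) from persistence (`on_save`) mirrors the description: the user previews each drag immediately and commits only when satisfied.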
In addition, during the compensation adjustment period (that is, while the user drags the slider to adjust in real time), a hardware module that takes effect in real time may be used; for example, the gamma adjustment unit in the AP driver or the demura unit in the DDIC driver may be called to perform real-time compensation, so that the user can view the adjustment result in real time. After the adjustment is finished, when the compensation takes effect, a hardware module that takes effect in non-real time can be used according to the hardware platform capability of the terminal display product, so as to release more hardware resources to support other display effect characteristics.
The system library and the kernel layer below the application framework layer may be referred to as an underlying system, and the underlying system includes an underlying display system for providing display services, for example, the underlying display system includes a display driver in the kernel layer and a surface manager in the system library.
As shown in fig. 17, the Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system. The core library includes two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
As shown in fig. 17, the system library may include a plurality of function modules. For example: surface manager (surface manager), Media Libraries (Media Libraries), OpenGL ES, SGL, and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
OpenGL ES is used to implement three-dimensional graphics drawing, image rendering, compositing, and layer processing, among others.
SGL is a drawing engine for 2D drawing.
As shown in fig. 17, the kernel layer is a layer between hardware and software. The kernel layer at least comprises a DDIC driver, a camera driver, an AP driver, an audio driver, a sensor driver and the like.
Other embodiments of the present application further provide a pixel compensation device, which can be applied to the electronic device. The device is used for executing each function or step executed by the mobile phone in the method embodiment.
In the case of dividing each functional module according to each function, fig. 18 shows a schematic structural diagram of a possible electronic device in the above embodiments. The electronic device is configured to implement the methods described in the foregoing method embodiments, and specifically includes: a display unit 1801, a receiving unit 1802, and a processing unit 1803.
A display unit 1801, configured to support the electronic device to perform processes 501, 503, and 505 shown in fig. 5, and processes 1301, 1303, and 1305 shown in fig. 13. A receiving unit 1802, configured to support the electronic device to perform processes 502, 504, and 506 shown in fig. 5, and processes 1302, 1304, and 1306 shown in fig. 13. A processing unit 1803, configured to support the electronic device to perform process 507 shown in fig. 5, and process 1307 shown in fig. 13. For all relevant contents of each step of the above method embodiments, reference may be made to the functional description of the corresponding functional module, which is not repeated here.
Embodiments of the present application further provide a chip system, as shown in fig. 19, the chip system includes at least one processor 1901 and at least one interface circuit 1902. The processor 1901 and the interface circuit 1902 may be interconnected by wires. For example, the interface circuit 1902 may be used to receive signals from other devices (e.g., a memory of an electronic device). Also for example, the interface circuit 1902 may be used to send signals to other devices, such as the processor 1901. Illustratively, the interface circuit 1902 may read instructions stored in a memory and send the instructions to the processor 1901. The instructions, when executed by the processor 1901, may cause the electronic device to perform the various steps in the embodiments described above. Of course, the chip system may further include other discrete devices, which is not specifically limited in this embodiment of the present application.
The embodiment of the present application further provides a computer storage medium, where the computer storage medium includes computer instructions, and when the computer instructions are run on the electronic device, the electronic device is enabled to execute each function or step executed by the mobile phone in the foregoing method embodiment.
The embodiment of the present application further provides a computer program product, which when running on a computer, causes the computer to execute each function or step executed by the mobile phone in the above method embodiments.
Through the description of the above embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be embodied, essentially or in the part contributing to the prior art, or in whole or in part, in the form of a software product. The software product is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor (processor) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (16)

1. A pixel compensation method applied to an electronic device including a display screen including a first display region and a second display region, the method comprising:
displaying a first interface on the first display area and the second display area, wherein the first interface comprises a first picture, and a display boundary exists at the boundary of the first display area and the second display area;
receiving a first operation of a user, wherein the first operation is used for determining screen calibration according to the first picture;
in response to the first operation, displaying a first color component of the first picture in the first display area and the second display area;
receiving a second operation of a user, wherein the second operation is used for adjusting the gray scale of the first color component displayed on the second display area according to the gray scale of the first color component displayed on the first display area; the gray scale of the first color component displayed on the first display area is unchanged;
determining a compensation value of a first sub-pixel on the second display area according to a first difference value between the adjusted gray scale of the first color component displayed on the second display area and the gray scale of the first color component displayed on the second display area before adjustment, wherein the first sub-pixel is a sub-pixel corresponding to the first color component, and compensating the first sub-pixel on the second display area according to the compensation value of the first sub-pixel on the second display area.
2. The pixel compensation method according to claim 1,
the first picture is a pure-color picture comprising one color; or
The first picture is a picture with patterns and comprises at least two colors.
3. The pixel compensation method according to claim 1 or 2,
the first picture is one of a group of preset pure color pictures with different gray scales; or
The first picture is selected from a pure color picture in a preset gray scale range by the user; or
The first picture is a first user screenshot; or
The first picture is obtained by carrying out preset processing on a first user screenshot, and when the first user screenshot is displayed in the first display area and the second display area, a display boundary exists at the boundary of the first display area and the second display area.
4. The pixel compensation method according to any one of claims 1 to 3,
the first color component is a red component, a green component, a blue component, a yellow component, a cyan component, or a magenta component.
5. The pixel compensation method according to any one of claims 1-4, wherein after receiving the second operation by the user, the method further comprises:
responding to the second operation, and displaying the adjustment effect of the gray scale of the first color component in the second display area in real time;
and receiving a third operation of the user, wherein the third operation is used for determining that the current adjustment effect meets the screen consistency.
6. The pixel compensation method according to any one of claims 1-5, wherein the receiving a first operation of a user comprises:
receiving click operation of the user on a calibration button displayed on the display screen; or
And receiving a voice operation instruction of the user, wherein the voice operation instruction is used for indicating to calibrate the first picture.
7. The pixel compensation method according to any one of claims 1-6, wherein the receiving a second operation of a user comprises:
receiving the dragging operation of the user on a sliding block on a sliding strip displayed on the second display area and used for adjusting the gray scale; or
And receiving the input operation of the user on the numerical value selection frame which is displayed on the second display area and is used for adjusting the gray scale.
8. The pixel compensation method according to any one of claims 1-7, wherein, prior to receiving the second operation by the user, the method further comprises:
and receiving a fourth operation of the user, wherein the fourth operation is used for adjusting the backlight brightness of the display screen.
9. The pixel compensation method according to any one of claims 1 to 8,
the adjusted gray scale of the first color component displayed on the second display area is smaller than or larger than the gray scale of the first color component displayed on the first display area.
10. A pixel compensation method according to any one of claims 1-9, wherein the method further comprises:
displaying a second color component of the first picture in the first display area and the second display area;
receiving a fifth operation of the user, wherein the fifth operation is used for adjusting the gray scale of the second color component displayed on the second display area according to the gray scale of the second color component displayed on the first display area; the gray scale of the second color component displayed on the first display region is unchanged.
11. A pixel compensation method according to any one of claims 1-10, wherein before determining the compensation value for the first sub-pixel in the second display region based on the first difference value, the method further comprises:
displaying a second interface on the first display area and the second display area, wherein the second interface comprises a second picture, the second picture is different from the first picture, and a display boundary exists at the boundary of the first display area and the second display area;
receiving a sixth operation of the user, wherein the sixth operation is used for determining screen calibration according to the second picture;
in response to the sixth operation, displaying a first color component of the second picture in the first display area and the second display area;
receiving a seventh operation of a user, wherein the seventh operation is used for adjusting the gray scale of the first color component displayed on the second display area according to the gray scale of the first color component displayed on the first display area;
the determining a compensation value of a first sub-pixel on the second display area according to a first difference between the adjusted gray scale of the first color component displayed on the second display area and the gray scale of the first color component displayed on the second display area before the adjustment includes:
and determining a compensation value of the first sub-pixel on the second display area according to the first difference value and a second difference value between the gray scale of the first color component after the second picture is adjusted and the gray scale of the first color component before the second picture is adjusted.
12. The pixel compensation method according to claim 11,
the second picture is one of a group of preset pure color pictures with different gray scales; or
The second picture is selected from a pure color picture in a preset gray scale range by the user; or
The second picture is a second user screenshot; or
The second picture is obtained by performing preset processing on a second user screenshot, and when the first display area and the second display area display the second user screenshot, a display boundary exists at the boundary of the first display area and the second display area.
13. The pixel compensation method of any one of claims 1-12, wherein the display screen further comprises a third display area, and wherein displaying the first interface in the first display area and the second display area comprises:
displaying the first interface in the first display area, the second display area and the third display area, wherein the first interface comprises the first picture, and a display boundary exists at the boundary of the first display area and the third display area;
the displaying a first color component of the first picture in the first display region and the second display region includes:
displaying the first color component of the first picture in the first display region, the second display region, and the third display region;
after the receiving the first operation of the user, the method further comprises:
receiving an eighth operation of a user, wherein the eighth operation is used for adjusting the gray scale of the first color component displayed on the third display area according to the gray scale of the first color component displayed on the first display area;
determining a compensation value of a first sub-pixel on the third display area according to a third difference value between the adjusted gray scale of the first color component displayed on the third display area and the gray scale of the first color component displayed on the third display area before adjustment, wherein the first sub-pixel is a sub-pixel corresponding to the first color component, and compensating the first sub-pixel on the third display area according to the compensation value of the first sub-pixel on the third display area.
14. The pixel compensation method of any one of claims 1-13, wherein the display screen is a folded screen.
15. An electronic device comprising a processor and a display screen, the processor coupled to the display screen, the processor configured to cause the electronic device to perform the pixel compensation method of any of claims 1-14.
16. A computer-readable storage medium comprising instructions that, when executed on a computer, cause the computer to perform the pixel compensation method of any one of claims 1-14.
CN201910945742.4A 2019-09-30 2019-09-30 Pixel compensation method and electronic equipment Active CN112581903B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910945742.4A CN112581903B (en) 2019-09-30 2019-09-30 Pixel compensation method and electronic equipment

Publications (2)

Publication Number Publication Date
CN112581903A true CN112581903A (en) 2021-03-30
CN112581903B CN112581903B (en) 2022-05-06

Family

ID=75117074

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910945742.4A Active CN112581903B (en) 2019-09-30 2019-09-30 Pixel compensation method and electronic equipment

Country Status (1)

Country Link
CN (1) CN112581903B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI778703B (en) * 2021-07-09 2022-09-21 敦泰電子股份有限公司 One-site oled panel demura and lirc method

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080238860A1 (en) * 2007-03-29 2008-10-02 Oki Electric Industry Co., Ltd. Liquid crystal display apparatus
EP2469505A1 (en) * 2010-12-23 2012-06-27 Research In Motion Limited Handheld electronic communication device having an age compensating display
US20130016081A1 (en) * 2011-07-11 2013-01-17 Samsung Electronics Co., Ltd. Display apparatus having uniformity correction function and control method thereof
CN102915172A (en) * 2011-08-03 2013-02-06 中兴通讯股份有限公司 Method and device for managing display screen
EP2701140A1 (en) * 2012-08-23 2014-02-26 BlackBerry Limited Organic light emitting diode based display aging monitoring
CN103680469A (en) * 2013-12-23 2014-03-26 深圳市华星光电技术有限公司 Liquid crystal display and adjusting method for brightness and contrast degree of liquid crystal display
CN104091565A (en) * 2014-07-03 2014-10-08 广东威创视讯科技股份有限公司 Method and system for correcting full-screen uniformity of display device
CN104143320A (en) * 2013-05-06 2014-11-12 咏传电子科技(上海)有限公司 Brightness compensation method and display control device and image display device thereof
CN105405392A (en) * 2015-12-07 2016-03-16 西安诺瓦电子科技有限公司 Method for compensating luminance- and chroma-differences between subareas during LED display screen subarea correction
CN107274848A (en) * 2017-07-28 2017-10-20 北京京东方多媒体科技有限公司 A kind of method, the debugging apparatus of mosaic screen and system for debugging mosaic screen
CN107591134A (en) * 2017-08-16 2018-01-16 深圳创维-Rgb电子有限公司 The compensation method of MURA phenomenons, TV and computer-readable recording medium
US20180144719A1 (en) * 2016-11-23 2018-05-24 Samsung Display Co., Ltd. Display device and method of compensating luminance of the same
US20180182285A1 (en) * 2016-06-17 2018-06-28 Boe Technology Group Co., Ltd. Method and apparatus for establishing luminance compensation model, method and apparatus for compensating for luminance of display screen, and display device
US20190073946A1 (en) * 2017-09-04 2019-03-07 Shanghai Tianma AM-OLED Co., Ltd. Display panel and display device
CN109658900A (en) * 2019-02-28 2019-04-19 京东方科技集团股份有限公司 Driving method, compensation circuit and driving device, the display device of display panel

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080238860A1 (en) * 2007-03-29 2008-10-02 Oki Electric Industry Co., Ltd. Liquid crystal display apparatus
EP2469505A1 (en) * 2010-12-23 2012-06-27 Research In Motion Limited Handheld electronic communication device having an age compensating display
US20130016081A1 (en) * 2011-07-11 2013-01-17 Samsung Electronics Co., Ltd. Display apparatus having uniformity correction function and control method thereof
CN102915172A (en) * 2011-08-03 2013-02-06 中兴通讯股份有限公司 Method and device for managing display screen
EP2701140A1 (en) * 2012-08-23 2014-02-26 BlackBerry Limited Organic light emitting diode based display aging monitoring
CN104143320A (en) * 2013-05-06 2014-11-12 咏传电子科技(上海)有限公司 Brightness compensation method, and display control device and image display device using the same
CN103680469A (en) * 2013-12-23 2014-03-26 深圳市华星光电技术有限公司 Liquid crystal display and adjusting method for brightness and contrast degree of liquid crystal display
CN104091565A (en) * 2014-07-03 2014-10-08 广东威创视讯科技股份有限公司 Method and system for correcting full-screen uniformity of display device
CN105405392A (en) * 2015-12-07 2016-03-16 西安诺瓦电子科技有限公司 Method for compensating luminance- and chroma-differences between subareas during LED display screen subarea correction
US20180182285A1 (en) * 2016-06-17 2018-06-28 Boe Technology Group Co., Ltd. Method and apparatus for establishing luminance compensation model, method and apparatus for compensating for luminance of display screen, and display device
US20180144719A1 (en) * 2016-11-23 2018-05-24 Samsung Display Co., Ltd. Display device and method of compensating luminance of the same
CN107274848A (en) * 2017-07-28 2017-10-20 Beijing BOE Multimedia Technology Co., Ltd. Method, debugging apparatus and system for debugging a spliced display screen
CN107591134A (en) * 2017-08-16 2018-01-16 深圳创维-Rgb电子有限公司 Compensation method for the mura phenomenon, television, and computer-readable storage medium
US20190073946A1 (en) * 2017-09-04 2019-03-07 Shanghai Tianma AM-OLED Co., Ltd. Display panel and display device
CN109658900A (en) * 2019-02-28 2019-04-19 BOE Technology Group Co., Ltd. Driving method, compensation circuit, driving device, and display device for a display panel

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
KIMMO KERÄNEN; JYRKI OLLILA; ESA-MATTI SARJANOJA; SAMULI YRJÄNÄ: "Large area flexible lighting element controlled by mobile phone user interface", 2016 6th Electronic System-Integration Technology Conference (ESTC) *
WANG Yuqing; LIU Weiya; DING Tiefu; ZHENG Xifeng; WANG Ruiguang; XU Xiuzhi; CHEN Yu; WANG Yang: "Method for evaluating luminance uniformity of LED display screens based on CCD images", Chinese Journal of Liquid Crystals and Displays *
WEI Shengyu: "Analysis of inspection methods for OLED display screens", Electronic Product Reliability and Environmental Testing *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI778703B (en) * 2021-07-09 2022-09-21 敦泰電子股份有限公司 One-site OLED panel demura and LIRC method

Also Published As

Publication number Publication date
CN112581903B (en) 2022-05-06

Similar Documents

Publication Publication Date Title
CN112217923B (en) Display method of flexible screen and terminal
CN109814766B (en) Application display method and electronic equipment
CN112130742B (en) Full screen display method and device of mobile terminal
CN109274828B (en) Method for generating screenshot, control method and electronic equipment
CN114679537A (en) Shooting method and terminal
CN112445448B (en) Flexible screen display method and electronic equipment
CN111327814A (en) Image processing method and electronic equipment
CN110956939B (en) Method for adjusting screen brightness and electronic equipment
CN114556294A (en) Theme switching method and theme switching device
CN114089932A (en) Multi-screen display method and device, terminal equipment and storage medium
CN114579016A (en) Method for sharing input equipment, electronic equipment and system
WO2022143180A1 (en) Collaborative display method, terminal device, and computer readable storage medium
CN113641271A (en) Application window management method, terminal device and computer readable storage medium
CN112449101A (en) Shooting method and electronic equipment
WO2020233593A1 (en) Method for displaying foreground element, and electronic device
CN112581903B (en) Pixel compensation method and electronic equipment
CN115798390A (en) Screen display method and terminal equipment
CN115729346A (en) Interface display method and electronic equipment
CN113495733A (en) Theme pack installation method and device, electronic equipment and computer readable storage medium
WO2023207844A1 (en) Dynamic wallpaper display method and apparatus, and electronic device
CN117009005A (en) Display method, automobile and electronic equipment
CN116974658A (en) Interface layout method and device
CN115619628A (en) Image processing method and terminal device
CN115657992A (en) Screen display method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant