CN113808030B - Noise monitoring method, electronic equipment and chip system - Google Patents
Noise monitoring method, electronic equipment and chip system
- Publication number
- CN113808030B (application CN202110606261.8A / CN202110606261A)
- Authority
- CN
- China
- Prior art keywords
- image
- time
- hwc
- module
- processor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J1/00—Photometry, e.g. photographic exposure meter
- G01J1/42—Photometry, e.g. photographic exposure meter using electric radiation detectors
- G01J1/4204—Photometry, e.g. photographic exposure meter using electric radiation detectors with determination of ambient light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Life Sciences & Earth Sciences (AREA)
- Sustainable Development (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
The embodiment of the application provides a noise monitoring method, an electronic device, and a chip system, relates to the technical field of ambient light sensors, and can solve the problem of excessive power consumption of the electronic device. The method comprises the following steps: the ambient light sensor of the electronic device collects ambient light once per collection period; before each collection, a memory write-back function is started so that the image noise during the collection can be obtained; after each collection, the memory write-back function is stopped so that the electronic device does not calculate image noise outside the collection window; power consumption is reduced by cyclically starting and stopping the memory write-back function. Because the noise interfering with the ambient light may be related to the image displayed on the screen at the moment collection starts, the display can be forcibly refreshed after the memory write-back function is started to obtain the image currently shown on the screen, from which the interfering noise is then derived.
Description
Technical Field
The embodiments of the application relate to the field of ambient light sensors, and in particular to a control method of an electronic device, an electronic device, and a chip system.
Background
With the development of electronic devices, the screen-to-body ratio of electronic devices has become higher and higher. In pursuit of a high screen-to-body ratio, the ambient light sensor of an electronic device may be disposed below its OLED (Organic Light-Emitting Diode) screen. Because the OLED screen emits light itself, the ambient light collected by a sensor placed below it includes the light emitted by the screen, making the collected ambient light inaccurate.
Currently, to measure ambient light accurately, both the ambient light collected by the ambient light sensor and the noise generated by the display screen of the electronic device may be obtained; the true ambient light is then derived from these two quantities. In this method, the noise generated by the display screen is related to the image the display screen shows, so the displayed image needs to be acquired.
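By way of illustration only, the compensation idea described above can be sketched in C++. All names here (RawSample, estimateDisplayNoise, kCouplingFactor) and the linear noise model are assumptions of this description; the patent does not define a concrete API or formula.

```cpp
// Minimal sketch of the noise-compensation idea, assuming a linear model.
#include <algorithm>

struct RawSample {
    float lux;  // value reported by the under-display ambient light sensor
};

// Noise contributed by the OLED pixels above the sensor, estimated from the
// image currently shown there and the panel backlight level (illustrative).
float estimateDisplayNoise(float avgPixelLuma, float backlightLevel) {
    constexpr float kCouplingFactor = 0.8f;  // assumed device-specific constant
    return kCouplingFactor * avgPixelLuma * backlightLevel;
}

float trueAmbientLux(const RawSample& s, float avgPixelLuma, float backlightLevel) {
    // True ambient light = raw reading minus self-emission noise, clamped so
    // numerical error cannot produce a negative lux value.
    return std::max(0.0f, s.lux - estimateDisplayNoise(avgPixelLuma, backlightLevel));
}
```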
Disclosure of Invention
The embodiment of the application provides a control method of an electronic device, an electronic device, and a chip system, which solve the problem of excessive power consumption when the electronic device acquires noise.
To achieve this, the following technical solutions are adopted:
in a first aspect, an embodiment of the present application provides a noise monitoring method applied to an electronic device, where the electronic device includes: a HWC (Hardware Composer) module, a display subsystem, and a noise algorithm library, the method comprising:
in response to receiving first information, the HWC module sets a write-back flag to a first flag;
in response to receiving a first image, the HWC module queries the write-back flag and finds it to be the first flag;
the HWC module sends the first image to the display subsystem based on the first flag;
the display subsystem stops storing, to a write-back memory of the electronic device, a second image that includes a first target image on the first image, where the first target image is an image within a first region;
in response to reaching a first time, the HWC module sets the write-back flag to a second flag;
the HWC module acquires a third image;
the HWC module queries the write-back flag and finds it to be the second flag;
the HWC module sends the third image and second information to the display subsystem based on the second flag, the second information instructing the display subsystem to store, in the write-back memory of the electronic device, a fourth image that includes a second target image on the third image;
in response to receiving the third image and the second information, the display subsystem stores, to the write-back memory of the electronic device, the fourth image that includes the second target image on the third image, the second target image being an image within the first region;
the HWC module acquires the second target image from the write-back memory;
the HWC module sends the second target image to the noise algorithm library;
and the noise algorithm library calculates first image noise based on the second target image.
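By way of illustration, the flow above can be modeled in code. The following C++ sketch uses simplified stand-in types (WriteBackFlag, HwcModule, DisplaySubsystem); these are assumptions of this description, not the real Android Hardware Composer interfaces. It shows only how the write-back flag gates whether a composed frame is stored to write-back memory.

```cpp
// Minimal model of the write-back gating described in the first aspect.
#include <optional>

enum class WriteBackFlag { Stopped /* "first flag" */, Enabled /* "second flag" */ };

struct Image { /* pixel data elided */ };

struct DisplaySubsystem {
    std::optional<Image> writeBackMemory;
    // Compose a frame; optionally keep the region above the sensor in
    // write-back memory so the HWC module can fetch it later.
    void compose(const Image& img, bool storeTargetRegion) {
        if (storeTargetRegion) writeBackMemory = img;
        // ...send img to the panel (elided)...
    }
};

class HwcModule {
public:
    void onFirstInformation() { flag_ = WriteBackFlag::Stopped; }  // sensor finished collecting
    void onFirstTimeReached() { flag_ = WriteBackFlag::Enabled; }  // next acquisition is imminent
    void onImage(const Image& img, DisplaySubsystem& dss) {
        // Query the flag; write-back is requested only while it is enabled.
        dss.compose(img, flag_ == WriteBackFlag::Enabled);
    }
private:
    WriteBackFlag flag_ = WriteBackFlag::Stopped;
};
```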
In the embodiment of the application, each time the ambient light sensor finishes acquiring ambient light, the SCP processor sends first information to the AP processor; on the AP processor side, the HWC module sets the write-back flag to the first flag and the memory write-back function is stopped. Before the ambient light sensor next acquires ambient light, for example at the first time, the HWC module sets the write-back flag to the second flag and may also force a refresh of an image, such as the third image. When the write-back flag is the second flag and an image is refreshed, a target image of the refreshed image, i.e. the target image on the third image, is obtained; the HWC sends the target image obtained from the third image to the noise algorithm library, and the noise algorithm library calculates the image noise. In this way, the HWC is controlled so that it obtains the target image of the currently refreshed image only while the ambient light sensor is collecting ambient light, and no longer obtains it when an image is refreshed outside that period. In effect, whether the HWC crops the target image out of the currently refreshed image is controlled by the write-back flag. The embodiment of the application thus reduces power consumption by cyclically starting and stopping the memory write-back function.
In a possible implementation manner of the first aspect, the first information includes a first duration, where the first duration is the duration for which the display subsystem stops storing images to the write-back memory; the first time is the moment at which the first duration has elapsed after the write-back flag was set to the first flag.
Alternatively, the first information includes the first duration, a first value, and a second time, where the first duration is the duration for which the display subsystem stops storing images to the write-back memory, and the second time is the end time at which the ambient light sensor of the electronic device acquired the first value; the first time is the moment at which a second duration has elapsed after the write-back flag was set to the first flag, the second duration being the first duration minus a delay, and the delay being the time at which the HWC module received the first information minus the second time.
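A worked numeric example of the two ways of deriving the first time, under assumed millisecond values (all numbers here are illustrative, not from the patent):

```cpp
// Both variants of the first-time computation, with assumed timestamps in ms.
#include <cstdio>

int main() {
    // Variant 1: first time = flag-set time + first duration.
    long flagSetTime = 1000, firstDuration = 400;
    long firstTime1 = flagSetTime + firstDuration;   // 1400

    // Variant 2: the first information also carries the second time (end of the
    // previous acquisition). The delay is the time the first information spent
    // in transit, so the wait is shortened accordingly.
    long secondTime = 990, receiveTime = 1010;
    long delay = receiveTime - secondTime;           // 20
    long secondDuration = firstDuration - delay;     // 380
    long firstTime2 = flagSetTime + secondDuration;  // 1380

    std::printf("%ld %ld\n", firstTime1, firstTime2);
    return 0;
}
```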
In one possible implementation of the first aspect, the HWC module acquiring the third image includes:
the HWC module sends a first signal to the SurfaceFlinger of the electronic device;
in response to receiving the first signal, the SurfaceFlinger acquires a cached first display parameter and sends it to the HWC module, where the first display parameter is the most recently cached of the display parameters cached by the SurfaceFlinger;
the HWC module derives the third image based on the first display parameter.
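For illustration only, the forced-refresh path just described might look like the sketch below. SurfaceFlingerStub, DisplayParams, and composeFrom are hypothetical stand-ins invented for this example; the real SurfaceFlinger API is not shown in the patent.

```cpp
// Sketch of the forced-refresh path: the HWC module asks SurfaceFlinger to
// resend the most recently cached display parameters and rebuilds the frame.
#include <deque>

struct DisplayParams { int x, y, w, h; unsigned color; const void* bufferAddr; };
struct Image { /* composed frame, elided */ };

struct SurfaceFlingerStub {
    std::deque<DisplayParams> cache;  // assumed non-empty while the screen is on
    // On the "first signal", return the newest cached entry.
    DisplayParams onFirstSignal() const { return cache.back(); }
};

Image composeFrom(const DisplayParams& p) {
    // Re-run composition from the cached parameters; the result is the
    // "third image" currently shown on screen.
    (void)p;
    return Image{};
}

// Usage: Image third = composeFrom(sf.onFirstSignal());
```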
In one possible implementation manner of the first aspect, after the HWC module sets the write-back flag to the second flag and before the HWC module acquires the third image, the method further includes:
the HWC module acquires the time at which an image was last refreshed on the electronic device;
if the time at which the electronic device last refreshed an image satisfies a first preset condition, the HWC module acquires the third image.
In one possible implementation manner of the first aspect, after the HWC module obtains the time at which an image was last refreshed on the electronic device, the method further includes:
if the time at which the electronic device last refreshed an image does not satisfy the first preset condition, the HWC module waits for the SurfaceFlinger module of the electronic device to send a second display parameter.
In a possible implementation manner of the first aspect, if the time at which the electronic device last refreshed an image satisfies the first preset condition, the HWC module acquiring the first image includes:
if the time at which the electronic device last refreshed an image satisfies the first preset condition, the HWC module waits for a second duration;
if the HWC module does not receive a third display parameter sent by the SurfaceFlinger within the second duration, the HWC module acquires the first image.
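A sketch of the "wait a bounded duration for new display parameters, otherwise fall back" pattern above. ParamWaiter and its methods are hypothetical; the patent does not specify the synchronization mechanism.

```cpp
// Wait up to the second duration for SurfaceFlinger to deliver parameters.
#include <chrono>
#include <condition_variable>
#include <mutex>
#include <optional>
#include <utility>

struct DisplayParams { /* elided */ };

class ParamWaiter {
public:
    // Returns the parameters if they arrived in time, std::nullopt on timeout
    // (in which case the caller acquires the image itself).
    std::optional<DisplayParams> waitFor(std::chrono::milliseconds secondDuration) {
        std::unique_lock<std::mutex> lk(m_);
        if (cv_.wait_for(lk, secondDuration, [this] { return pending_.has_value(); }))
            return std::exchange(pending_, std::nullopt);
        return std::nullopt;
    }
    // Called on the delivering thread when a display parameter arrives.
    void deliver(DisplayParams p) {
        { std::lock_guard<std::mutex> lk(m_); pending_ = p; }
        cv_.notify_one();
    }
private:
    std::mutex m_;
    std::condition_variable cv_;
    std::optional<DisplayParams> pending_;
};
```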
In a possible implementation manner of the first aspect, the method further includes:
if the HWC module receives a fourth display parameter sent by the SurfaceFlinger within the second duration, the HWC module acquires a fifth image based on the fourth display parameter;
the HWC module queries the write-back flag and finds it to be the second flag;
the HWC module sends the fifth image and third information to the display subsystem based on the second flag;
in response to receiving the fifth image and the third information, the display subsystem stores, in the write-back memory of the electronic device, a sixth image that includes a third target image on the fifth image, the third target image being an image within the first region;
the HWC module obtains the third target image from the write-back memory;
the HWC module sends the third target image to the noise algorithm library;
and the noise algorithm library calculates second image noise based on the third target image.
In a possible implementation manner of the first aspect, the first information includes a first value and a second time, where the second time is the end time at which the ambient light sensor of the electronic device acquired the first value;
the time at which the electronic device last refreshed an image satisfying the first preset condition includes:
the time at which the electronic device last refreshed an image is later than the second time;
the time at which the electronic device last refreshed an image not satisfying the first preset condition includes:
the time at which the electronic device last refreshed an image is earlier than or equal to the second time.
In a possible implementation manner of the first aspect, the first information further includes a first value and a second time, where the second time is the end time at which the ambient light sensor of the electronic device acquired the first value, and the time at which the electronic device last refreshed an image satisfying the first preset condition includes:
a first difference between the time at which the electronic device last refreshed an image and the current time is smaller than a second difference between the second time and the current time;
the time at which the electronic device last refreshed an image not satisfying the first preset condition includes:
the first difference between the time at which the electronic device last refreshed an image and the current time is greater than or equal to the second difference between the second time and the current time.
In a possible implementation manner of the first aspect, the time at which the electronic device last refreshed an image satisfying the first preset condition includes:
the difference between the time at which the electronic device last refreshed an image and the time at which the HWC module last obtained a target image is smaller than a first threshold, where the target image is the image displayed in the region of the display screen above the ambient light sensor of the electronic device;
the time at which the electronic device last refreshed an image not satisfying the first preset condition includes:
the difference between the time at which the electronic device last refreshed an image and the time at which the HWC module last obtained a target image is greater than or equal to the first threshold.
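The three alternative forms of the first preset condition can be written as simple timestamp predicates, as in the sketch below (timestamps assumed in milliseconds; function names are illustrative).

```cpp
// Predicates for the three variants of the "first preset condition".
#include <cstdlib>

// Variant 1: the last refresh happened after the acquisition window ended.
bool conditionA(long lastRefresh, long secondTime) {
    return lastRefresh > secondTime;
}

// Variant 2: measured from "now", the refresh is the more recent event.
bool conditionB(long lastRefresh, long secondTime, long now) {
    return (now - lastRefresh) < (now - secondTime);
}

// Variant 3: refresh and last target-image capture are close in time
// (read here as the difference between the two timestamps; an assumption).
bool conditionC(long lastRefresh, long lastTargetImage, long firstThreshold) {
    return std::labs(lastRefresh - lastTargetImage) < firstThreshold;
}
```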
In a possible implementation manner of the first aspect, the method further includes:
after the HWC module sets the write-back flag to the first flag, the HWC module monitors whether data in a kernel node of the electronic device has changed, the kernel node storing a luminance value;
in response to detecting that the data in the kernel node of the electronic device has changed, the HWC module obtains a first luminance value from the kernel node;
after the HWC module obtains the first luminance value from the kernel node, in response to detecting that the data in the kernel node has changed again, the HWC module obtains a second luminance value from the kernel node;
in response to reaching the first time, the HWC module sends the second luminance value to the noise algorithm library.
In a possible implementation manner of the first aspect, the method further includes:
after the HWC module sets the write-back flag to the second flag, the HWC module monitors whether data in a kernel node of the electronic device has changed, the kernel node storing a luminance value;
in response to detecting that the data in the kernel node of the electronic device has changed, the HWC module obtains a third luminance value from the kernel node;
the HWC module sends the third luminance value to the noise algorithm library;
after the HWC module sends the third luminance value to the noise algorithm library, in response to detecting that the data in the kernel node has changed again, the HWC module obtains a fourth luminance value from the kernel node;
the HWC module sends the fourth luminance value to the noise algorithm library.
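For illustration, reading a kernel node that stores a brightness value might look like the sketch below. The node path is hypothetical (real devices typically expose backlight state somewhere under /sys/class/backlight/), and the change-detection is simplified to polling.

```cpp
// Sketch of reading a sysfs-style kernel node holding the backlight value.
#include <fstream>
#include <optional>
#include <string>

std::optional<int> readBrightness(const std::string& nodePath) {
    std::ifstream node(nodePath);
    int value = -1;
    if (node >> value) return value;
    return std::nullopt;  // node missing or unreadable
}

// Caller keeps the last value and forwards a new one to the noise algorithm
// library only when the node content actually changed.
bool luminanceChanged(int& lastValue, const std::string& nodePath) {
    auto v = readBrightness(nodePath);
    if (!v || *v == lastValue) return false;
    lastValue = *v;
    return true;
}
```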
In one possible implementation manner of the first aspect, the noise algorithm library calculating the first image noise based on the second target image includes:
the noise algorithm library calculates the first image noise based on the second target image and the second luminance value.
In one possible implementation of the first aspect, the HWC module receiving the first image includes:
the HWC module receives a fifth display parameter sent by the SurfaceFlinger module of the electronic device;
the HWC module derives the first image based on the fifth display parameter.
In a possible implementation manner of the first aspect, the first area is an area on a display screen of the electronic device, which is located above an ambient light sensor of the electronic device.
In one possible implementation manner of the first aspect, the HWC module acquiring the time at which the electronic device last refreshed an image includes:
the HWC module receives a sixth display parameter sent by the SurfaceFlinger module of the electronic device;
the HWC module stores the time at which it received the sixth display parameter;
the HWC module obtains the stored time at which the sixth display parameter was received, where that time is the most recent reception time of a display parameter stored by the HWC module before the time at which the electronic device last refreshed an image is obtained.
In a possible implementation manner of the first aspect, the first display parameter includes: the position, size, and color on the display screen of the electronic device of the interface to be composited into the third image, and a storage address.
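An illustrative layout of the display-parameter fields listed above is sketched below; the field names and types are assumptions of this description, not the real SurfaceFlinger layer state.

```cpp
// Assumed shape of the "first display parameter" for one interface (layer).
#include <cstdint>

struct LayerDisplayParams {
    int32_t  x, y;            // position of the interface on the display screen
    int32_t  width, height;   // size of the interface
    uint32_t colorFormat;     // color / pixel format of the layer
    uint64_t bufferAddress;   // storage address of the buffer to composite
};
```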
In a second aspect, an embodiment of the present application provides a noise monitoring method, which is applied to an electronic device, where the electronic device includes: a first processor, the method comprising:
the first processor receives first information;
after the first processor receives the first information, in response to receiving a first image, the first processor stops acquiring a first target image from the first image, wherein the first target image is an image in a first area;
after the first time is reached, the first processor acquires a third image;
the first processor acquires a second target image from the third image, wherein the second target image is an image in the first area.
In the embodiment of the application, the ambient light sensor of the electronic device collects ambient light once per collection period. Before each collection, the memory write-back function is started so that a target image of the refreshed image, and hence the image noise during the collection, can be obtained; after each collection, the memory write-back function is stopped so that the electronic device does not calculate image noise outside the collection window. Power consumption is reduced by cyclically starting and stopping the memory write-back function. Because the noise interfering with the ambient light may be related to the image displayed on the screen at the moment collection starts, after the memory write-back function is started an image, i.e. the third image, may be forcibly refreshed to obtain the image currently displayed on the screen, from which the interfering noise is derived.
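The cyclic start/stop control can be summarized as a loop, as in the sketch below. The function names and both durations are assumptions for illustration; the patent does not fix the integration window or the period length.

```cpp
// Illustrative acquisition loop: write-back is on only while the sensor integrates.
#include <chrono>
#include <thread>

// Hypothetical stand-ins for the SCP/AP interaction; not real APIs.
void enableWriteBack()   { /* set the write-back flag to the second flag */ }
void forceRefreshFrame() { /* ask SurfaceFlinger to resend cached display parameters */ }
void disableWriteBack()  { /* set the write-back flag to the first flag */ }

void acquisitionLoop() {
    using namespace std::chrono_literals;
    for (;;) {
        enableWriteBack();                   // start CWB just before acquisition begins
        forceRefreshFrame();                 // capture the image currently on screen
        std::this_thread::sleep_for(350ms);  // assumed integration window
        disableWriteBack();                  // stop CWB once acquisition ends -> saves power
        std::this_thread::sleep_for(650ms);  // remainder of the assumed 1 s period
    }
}
```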
In one possible implementation manner of the second aspect, the method further includes:
in response to receiving the first information, the first processor sets, by an HWC module of the electronic device, a write-back flag to a first flag;
the first processor, in response to receiving a first image, ceasing to acquire a first target image from the first image includes:
in response to receiving the first image, the first processor queries, by the HWC module, the write-back flag and finds it to be the first flag;
the first processor sends, by the HWC module, the first image to a display subsystem of the electronic device based on the first flag;
the first processor stops storing, through the display subsystem, a second image that includes the first target image on the first image to a write-back memory of the electronic device, where the first target image is an image within the first region;
the method further includes:
in response to reaching the first time, the first processor sets, by the HWC module, the write-back flag to a second flag;
the first processor acquiring the third image and acquiring the second target image from the third image, the second target image being an image within the first region, includes:
the first processor obtains the third image through the HWC module;
the first processor queries, by the HWC module, the write-back flag and finds it to be the second flag;
the first processor sends, by the HWC module, the third image and second information to the display subsystem based on the second flag, the second information instructing the display subsystem to store, in the write-back memory of the electronic device, a fourth image that includes the second target image on the third image;
in response to receiving the third image and the second information, the first processor stores, through the display subsystem, the fourth image that includes the second target image on the third image to the write-back memory of the electronic device, where the second target image is an image within the first region;
the first processor retrieves the second target image from the write-back memory through the HWC module;
the method further includes:
the first processor sends, by the HWC module, the second target image to a noise algorithm library;
the first processor calculates, through the noise algorithm library, first image noise based on the second target image.
In a possible implementation manner of the second aspect, the first information includes a first duration, where the first duration is the duration for which the display subsystem stops storing images to the write-back memory; the first time is the moment at which the first duration has elapsed after the write-back flag was set to the first flag;
or the first information includes the first duration, a first value, and a second time, where the first duration is the duration for which the display subsystem stops storing images to the write-back memory, and the second time is the end time at which the ambient light sensor of the electronic device acquired the first value; the first time is the moment at which a second duration has elapsed after the write-back flag was set to the first flag, the second duration being the first duration minus a delay, and the delay being the time at which the HWC module received the first information minus the second time.
In one possible implementation of the second aspect, the first processor obtaining the third image through the HWC module includes:
the first processor sends a first signal to the SurfaceFlinger of the electronic device through the HWC module;
in response to receiving the first signal, the SurfaceFlinger acquires a cached first display parameter and sends it to the HWC module, where the first display parameter is the most recently cached of the display parameters cached by the SurfaceFlinger;
the HWC module derives the third image based on the first display parameter.
In one possible implementation manner of the second aspect, after the first processor sets the write-back flag to the second flag through the HWC module and before the first processor acquires the third image through the HWC module, the method further includes:
the first processor acquires, through the HWC module, the time at which an image was last refreshed on the electronic device;
if the time at which the electronic device last refreshed an image satisfies the first preset condition, the first processor acquires the third image through the HWC module.
In one possible implementation manner of the second aspect, after the first processor obtains, through the HWC module, the time at which an image was last refreshed on the electronic device, the method further includes:
if the time at which the electronic device last refreshed an image does not satisfy the first preset condition, the first processor waits, through the HWC module, for the SurfaceFlinger module of the electronic device to send a second display parameter.
In a possible implementation manner of the second aspect, if the time at which the electronic device last refreshed an image satisfies the first preset condition, the first processor acquiring the first image through the HWC module includes:
if the time at which the electronic device last refreshed an image satisfies the first preset condition, the first processor waits for a second duration through the HWC module;
if the HWC module does not receive a third display parameter sent by the SurfaceFlinger within the second duration, the first processor acquires the first image through the HWC module.
In one possible implementation manner of the second aspect, the method further includes:
if the HWC module receives a fourth display parameter sent by the SurfaceFlinger within the second duration, the first processor acquires, through the HWC module, a fifth image based on the fourth display parameter;
the first processor queries, via the HWC module, the write-back flag and finds it to be the second flag;
the first processor sends, by the HWC module, the fifth image and third information to the display subsystem based on the second flag;
in response to receiving the fifth image and the third information, the first processor stores, through the display subsystem, a sixth image that includes a third target image on the fifth image in the write-back memory of the electronic device, the third target image being an image within the first region;
the first processor retrieves the third target image from the write-back memory through the HWC module;
the first processor sends, by the HWC module, the third target image to the noise algorithm library of the electronic device;
the first processor calculates, through the noise algorithm library, second image noise based on the third target image.
In a possible implementation manner of the second aspect, the first information includes a first value and a second time, where the second time is the end time at which the ambient light sensor of the electronic device acquired the first value;
the time at which the electronic device last refreshed an image satisfying the first preset condition includes:
the time at which the electronic device last refreshed an image is later than the second time;
the time at which the electronic device last refreshed an image not satisfying the first preset condition includes:
the time at which the electronic device last refreshed an image is earlier than or equal to the second time;
or, the time at which the electronic device last refreshed an image satisfying the first preset condition includes:
a first difference between the time at which the electronic device last refreshed an image and the current time is smaller than a second difference between the second time and the current time;
the time at which the electronic device last refreshed an image not satisfying the first preset condition includes:
the first difference between the time at which the electronic device last refreshed an image and the current time is greater than or equal to the second difference between the second time and the current time;
or,
the time at which the electronic device last refreshed an image satisfying the first preset condition includes:
the difference between the time at which the electronic device last refreshed an image and the time at which the HWC module last obtained a target image is smaller than a first threshold, where the target image is the image displayed in the region of the display screen above the ambient light sensor of the electronic device;
the time at which the electronic device last refreshed an image not satisfying the first preset condition includes:
the difference between the time at which the electronic device last refreshed an image and the time at which the HWC module last obtained a target image is greater than or equal to the first threshold.
In one possible implementation manner of the second aspect, the method further includes:
after the first processor sets the write-back flag to the first flag via the HWC module, the first processor monitors, via the HWC module, whether data in a kernel node of the electronic device has changed, the kernel node storing a luminance value;
in response to detecting that the data in the kernel node of the electronic device has changed, the first processor obtains, by the HWC module, a first luminance value from the kernel node;
after the first processor acquires the first luminance value from the kernel node through the HWC module, in response to detecting that the data in the kernel node has changed again, the first processor acquires a second luminance value from the kernel node through the HWC module;
in response to reaching the first time, the first processor sends the second luminance value to the noise algorithm library through the HWC module.
In one possible implementation manner of the second aspect, the method further includes:
after the first processor sets the write-back flag to the second flag via the HWC module, the first processor monitors, via the HWC module, whether data in a kernel node of the electronic device has changed, the kernel node storing a luminance value;
in response to detecting that the data in the kernel node of the electronic device has changed, the first processor obtains, by the HWC module, a third luminance value from the kernel node;
the first processor sends, by the HWC module, the third luminance value to the noise algorithm library;
after the first processor sends the third luminance value to the noise algorithm library through the HWC module, in response to detecting that the data in the kernel node has changed again, the first processor obtains a fourth luminance value from the kernel node through the HWC module;
the first processor sends the fourth luminance value to the noise algorithm library through the HWC module.
In one possible implementation manner of the second aspect, the first processor calculating, through the noise algorithm library, the first image noise based on the second target image includes:
the first processor calculates, through the noise algorithm library, the first image noise based on the second target image and the second luminance value.
In one possible implementation of the second aspect, the first processor receiving the first image through the HWC module includes:
the first processor receives, through the HWC module, a fifth display parameter sent by the SurfaceFlinger module of the electronic device;
the first processor obtains, by the HWC module, the first image based on the fifth display parameter.
In a possible implementation manner of the second aspect, the first area is an area on a display screen of the electronic device, which is located above an ambient light sensor of the electronic device.
In one possible implementation manner of the second aspect, the first processor acquiring, through the HWC module, the time at which the electronic device last refreshed an image includes:
the first processor receives, through the HWC module, a sixth display parameter sent by the SurfaceFlinger module of the electronic device;
the first processor stores, by the HWC module, the time at which the HWC module received the sixth display parameter;
the first processor obtains, by the HWC module, the stored time at which the sixth display parameter was received, where that time is the most recent reception time of a display parameter stored by the HWC module before the time at which the electronic device last refreshed an image is obtained.
In one possible implementation manner of the second aspect, the first display parameter includes: the position, size, and color on the display screen of the electronic device of the interface to be composited into the third image, and a storage address.
In a third aspect, an electronic device is provided, comprising a processor configured to execute a computer program stored in a memory, to implement the method of any of the first aspect or the method of any of the second aspect of the present application.
In a fourth aspect, a chip system is provided, which includes a processor coupled to a memory, and the processor executes a computer program stored in the memory to implement the method of any one of the second aspect of the present application.
In a fifth aspect, there is provided a computer readable storage medium storing a computer program which, when executed by one or more processors, performs the method of any one of the first or second aspects of the present application.
In a sixth aspect, embodiments of the present application provide a computer program product, which, when run on an apparatus, causes the apparatus to perform the method of any one of the first aspect or the second aspect of the present application.
It is to be understood that, for the beneficial effects of the second aspect to the sixth aspect, reference may be made to the relevant description in the first aspect, and details are not described herein again.
Drawings
Fig. 1 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application;
Fig. 2 is a diagram illustrating a positional relationship between an ambient light sensor and a display screen in an electronic device according to an embodiment of the present application;
Fig. 3 is a diagram illustrating another positional relationship between an ambient light sensor and a display screen in an electronic device according to an embodiment of the present application;
Fig. 4 is a diagram illustrating another positional relationship between an ambient light sensor and a display screen in an electronic device according to an embodiment of the present application;
Fig. 5 is a positional relationship diagram of a target area on a display screen according to an embodiment of the present application;
Fig. 6 is a diagram illustrating a positional relationship between an ambient light sensor and a target area on a display screen according to an embodiment of the present application;
Fig. 7 is a technical architecture diagram of a method for detecting ambient light according to an embodiment of the present application;
Fig. 8 is a schematic diagram of an acquisition cycle in which the ambient light sensor acquires ambient light according to an embodiment of the present application;
Fig. 9 is a schematic diagram of the time nodes for image refresh and backlight adjustment during an acquisition cycle in the embodiment of Fig. 8;
Fig. 10 is a timing flow diagram of a method for detecting ambient light based on the technical architecture shown in Fig. 7;
Fig. 11 is a timing flow chart of the modules in the AP processor provided by the embodiment of the present application in the embodiment shown in Fig. 10;
Fig. 12 is a diagram of another technical architecture on which the method for detecting ambient light provided by embodiments of the present application relies;
Fig. 13 is another timing flow diagram of a method for detecting ambient light based on the technical architecture shown in Fig. 12;
Fig. 14 is a schematic diagram of calculating integral noise based on the image noise and backlight noise at each time node provided by the embodiment shown in Fig. 9;
Fig. 15 is a schematic diagram of the time nodes for image refreshing and backlight adjustment along the time axis in an acquisition period according to an embodiment of the present application;
Fig. 16 is a schematic diagram illustrating the calculation of integral noise based on the image noise and backlight noise at each time node provided by the embodiment shown in Fig. 15;
Fig. 17 is a schematic diagram of a start-stop scheme of the CWB write-back function according to an embodiment of the present application;
Fig. 18 is a flowchart illustrating the SCP processor transmitting the first information to the AP processor according to an embodiment of the present application;
Fig. 19 is a schematic diagram of a start-stop scheme of the CWB write-back function with forced image refresh according to an embodiment of the present application;
Fig. 20 is a schematic diagram illustrating events at various times in the start-stop scheme of the CWB write-back function provided by the embodiment shown in Fig. 19;
Fig. 21 is a schematic diagram of obtaining integral noise using the start-stop scheme of the CWB write-back function provided by the embodiments shown in Figs. 19 and 20;
Fig. 22 is a diagram illustrating a refresh state and an idle state of a display screen according to an embodiment of the present application;
Fig. 23 is a schematic diagram of a start-stop scheme of the CWB write-back function that forcibly refreshes an image when the display screen is in an idle state for a long time according to an embodiment of the present application;
Fig. 24 is a schematic diagram of a start-stop scheme using the forced image refresh shown in Fig. 23 according to an embodiment of the present application;
Fig. 25 is a schematic diagram of a start-stop scheme using the CWB write-back function shown in Fig. 17 according to an embodiment of the present application;
Fig. 26 is a schematic flowchart of another start-stop scheme of the CWB write-back function provided in an embodiment of the present application;
Fig. 27 is a diagram illustrating a method for determining whether to forcibly refresh an image according to the embodiment shown in Fig. 26;
Fig. 28 is a schematic diagram of obtaining integral noise using the start-stop scheme shown in Fig. 26 according to an embodiment of the present application;
Fig. 29 is a schematic diagram of obtaining integral noise using the start-stop scheme shown in Fig. 26 according to another embodiment of the present application;
Fig. 30 is a schematic diagram of obtaining integral noise using the start-stop scheme shown in Fig. 26 according to yet another embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that in the embodiments of the present application, "one or more" means one, two, or more than two; "and/or" describes the association relationship of the associated objects, indicating that three relationships may exist; for example, a and/or B, may represent: a alone, both A and B, and B alone, where A, B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," "fourth," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The noise monitoring method provided by the embodiments of the application is applicable to electronic devices with an OLED screen. The electronic device may be a tablet computer, a wearable device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or another electronic device. The embodiments of the present application do not limit the specific type of the electronic device.
Fig. 1 shows a schematic structural diagram of an electronic device. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a touch sensor 180K, an ambient light sensor 180L, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it can be called directly from memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area can store an operating system and an application program required by at least one function. The storage data area may store data created during use of the electronic device 100.
In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc.
In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide solutions for wireless communication, including 2G/3G/4G/5G, applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low-noise amplifier (LNA), and the like. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 may also amplify a signal modulated by the modem processor and convert it into an electromagnetic wave radiated through the antenna 1.
In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 may implement audio functions, such as music playing and recording, via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor.
The audio module 170 is used to convert digital audio signals into analog audio signals for output and also to convert analog audio inputs into digital audio signals. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic device 100 can play music or conduct a hands-free call through the speaker 170A.
The receiver 170B, also called the "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic device 100 receives a call or voice information, the user can listen to the voice by holding the receiver 170B close to the ear.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a sound signal by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to implement a noise reduction function in addition to collecting voice information. In other embodiments, the electronic device 100 may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on. For example, the microphone 170C may be used to collect voice information related to embodiments of the present application.
The headphone interface 170D is used to connect a wired headphone. The headphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a variety of types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A.
In some embodiments, touch operations that act on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed. A minimal sketch of this dispatch logic is shown below.
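The sketch below illustrates the pressure-threshold dispatch just described; the threshold value, type names, and action names are illustrative assumptions rather than anything specified by the embodiment.

```cpp
#include <iostream>

// Hypothetical actions triggered on the short message application icon.
enum class MessageAction { View, Create };

MessageAction dispatchTouch(double touchIntensity, double firstPressureThreshold) {
    // Below the first pressure threshold: view the short message;
    // at or above it: create a new short message.
    return (touchIntensity < firstPressureThreshold) ? MessageAction::View
                                                     : MessageAction::Create;
}

int main() {
    std::cout << (dispatchTouch(0.3, 0.5) == MessageAction::View) << '\n';   // prints 1
    std::cout << (dispatchTouch(0.8, 0.5) == MessageAction::Create) << '\n'; // prints 1
}
```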
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for image stabilization during photographing. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance the lens module needs to compensate according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through reverse movement, thereby achieving anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Features such as automatic unlocking upon flip-open can then be set according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes), and may detect the magnitude and direction of gravity when the electronic device 100 is stationary. It may also be used to recognize the posture of the electronic device, for applications such as landscape/portrait switching and pedometers.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, such as in a photographing scenario, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light to the outside through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the electronic device 100 at a different position than the display screen 194.
The ambient light sensor 180L is used to sense ambient light brightness. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive a key input and generate a key signal input related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may employ an organic light-emitting diode (OLED). In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the cards may be the same or different. The SIM card interface 195 is also compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calling and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The embodiment of the present application does not particularly limit the specific structure of the execution subject that monitors noise, as long as the execution subject can perform the noise monitoring method provided by the embodiment of the present application by running a program recording that method. For example, the execution subject of the noise monitoring method provided by the embodiment of the present application may be a functional module in the electronic device capable of calling and executing a program, or a communication apparatus, such as a chip, applied to the electronic device.
Fig. 2 is a front position relationship diagram of a display screen and an ambient light sensor in an electronic device according to an embodiment of the present application.
As shown in fig. 2, the projection of the ambient light sensor on the display screen of the electronic device is located at the upper half of the display screen of the electronic device. When a user holds the electronic equipment, the ambient light sensor positioned on the upper half part of the electronic equipment can detect the light intensity and the color temperature of the environment on the front side (the orientation of the display screen in the electronic equipment) of the electronic equipment, and the light intensity and the color temperature are used for adjusting the brightness and the color temperature of the display screen of the electronic equipment, so that a better visual effect can be achieved. For example, the display screen may not be too bright in dark environments to cause glare, and may not be too dark in bright environments to cause poor viewing.
Fig. 3 is a side view of the display screen and the ambient light sensor in the electronic device. From top to bottom, the display screen of the electronic device includes: a glass cover plate (light-transmitting), a display module, and a protective film; the terms "upper" and "lower" here describe the positional relationship when the display screen of the electronic device is placed facing upward. Because the ambient light sensor needs to collect the ambient light above the display screen of the electronic device, a portion of the display module in the display screen can be cut out and the ambient light sensor placed in that portion, which is equivalent to placing the ambient light sensor below the glass cover plate of the display screen, on the same layer as the display module. Note that the detection direction of the ambient light sensor coincides with the orientation of the display screen in the electronic device (upward in fig. 3). Obviously, this arrangement of the ambient light sensor sacrifices a portion of the display area. When a high screen-to-body ratio is pursued, this arrangement of the ambient light sensor is not applicable.
Fig. 4 shows another arrangement of the ambient light sensor provided in the embodiments of the present application, in which the ambient light sensor is moved from below the glass cover plate to below the display module. For example, the ambient light sensor is located below the active area (AA) of the OLED display module, the AA area being the area of the display module where image content can be displayed. This arrangement of the ambient light sensor does not sacrifice display area. However, the OLED screen is a self-luminous display screen: when the OLED screen displays an image, a user can see the image from above the display screen, and likewise, the ambient light sensor located below the OLED screen also collects light corresponding to the image displayed by the OLED screen. Therefore, the ambient light collected by the ambient light sensor includes both light emitted by the display screen and actual ambient light from the outside. If the actual external ambient light is to be accurately obtained, the light emitted by the display screen must be determined in addition to the ambient light collected by the ambient light sensor.
As can be understood from fig. 4, since the ambient light sensor is located below the AA area, the AA area in the display module is not sacrificed due to the arrangement of the ambient light sensor. Therefore, the projection of the ambient light sensor on the display screen can be located in any area of the front of the display screen, and is not limited to the following arrangement: the projection of the ambient light sensor on the display screen is located at the top of the front of the display screen.
Regardless of which region below the AA area the ambient light sensor is located in, the projected area of the ambient light sensor on the display screen is much smaller than the area of the display screen itself. Therefore, it is not the entire display screen whose emitted light interferes with the ambient light collected by the ambient light sensor; rather, the light emitted from the display area directly above the ambient light sensor, and from the display area within a certain range around it, interferes with the ambient light collected by the ambient light sensor.
As an example, the light sensing area of the ambient light sensor has a light sensing angle, and the ambient light sensor may receive light within the light sensing angle but not light outside the light sensing angle. In fig. 5, light emitted from point a above the ambient light sensor (within the sensing angle) and light emitted from point B above a certain range around the ambient light sensor (within the sensing angle) both interfere with the ambient light collected by the ambient light sensor. While the light emitted from point C (located outside the light sensing angle) farther away from the ambient light sensor in fig. 5 does not interfere with the ambient light collected by the ambient light sensor. For convenience of description, a display area of the display screen that interferes with the ambient light collected by the ambient light sensor may be referred to as a target area (the target area may be referred to as a first area). The location of the target area in the display screen is determined by the specific location of the ambient light sensor under the AA area. As an example, the target area may be a square area with a side length of a certain length (e.g., 80 microns, 90 microns, 100 microns) centered at a center point of the ambient light sensor. Of course, the target area may also be an area of other shape obtained by measurement that interferes with the light collected by the ambient light sensor.
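A minimal sketch of constructing such a square target area from the sensor's center point follows; the coordinate types, units (assumed to match the side length, e.g., microns), and names are illustrative assumptions.

```cpp
// Sensor center point and target-area rectangle, in screen coordinates.
struct Point { double x; double y; };
struct Rect { double left; double top; double right; double bottom; };

// Square target area of side `side` (e.g., 80, 90, or 100 microns) centered
// on the center point of the ambient light sensor's photosensitive area.
Rect targetAreaFromSensorCenter(Point center, double side) {
    const double half = side / 2.0;
    return Rect{center.x - half, center.y - half, center.x + half, center.y + half};
}
```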
As another example, fig. 6 is a schematic front view of an OLED screen of an electronic device provided in an embodiment of the present application, and as shown in fig. 6, the electronic device includes a housing, an OLED screen of the electronic device displays an interface, a corresponding area of the display interface in the display screen is an AA area, and an ambient light sensor is located behind the AA area. The center point of the target area coincides with the center point of the ambient light sensor.
It should be noted that the ambient light sensor is a discrete device; parts from different manufacturers may differ in external shape. The center point of the ambient light sensor in the embodiment of the present application is the center point of the photosensitive area where the ambient light sensor collects ambient light. In addition, the target area shown in FIG. 6 is larger than the projected area of the ambient light sensor on the OLED screen. In practical applications, the target area may also be smaller than or equal to the projected area of the ambient light sensor on the OLED screen; however, the target area is typically larger than the photosensitive area of the ambient light sensor. As mentioned above, the real external ambient light equals the ambient light collected by the ambient light sensor minus the light emitted by the display screen, and the light emitted by the display screen has been determined to be the light emitted by the target area. The emitted light of the target area is generated by the display content of the target area, and the interference of the display content with the ambient light collected by the ambient light sensor comes from two parts: the RGB pixel information of the displayed image and the brightness of the displayed image. It can be understood from the above analysis that the interference with the ambient light collected by the ambient light sensor is determined by the RGB pixel information of the image displayed in the target area and the luminance information of the target area. As an example, if the pixel value of a pixel is (r, g, b) and the luminance is L, the normalized luminance of the pixel is: (L×(r/255)^2.2, L×(g/255)^2.2, L×(b/255)^2.2).
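The normalized-luminance formula above can be expressed directly in code. The sketch below is a literal transcription of that formula; the function name and types are illustrative.

```cpp
#include <array>
#include <cmath>

// Per-pixel normalized luminance as given above: for a pixel value (r, g, b)
// in [0, 255] displayed at luminance L, each channel contributes
// L * (c / 255)^2.2. Returned as the (R, G, B) triple.
std::array<double, 3> normalizedLuminance(int r, int g, int b, double L) {
    auto channel = [L](int c) { return L * std::pow(c / 255.0, 2.2); };
    return {channel(r), channel(g), channel(b)};
}
```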
For convenience of description, an image corresponding to the target area may be denoted as a target image, and interference of RGB pixel information and luminance information of the target image on ambient light collected by the ambient light sensor may be denoted as fusion noise. The ambient light collected by the ambient light sensor can be recorded as initial ambient light, and the external real ambient light can be recorded as target ambient light.
From the above description it can be derived: the target ambient light is equal to the initial ambient light minus the fusion noise at each instant in the time period in which the initial ambient light was collected. In the embodiment of the present application, a process of calculating the fusion noise together according to the RGB pixel information and the luminance information is referred to as a noise fusion process.
When the display screen is in a display state, the RGB pixel information of the image displayed in the target area may change, and the brightness of the displayed image may also change. A change in either the RGB pixel information or the luminance information changes the fusion noise, so the fusion noise must be recalculated from the changed information (RGB pixel information or luminance information) whenever either changes. If the image of the target area remains unchanged for a long time, the fusion noise is calculated only when the brightness of the display screen changes. Therefore, in order to reduce the frequency of calculating the fusion noise, the target region may be chosen as a region in which the image displayed on the display screen changes infrequently, for example, the status bar area at the top of the front of the electronic device. The projection of the ambient light sensor on the display screen is located to the right in the status bar area of the display screen. Of course, the position of the ambient light sensor may also be to the left or in the middle of the status bar area; the embodiment of the present application does not limit the specific position of the ambient light sensor.
A technical architecture corresponding to the method for obtaining the target ambient light through the initial ambient light and the content displayed on the display screen provided by the embodiment of the present application will be described below by using fig. 7.
As shown in fig. 7, the processor in the electronic device is a multi-core processor, which at least includes: an AP (application processor) processor and an SCP (sensor co processor) processor. The AP processor is an application processor in the electronic device, and an operating system, a user interface and an application program are operated on the AP processor. The SCP processor is a co-processor that may assist the AP processor in performing events related to images, sensors (e.g., ambient light sensors), and the like.
Only the AP processor and SCP processor are shown in fig. 7. In practical applications, the multi-core processor may also include other processors. For example, when the electronic device is a mobile phone, the multi-core processor may further include a Baseband (BP) processor that runs mobile phone radio frequency communication control software and is responsible for sending and receiving data.
The AP processor in fig. 7 only shows the content related to the embodiment of the present application, and the implementation of the embodiment of the present application needs to rely on: an Application Layer (Application), a Java Framework Layer (Framework Java), a native Framework Layer (Framework native), a Hardware Abstraction Layer (HAL), a kernel Layer (kernel), and a Hardware Layer (Hardware).
The SCP processor in fig. 7 may be understood as a sensor control center (sensor hub) which can control the sensors and process data related to the sensors. The implementation of the embodiment of the present application needs to rely on: a co-application layer (Hub APK), a co-framework layer (Hub FWK), a co-driver layer (Hub DRV), and a co-hardware layer (Hub hardware).
Various applications exist in the application layer of the AP processor; application A and application B are shown in fig. 7. Taking application A as an example, after the user starts application A, the display screen will display the interface of application A. Specifically, application A sends the display parameters of the interface to be displayed (for example, the memory address, the color, and the like of the interface to be displayed) to the display engine service.
The display engine service in the AP processor sends the received display parameters of the interface to be displayed to the SurfaceFlinger in the native framework layer (Framework native) of the AP processor.
The SurfaceFlinger in the native framework layer (Framework native) of the AP processor is responsible for controlling interface (surface) fusion; as an example, it calculates the overlap region of at least two overlapping interfaces. The interface here may be an interface presented by the status bar, the system bar, the application itself (the interface to be displayed by application A), the wallpaper, the background, etc. Therefore, the SurfaceFlinger can obtain not only the display parameters of the interface to be displayed by application A, but also the display parameters of the other interfaces.
The hardware abstraction layer of the AP processor has the HWC (Hardware Composer HAL), a module for interface synthesis and display in the system that provides hardware support for the SurfaceFlinger service. In step A1, the SurfaceFlinger sends the display parameters of each interface (e.g., memory address, color, etc.) to the HWC through interfaces (e.g., setLayerBuffer, setLayerColor, etc.) for interface fusion. In practical applications, the display parameters may include: the position and size of each interface of the composite image on the display screen of the electronic device, the color, the memory address, and the like.
Generally, in image synthesis (e.g., when an electronic device displays an image, the status bar, the system bar, the application's own interface, and the wallpaper background need to be synthesized), the HWC obtains the synthesized image according to the display parameters of each interface through the hardware (e.g., a hardware compositor) underlying the HWC. The HWC in the hardware abstraction layer of the AP processor sends the image synthesized by the underlying hardware to the OLED driver, see step A2.
In practice, the HWC module may obtain the synthesized image based on the display parameters sent by the SurfaceFlinger in any manner.
The OLED driver in the kernel layer of the AP processor passes the synthesized image to the display subsystem (DSS) in the hardware layer of the AP processor, see step A3. The display subsystem (DSS) in the hardware layer of the AP processor may perform secondary processing on the synthesized image (e.g., HDR10 processing for enhancing image quality) and then send the secondary-processed image for display. In practical applications, the secondary processing may also be omitted. Taking the case without secondary processing as an example, the display subsystem of the AP processor hardware layer sends the synthesized image to the OLED screen for display.
Taking the startup of application A as an example, the synthesized image displayed by the OLED screen is the interface synthesized from the interface to be displayed by application A and the interface corresponding to the status bar.
In the above manner, the OLED screen completes one image refresh and display.
In the embodiment of the present application, before the secondary-processed image (or the synthesized image) is sent for display, the display subsystem (DSS) may be controlled to store the whole frame image (or an image larger than the target area within the whole frame image, or the image corresponding to the target area in the whole frame image) in a memory of the kernel layer of the AP processor. Since this process concurrently writes back image frame data, the memory may be recorded as the Concurrent Write-Back (CWB) memory, see step A4.
In the embodiment of the present application, for example, the display subsystem stores the whole frame image in the CWB memory of the AP processor, and after the display subsystem successfully stores the whole frame image in the CWB memory, the display subsystem may send a signal indicating that the storage is successful to the HWC. The whole frame image corresponding to the image stored in the CWB memory by the display subsystem may be recorded as an image to be refreshed (the image to be refreshed may also be understood as an image after the current refresh).
The AP processor may also be configured to allow the HWC to access the CWB memory. After receiving the signal, sent by the display subsystem, indicating that the storage was successful, the HWC may obtain the target image from the CWB memory, see step A5.
It should be noted that, regardless of whether the whole frame image or only a partial region of the whole frame image is stored in the CWB memory, the HWC can obtain the target image from the CWB memory. The process of the HWC obtaining the target image from the CWB memory may be recorded as the HWC matting from the CWB memory.
The range of the target image can be understood as the size delimited by the target image's length and width; the range of the image to be refreshed is the range of the whole frame image, which can likewise be delimited by length and width.
As an example, the size of the image to be refreshed is X1 × Y1 (pixels), the size of the target image is X2 × Y2 (pixels), and the size of the image stored in the CWB memory is X3 × Y3 (pixels), where X1 ≥ X3 ≥ X2 and Y1 ≥ Y3 ≥ Y2.
Of course, when X3 = X1 and Y3 = Y1, the image stored in the CWB memory is the whole frame image; when X3 = X2 and Y3 = Y2, the image stored in the CWB memory is the target image.
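A minimal sketch of the HWC matting step under these size constraints follows. It assumes a tightly packed 4-byte-per-pixel buffer layout, which is an assumption rather than anything the embodiment specifies; real CWB buffers may be strided or use another pixel format.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <vector>

// HWC "matting" from CWB memory (sketch): copy the X2 x Y2 target region out
// of the X3 x Y3 image stored in the CWB memory.
std::vector<std::uint8_t> cropTargetImage(const std::uint8_t* cwb, int x3, int y3,
                                          int left, int top, int x2, int y2) {
    assert(x3 >= x2 && y3 >= y2);               // stored image covers the target size
    assert(left + x2 <= x3 && top + y2 <= y3);  // target region lies inside it
    std::vector<std::uint8_t> target(static_cast<std::size_t>(x2) * y2 * 4);
    for (int row = 0; row < y2; ++row) {
        const std::uint8_t* src =
            cwb + (static_cast<std::size_t>(top + row) * x3 + left) * 4;
        std::memcpy(target.data() + static_cast<std::size_t>(row) * x2 * 4, src,
                    static_cast<std::size_t>(x2) * 4);
    }
    return target;
}
```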
Continuing with application A as an example, when application A needs a brightness adjustment due to an interface switch, application A sends the brightness to be adjusted to the display engine service.
And the display engine service in the AP processor sends the brightness to be adjusted to the kernel node in the kernel layer of the AP processor so as to enable related hardware to adjust the brightness of the OLED screen according to the brightness to be adjusted stored in the kernel node.
In the above manner, the OLED screen completes one brightness adjustment.
In the embodiment of the present application, the HWC may be further configured to obtain the brightness to be adjusted from the kernel node; the brightness to be adjusted may also be recorded as the brightness after the current adjustment, see step A5'.
In a specific implementation, the HWC may monitor, based on the uevent mechanism, whether the data stored in the kernel node changes, and after detecting a change, obtain the currently stored data, i.e., the brightness value to be adjusted (since this value is used to adjust the brightness of the display screen, it may also be recorded as the brightness value of the display screen), from the kernel node. After obtaining the target image or the brightness to be adjusted, the HWC may send it to the noise algorithm library in the hardware abstraction layer of the AP processor, see step A6. Each time the noise algorithm library obtains a target image, it calculates the fusion noise at that image's refresh time; each time it obtains a brightness, it calculates the fusion noise at that brightness adjustment time. The noise algorithm library stores the calculated fusion noise in its noise memory.
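As a rough illustration of the kernel-node read that follows a uevent notification, consider the sketch below. The node path is a hypothetical placeholder (the real kernel node name is platform-specific), and the uevent (netlink) listening itself is omitted.

```cpp
#include <fstream>
#include <string>

// Read the brightness value to adjust to from a kernel node after a uevent
// notification reports that the node's data changed. The caller supplies the
// platform-specific node path; "-1" signals a read failure.
int readBrightnessNode(const std::string& nodePath) {
    std::ifstream node(nodePath);
    int brightness = -1;
    node >> brightness;
    return brightness;
}
```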
In practical applications, after the HWC obtains the target image, the HWC may store the target image, and the HWC may send the storage address of the target image to the noise algorithm library, and the noise algorithm library may buffer the target image of a frame at the latest moment in a manner of recording the address. After the HWC obtains the brightness to be adjusted, the HWC may send the brightness to be adjusted to a noise algorithm library, which may buffer a brightness at the latest moment. For convenience of description, the subsequent embodiments of the present application are described in terms of sending the target image to the noise algorithm library by the HWC, and in practical applications, the HWC may obtain the target image and store the target image, and send a storage address of the target image to the noise algorithm library.
As an example, after receiving the storage address of the first frame target image, the noise algorithm library caches the storage address of the first frame target image. And then, each time a new storage address of the target image is received, the new storage address of the target image is used as the storage address of the latest target image cached. Correspondingly, the noise algorithm library buffers the first brightness after receiving the first brightness, and the new brightness is taken as the latest brightness buffered every time a new brightness is received. In the embodiment of the application, the noise algorithm library caches the acquired target image and brightness value in the data storage library. The target image and the luminance value stored in the data store may be recorded as screen data, i.e. the screen data stored in the data store includes: a target image and a luminance value.
In addition, in order to describe the transfer relationship of parameters such as the target image and the brightness to be adjusted, the embodiment of the present application describes the HWC as sending these parameters to the noise algorithm library. In practice, the relationship between the HWC and the noise algorithm library is that the HWC calls the noise algorithm library: when the HWC calls the noise algorithm library, it passes parameters such as the target image (the memory address of the target image) and the brightness to be adjusted into the noise algorithm library as arguments of the calculation models in the library. Other parameters are not exemplified one by one.
Because brightness adjustment and image refresh are two completely independent processes, an image may be refreshed at a certain time while the brightness remains unchanged. In that case, the fusion noise at that time is calculated using the target image corresponding to the refreshed image and the current brightness (among the brightness values stored in the noise algorithm library, the one stored most recently before the time indicated by the timestamp of the target image). For convenience of description, the fusion noise at an image refresh time calculated due to an image refresh may be recorded as the image noise at that image refresh time. Similarly, if the image is not refreshed at a certain time but the brightness is adjusted, the fusion noise at that time is calculated using the adjusted brightness and the current target image (the target image stored in the noise algorithm library most recently before the time indicated by the timestamp of the brightness value). For convenience of description, the fusion noise at a brightness adjustment time calculated due to a brightness adjustment may be recorded as the backlight noise at that brightness adjustment time.
The target image and the brightness sent by the HWC to the noise algorithm library are both time-stamped; correspondingly, the image noise and the backlight noise calculated by the noise algorithm library are also both time-stamped. The timestamp of the image noise is the same as the timestamp of the target image, and the timestamp of the backlight noise is the same as the timestamp of the brightness to be adjusted. Strictly speaking, the timestamp of the image noise should be the image refresh moment; in practical applications, another time node close to the image refresh time may be used instead, for example, the start time (or the end time, or any time between them) at which the HWC performs matting to obtain the target image from the CWB memory. Strictly speaking, the timestamp of the backlight noise should be the backlight adjustment moment; in practical applications, another time node close to the backlight adjustment time may also be used, for example, the start time (or the end time, or any time between them) at which the HWC obtains the brightness to be adjusted from the kernel node. The timestamps of the image noise and the backlight noise make it possible to denoise, over a time span, the initial ambient light collected by the ambient light sensor to obtain the target ambient light. The noise algorithm library stores the image noise and the backlight noise in the noise memory, storing the timestamp of the image noise together with the image noise and the timestamp of the backlight noise together with the backlight noise.
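Putting the last two paragraphs together, the sketch below shows one possible shape for the noise algorithm library's event handling: each image refresh or brightness change produces a timestamped fusion-noise sample computed from the newest counterpart value cached so far. All names are illustrative assumptions, and the first algorithm model is left as a stub.

```cpp
#include <cstdint>
#include <vector>

// Timestamped fusion-noise sample (image noise or backlight noise).
struct NoiseSample {
    std::int64_t timestampMs;
    double fusionNoise;
};

class NoiseLibrary {
public:
    // Called at an image refresh time; tsMs is the target image's timestamp.
    // The brightness in effect is the newest one stored before this timestamp.
    void onImageRefresh(const std::uint8_t* targetImageAddr, std::int64_t tsMs) {
        latestImage_ = targetImageAddr;
        noiseMemory_.push_back({tsMs, computeFusionNoise(latestImage_, latestBrightness_)});
    }

    // Called at a brightness adjustment time; tsMs is the brightness's timestamp.
    // The target image in effect is the newest one stored before this timestamp.
    void onBrightnessChange(double brightness, std::int64_t tsMs) {
        latestBrightness_ = brightness;
        noiseMemory_.push_back({tsMs, computeFusionNoise(latestImage_, latestBrightness_)});
    }

private:
    // Stub standing in for the first algorithm model, which fuses the target
    // image's RGB pixel information with the brightness.
    static double computeFusionNoise(const std::uint8_t* image, double brightness) {
        return image ? brightness : 0.0;
    }

    const std::uint8_t* latestImage_ = nullptr;  // latest target image (by address)
    double latestBrightness_ = 0.0;              // latest display brightness
    std::vector<NoiseSample> noiseMemory_;       // the noise memory, timestamped
};
```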
An ambient light sensor (ALS) in the co-hardware layer of the SCP processor collects initial ambient light at a certain collection period after start-up (typically, the ambient light sensor is started after the electronic device is powered on). The ambient light sensor of the SCP processor transmits the initial ambient light information to the ambient light sensor driver (ALS DRV) in the co-driver (Hub DRV) layer of the SCP processor, see step E2.
The initial ambient light information transmitted by the SCP processor to the AP processor includes a first value, a first time and a second time, where the first value can be understood as a raw value of the initial ambient light, the first time is an integration start time at which the ambient light sensor acquires the first value, and the second time is an integration end time at which the ambient light sensor acquires the first value.
In the co-driver (Hub DRV) layer of the SCP processor, the ambient light sensor driver (ALS DRV) preprocesses the initial ambient light information to obtain the raw values on the four RGBC channels. The co-driver layer of the SCP processor transmits the raw values on the four RGBC channels to the ambient light sensor application in the co-application layer of the SCP processor, see step E3.
The ambient light sensor application in the co-application layer of the SCP processor sends the raw values on the four RGBC channels and other relevant data (e.g., the start time and end time of each collection of initial ambient light by the ambient light sensor) to the HWC of the AP processor via a first inter-core communication (communication between the ambient light sensor application of the SCP processor and the HWC of the AP processor), see step E4.
After the HWC in the AP processor obtains the initial ambient light data reported by the SCP processor, it may send the initial ambient light data to the noise algorithm library, see step A6.
As described above, the noise algorithm library calculates the image noise at each image refresh time and the backlight noise at each brightness adjustment time, and stores the calculated image noise and backlight noise in the noise memory of the noise algorithm library. After the acquisition start time and acquisition end time of the initial ambient light are obtained, the integral noise between those two times can be obtained from the image noise and backlight noise stored in the noise memory. The noise algorithm library then subtracts the integral noise between the acquisition start time and the acquisition end time from the initial ambient light to obtain the target ambient light.
As can be understood from the above description of the noise algorithm library, the noise algorithm library includes various calculation models: a first algorithm model, used to calculate the fusion noise from the target image and the brightness; a second algorithm model, used to obtain the integral noise between the acquisition start time and the acquisition end time of the initial ambient light from the fusion noise at each moment; and a third algorithm model, used to obtain the target ambient light from the initial ambient light and the integral noise. In practical applications, the noise algorithm library may further include other calculation models; for example, if the raw values on the four channels of the initial ambient light are filtered in the process of obtaining the target ambient light based on the target image, the brightness, and the initial ambient light, there is a model for filtering those raw values. The embodiment of the present application does not enumerate them all.
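A minimal sketch of the second and third algorithm models follows. It assumes the fusion noise is piecewise constant between timestamped samples, so the integral noise over an integration window is the time-weighted average of the samples in effect during that window; this piecewise-constant reading is our assumption, since the embodiment does not spell out the integration rule. All names are illustrative.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

struct NoiseSample {
    std::int64_t timestampMs;
    double fusionNoise;
};

// Second algorithm model (sketch): average the piecewise-constant fusion
// noise over the integration window [startMs, endMs]. Each sample contributes
// in proportion to how long it stays in effect within the window; any
// interval before the first sample counts as zero noise. Requires samples
// sorted by timestamp and endMs > startMs.
double integralNoise(const std::vector<NoiseSample>& samples,
                     std::int64_t startMs, std::int64_t endMs) {
    double weighted = 0.0;
    for (std::size_t i = 0; i < samples.size(); ++i) {
        if (samples[i].timestampMs >= endMs) break;
        const std::int64_t from = std::max(samples[i].timestampMs, startMs);
        const std::int64_t to = (i + 1 < samples.size())
                                    ? std::min(samples[i + 1].timestampMs, endMs)
                                    : endMs;
        if (to > from) weighted += samples[i].fusionNoise * static_cast<double>(to - from);
    }
    return weighted / static_cast<double>(endMs - startMs);
}

// Third algorithm model (sketch): the target ambient light is the initial
// ambient light minus the integral noise over the acquisition window.
double targetAmbientLight(double initialAmbientRaw, double integral) {
    return initialAmbientRaw - integral;
}
```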
The inputs to the noise algorithm library include: the target image and brightness acquired by the HWC at various times, and the initial-ambient-light-related data acquired by the HWC from the SCP processor. The output of the noise algorithm library is the raw value of the target ambient light, which may be recorded as the second value. In the embodiment of the present application, the process of the HWC sending the target image, the brightness, and the initial ambient light to the noise algorithm library is denoted as step A6.
After obtaining the target ambient light, the noise algorithm library also needs to return it to the HWC; this process is denoted as step A7. In practical applications, the output of the noise algorithm library is the raw values on the four channels of the target ambient light.
The HWC in the AP processor sends the raw values on the four channels of the target ambient light returned by the noise algorithm library to the ambient light sensor application in the co-application layer of the SCP processor via first inter-core communication, see step A8.
After the ambient light sensor application in the co-application layer of the SCP processor obtains the raw values on the four channels of the target ambient light, the raw values are stored in the ambient light memory of the co-driver layer, see step E5.
The co-driver layer of the SCP processor is provided with a calculation module that obtains the raw values on the four channels of the target ambient light from the memory, see step E6. When each integration ends, the ambient light sensor generates an integration interrupt signal and sends it to the ambient light sensor driver; the ambient light sensor driver then invokes the calculation module, triggering it to obtain the raw values on the four channels of the target ambient light from the memory.
The ambient light sensor driver triggers the calculation module to acquire the raw value of the target ambient light after each integration ends, so what is acquired at that moment is the raw value of the target ambient light from the previous integration period.
Taking the embodiment shown in FIG. 8 as an example, after the integration ends at time t1, the ambient light sensor obtains the initial ambient light from time t0 to time t1. The SCP processor sends the initial ambient light from t0 to t1 to the AP processor, and the AP processor calculates the raw value of the target ambient light from t0 to t1. The AP processor sends the raw value of the target ambient light from t0 to t1 to the SCP processor, and the SCP processor stores it in the memory of the SCP processor.
After the integration ends at time t3, the ambient light sensor obtains the initial ambient light from time t2 to time t3, and the SCP processor sends it to the AP processor. Each time an integration of the ambient light sensor ends, an integration interrupt signal is generated; the ambient light sensor sends it to the ambient light sensor driver, and the driver invokes the calculation module, triggering it to obtain the currently stored raw value of the target ambient light from t0 to t1 from the memory. Since this happens after time t3, the calculation module calculates, after time t3, the lux value of the target ambient light from the obtained raw value of the target ambient light from t0 to t1. That is, the lux value of the target ambient light that the SCP processor calculates during period T2 is taken as the lux value of the real ambient light in period T1.
As previously mentioned, the ambient light sensor in the SCP processor generates an integration interrupt signal after the integration ends (time t3) and gives it to the ambient light sensor driver, and after time t3 the initial ambient light of period T2 is sent to the AP processor. After the AP processor calculates the target ambient light, it sends the target ambient light to the SCP processor, which stores the target ambient light of period T2 in memory. If the SCP processor were to calculate the lux value using the raw value of the target ambient light of period T2, it would have to wait, from the moment the ambient light sensor driver receives the integration interrupt signal, until the AP processor had transmitted the target ambient light to the memory of the SCP processor, before the ambient light sensor driver could invoke the calculation module to retrieve the raw value of the target ambient light of period T2 from memory. The waiting time includes at least: the time for the SCP processor to transmit the initial ambient light to the AP processor, the time for the AP processor to calculate the target ambient light based on the initial ambient light and other related data, and the time for the AP processor to transmit the target ambient light to the memory of the SCP processor; this total time is relatively long and not fixed. Therefore, the ambient light sensor driver in the SCP processor may be configured to invoke the calculation module to fetch the raw value of the target ambient light of the previous period from memory after receiving the integration interrupt signal of the following acquisition period, so as to calculate the lux value from the raw value of the target ambient light of the previous period. The lux value of the target ambient light may be recorded as a third value; the third value and the second value are the lux value and the raw value of the same target ambient light.
Taking the collection period shown in fig. 8 as an example, the raw value of the initial ambient light collected in acquisition period T1 is the first value. The raw value of the target ambient light corresponding to acquisition period T1, obtained from the raw value of the initial ambient light acquired during acquisition period T1, is the second value. The lux value of the target ambient light corresponding to acquisition period T1, obtained from the raw value of the target ambient light corresponding to acquisition period T1, is the third value. The raw value of the initial ambient light acquired during acquisition period T2 may be recorded as the fourth value. The fourth value is the initial ambient light acquired in the acquisition period following the acquisition period corresponding to the first value (or the acquisition period corresponding to the second value, or the acquisition period corresponding to the third value). A sketch of this one-period pipelining follows.
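The sketch below illustrates the pipelining just described: on each integration interrupt, the lux value is computed from the previous period's target-ambient-light raw value already written back by the AP processor. The class and method names are illustrative assumptions, and the raw-to-lux conversion is left as a stub since it is device-specific and not given here.

```cpp
#include <optional>

class ScpLuxPipeline {
public:
    // Invoked by the ambient light sensor driver on each integration interrupt:
    // compute the lux value from the previous period's target ambient light.
    void onIntegrationInterrupt() {
        if (storedRaw_) {
            lastLux_ = rawToLux(*storedRaw_);
        }
    }

    // Invoked when the AP processor stores a period's target-ambient-light raw
    // value into the SCP-side memory.
    void onTargetAmbientLightStored(double raw) { storedRaw_ = raw; }

    std::optional<double> lastLux() const { return lastLux_; }

private:
    static double rawToLux(double raw) { return raw; }  // stub conversion
    std::optional<double> storedRaw_;  // previous period's target ambient light
    std::optional<double> lastLux_;    // most recently computed lux value
};
```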
A calculation module in the co-driver layer of the SCP processor obtains the lux value of the target ambient light from the raw values on the four channels of the target ambient light. The calculation module in the SCP processor then sends the calculated lux value of the target ambient light to the ambient light sensor application of the co-application layer through the interface of the co-framework layer, see steps E7 and E8.
The ambient light sensor application of the co-application layer in the SCP processor transmits the lux value of the target ambient light to the light service (light service) of the raw framework layer in the AP processor through the second inter-core communication (communication of the SCP processor to the light service of the AP processor), see step E9.
A light service (light service) may send the lux value of the target ambient light to the display engine service. The display engine service may send the lux value of the target ambient light to the upper layer to facilitate an application in the application layer to determine whether to adjust the brightness. The display engine service can also send the lux value of the target ambient light to the kernel node so as to enable related hardware to adjust the brightness of the display screen according to the lux value of the target ambient light stored by the kernel node.
After describing the technical architecture on which the method of obtaining the target ambient light depends, the process of obtaining the target ambient light from the target image, the brightness, and the initial ambient light collected by the ambient light sensor will be described from the perspective of the collection period of the ambient light sensor.
As can be understood from the above examples, the target image and the brightness to be adjusted are both obtained by the HWC; therefore, the HWC obtains the target image and the brightness to be adjusted in a sequential order. After acquiring the target image or the brightness to be adjusted, the HWC sends it to the noise algorithm library, so the sending also occurs in a sequential order, and, correspondingly, the times at which the noise algorithm library receives the target image and the brightness to be adjusted are likewise ordered. However, even so, the timestamps of the target image and the brightness to be adjusted may be the same, since the HWC may acquire both within the same unit at the time-measurement granularity used. As an example, within the same millisecond (the 5th millisecond), the HWC first acquires the brightness to be adjusted and then acquires the target image. Although the HWC's execution is ordered, the timestamps of the target image and the brightness to be adjusted are both the 5th millisecond.
Referring to fig. 8, the ambient light sensor collects ambient light periodically: from t0 to t2 (acquisition period T1), from t2 to t4 (acquisition period T2), and from t4 to t6 (acquisition period T3) are each one acquisition period. In acquisition period T1, the real acquisition time of the ambient light sensor is from t0 to t1; from t1 to t2 the ambient light sensor may be in a sleep state. The embodiment of the present application is described taking as an example that the collection period of the ambient light is fixed (i.e., the values of T1, T2, and T3 are the same) and the duration of the integration period is fixed.
As an example, 350 ms (t2 − t0) may be taken as one acquisition period. If the actual acquisition time of the ambient light sensor within one acquisition period is 50 ms (t1 − t0), the ambient light sensor is in a dormant state for 300 ms (t2 − t1) of each acquisition period. The values 350 ms, 50 ms, and 300 ms are examples only and are not limiting.
For ease of description, the time period during which the ambient light sensor actually collects (e.g., t0 to t1) may be recorded as an integration period, and the time period during which the ambient light sensor does not collect (e.g., t1 to t2) as a non-integration period.
The image displayed on the display screen of the electronic device is refreshed at a certain frequency. Taking 60 Hz as an example, the display screen of the electronic device is refreshed 60 times per second, i.e., the image is refreshed every 16.7 ms. When the display screen of the electronic device displays images, image refreshes also occur during the acquisition period of the ambient light sensor. When the image displayed on the display screen is refreshed, the AP processor performs steps A1 to A6 (sending the target image) in the technical architecture shown in fig. 7. The HWC in the AP processor controls CWB write-back continuously from time t0, i.e., the above steps are repeated as long as there is an image refresh.
Note that, in the present embodiment, a refresh rate of 60Hz is taken as an example. In practice, the refresh rate may be 120Hz or other refresh rates. In the embodiment of the present application, the above-mentioned steps a1 to a6 (transmission target image) need to be repeatedly executed every refresh frame, and in practical applications, the above-mentioned steps a1 to a6 (transmission target image) can also be repeatedly executed every other frame (or two frames, etc.).
The brightness adjustment does not have a fixed periodicity, so the brightness adjustment may also occur during the acquisition period of the ambient light sensor. When the brightness is adjusted, the HWC also performs steps a 5' to a6 (sending the brightness to be adjusted) in the technical architecture shown in fig. 7.
After each integration of the ambient light sensor ends (i.e., after t1, after t3, after t5, and so on), the SCP processor reports the data of the initial ambient light collected in the current integration process (for example, the raw values on the four channels of the initial ambient light and the integration start time and integration end time of the current integration) to the HWC of the AP processor. The HWC of the AP processor sends the data related to the initial ambient light to the noise algorithm library, and the target ambient light is obtained through calculation by the noise algorithm library.
Referring to fig. 9, taking one acquisition cycle as an example: t01 (the same as t0), t03, t04, and t11 are image refresh times, and t02 and t12 are brightness adjustment times. Thus, the AP processor can calculate in real time the image noise at t01, the backlight noise at t02, the image noise at t03, the image noise at t04, the image noise at t11, and the backlight noise at t12. At the end of this integration (time t1), the noise memory of the AP processor stores: the image noise at t01, the backlight noise at t02, the image noise at t03, and the image noise at t04.
At the end of this integration (time t1), the ambient light sensor obtains the initial ambient light of this integration and the integration time period. The SCP processor reports the data of the initial ambient light to the AP processor, and the noise calculation module in the AP processor obtains, from the noise memory according to the start time and end time of the current integration time period, the image noise at t01, the backlight noise at t02, the image noise at t03, and the image noise at t04. The noise algorithm library then calculates the target ambient light from the initial ambient light collected in the integration time period and the image noise and backlight noise that affect the integration time period.
During the non-integration period (t1 to t2), since the HWC continues to control CWB write-back, the HWC also performs matting on the image refreshed at t11 to obtain a target image, and the noise algorithm library also calculates the image noise at t11. The brightness changes at t12 within the non-integration period, so the noise algorithm library also calculates the backlight noise at t12. However, when the target ambient light is calculated, the required fusion noise is the fusion noise that interferes with the initial ambient light obtained in the current integration period; therefore, the target ambient light of the current integration time period can be obtained without the image noise at t11 and the backlight noise at t12. In practical applications, after calculating the image noise at t11 and the backlight noise at t12, the noise algorithm library still needs to store them in the noise memory.
The above example describes the process of acquiring the target ambient light from the perspective of the technical architecture based on fig. 7 and from the perspective of the acquisition period of the ambient light sensor based on fig. 9, respectively. A timing sequence for acquiring the target ambient light provided by the embodiment shown in fig. 10 will be described in conjunction with the technical architecture shown in fig. 7 and one acquisition cycle of the ambient light sensor shown in fig. 9.
It can be understood from the above description that the process of the image refreshing triggering the AP processor to calculate the image noise, the process of the brightness adjusting triggering the AP processor to calculate the backlight noise, and the process of the SCP processor controlling the underlying hardware ambient light sensor to collect the initial ambient light are performed independently, and there is no chronological order. And the noise calculation library of the AP processor processes the target image, the brightness and the initial ambient light obtained in the three independent processes to obtain the target ambient light.
The same reference numbers for steps in the embodiment of fig. 10 and steps in the technical architecture of fig. 7 indicate that the same steps are performed. In order to avoid repetitive description, the contents detailed in the embodiment shown in fig. 7 will be briefly described in the embodiment shown in fig. 10.
In connection with fig. 9, the image is refreshed starting from time t0. At the same time, the ambient light sensor enters an integration period and begins to collect the initial ambient light.
Accordingly, in fig. 10, in step E1, the ambient light sensor in the co-hardware layer of the SCP processor enters an integration period and collects the initial ambient light starting from time t0 (t01).
In step A1, the image is refreshed at time t0 (t01), and the SurfaceFlinger in the native framework layer of the AP processor sends the display parameters of the interfaces to the HWC in the hardware abstraction layer of the AP processor. The HWC may send the received display parameters of each layer interface to the hardware at the bottom of the HWC, which synthesizes the image from the layer interfaces according to their display parameters. The hardware underlying the HWC returns the synthesized image to the HWC.
In step A2, the HWC in the hardware abstraction layer of the AP processor sends the synthesized image to the OLED driver in the kernel layer of the AP processor. In step A3, the OLED driver in the kernel layer of the AP processor sends the synthesized image to the display subsystem of the hardware layer of the AP processor.
In step a4, the display subsystem in the hardware layer of the AP processor stores the image before display in the CWB memory in the kernel layer of the AP processor.
In the embodiment of the present application, the HWC waits for a successful store signal from the display subsystem after sending the synthesized image to the OLED driver.
The display subsystem will send a signal to the HWC indicating that the pre-display image was successfully stored in the CWB memory. After receiving the signal that the display subsystem stored successfully, the HWC performs a matting operation on the pre-display image stored in the CWB memory in the kernel layer to obtain the target image.
In step A5, the HWC in the hardware abstraction layer of the AP processor extracts the target image from the pre-display image stored in the CWB memory in the kernel layer.
In step A6, after the HWC in the hardware abstraction layer of the AP processor obtains the target image, the target image is sent to the noise algorithm library of the same layer; after receiving the target image, the noise algorithm library calculates the image noise at t01 from the target image and the cached current brightness information. During the execution of steps A1 through A6, the ambient light sensor in the co-hardware layer of the SCP processor is always in the integration process of one acquisition period.
In conjunction with fig. 9, at time t02 the ambient light sensor is still in the integration period, collecting the initial ambient light. At t02, the brightness of the display screen changes, triggering the execution of step B1 in fig. 10.
In fig. 10, in step B1 (step A5' in the architecture shown in fig. 7), the HWC of the hardware abstraction layer of the AP processor obtains the luminance information at t02 from a kernel node in the kernel layer of the AP processor.
In step B2 (step A6), the HWC of the hardware abstraction layer of the AP processor sends the brightness information at t02 to the noise algorithm library, which calculates the backlight noise at t02 from the brightness information at t02 and the cached currently displayed target image.
During the execution of steps B1 through B2, the ambient light sensor in the co-hardware layer of the SCP processor is always in the integration process for one acquisition period.
After step B2, the noise memory of the noise algorithm library stores the image noise at t01 and the backlight noise at t02.
In conjunction with fig. 9, at time t03 the ambient light sensor is still in the integration period, collecting the initial ambient light. At t03, the image is refreshed once.
In fig. 10, since the image is refreshed, steps C1 to C6 are performed; for steps C1 to C6, reference may be made to the descriptions of A1 to A6, which are not repeated herein.
During the execution of steps C1 through C6, the ambient light sensor in the co-hardware layer of the SCP processor is still in the integration process for one acquisition period.
After step C6, the noise memory of the noise algorithm library stores the image noise at t01, the backlight noise at t02, and the image noise at t03.
Referring to fig. 9, at time t04 the ambient light sensor is still in the integration period, collecting the initial ambient light. At t04, the image is refreshed once.
In fig. 10, since the image is refreshed, steps D1 to D6 are performed; for steps D1 to D6, reference may be made to the descriptions of A1 to A6, which are not repeated herein.
During the execution of steps D1 through D6, the ambient light sensor in the co-hardware layer of the SCP processor is still in the integration process for one acquisition period.
After step D6, the noise memory of the noise algorithm library stores the image noise at t01, the backlight noise at t02, the image noise at t03, and the image noise at t04.
In conjunction with fig. 9, at time t1 the current integration of the ambient light sensor ends. At the end of the integration (time t1), the ambient light sensor obtains the initial ambient light, and in fig. 10 the SCP processor starts to perform steps E2, E3, and E4, reporting the data of the initial ambient light (the raw values on the RGBC four channels, the integration start time, and the integration end time) to the HWC of the hardware abstraction layer of the AP processor.
In conjunction with fig. 9, during the non-integration period the image may also be refreshed (e.g., the image refresh at t11) and the brightness may also change (e.g., the brightness change at t12). Therefore, in the non-integration period, steps F1 to F6 still exist in fig. 10 (steps F1 to F5 are omitted in fig. 10; reference may be made to steps A1 to A5), so that the image noise at t11 is stored in the noise memory of the noise algorithm library. In the non-integration period, steps G1 to G2 still exist (step G1 is omitted in fig. 10; reference may be made to step B1), so that the backlight noise at t12 is stored in the noise memory of the noise algorithm library.
In step A6', the HWC in the hardware abstraction layer of the AP processor sends the data of the initial ambient light to the noise algorithm library. The noise algorithm library calculates the target ambient light from the data of the initial ambient light and the image noise and backlight noise that interfere with the initial ambient light.
As can be understood from fig. 10, the integration start time and integration end time of the ambient light sensor are controlled by the clock corresponding to the ambient light sensor; the process in which the AP processor calculates the image noise is driven by image refreshes; and the process in which the AP processor calculates the backlight noise is driven by backlight adjustments. Therefore, the execution of step A1 (or step C1, step D1, step F1) is triggered by an image refresh, and the execution of step B1 (or step G1) is triggered by a brightness adjustment. The integration start time and integration end time of the ambient light sensor follow entirely the preset acquisition period and the duration of each integration. Thus, the execution of step E2 is triggered by the event that the integration of the ambient light sensor ends.
From the triggering event perspective, these three processes are completely independent. However, the results obtained by the three processes (image noise, backlight noise, and initial ambient light) are correlated by the denoising process after the ambient light sensor integration period is over. The initial ambient light fused in the denoising process is the initial ambient light collected by the ambient light sensor in the current collection period, and the image noise and the backlight noise removed in the denoising process are image noise and backlight noise which can cause interference on the initial ambient light collected in the current collection period.
By analyzing the structure of the under-screen ambient light, the embodiment of the application finds that the factors disturbing the ambient light collected by the ambient light sensor include the display content of the display area directly above the photosensitive area of the ambient light sensor and of the display area directly above a certain area around the photosensitive area. The display content is divided into two parts: the RGB pixel information and the luminance information of the displayed image. Therefore, the noise calculation library in the embodiment of the present application obtains the fusion noise by fusing the RGB pixel information and the luminance information of the target image, then obtains the integral noise of the integration period of the initial ambient light from the fusion noise, and obtains the target ambient light by removing, from the initial ambient light obtained in the integration period of the ambient light sensor, the integral noise that interferes with it. Because the interfering part is removed, an accurate target ambient light can be obtained, and the method has strong universality.
In addition, since the AP processor of the electronic device can obtain the target image and the luminance information, the AP processor accordingly obtains the image noise and the backlight noise, while the SCP processor can obtain the initial ambient light. Thus, the SCP processor may send the initial ambient light to the AP processor, and the AP processor processes the initial ambient light and the fusion noise to obtain the target ambient light. This avoids the problem of excessively frequent inter-core communication and high power consumption that would arise if the AP processor frequently sent the target image (or image noise) and brightness information (or backlight noise) to the SCP processor.
Furthermore, the DSS in the AP processor may store the pre-display image (the image to be displayed in the current refresh of the display screen) in the CWB memory. The HWC in the AP processor mattes the target image from the pre-display image stored in the CWB memory, so that the fusion noise is obtained by calculation; the fusion noise obtained in this way is accurate and the power consumption is low.
It should be noted that, when the display screen displays an image, the brightness of the display screen needs to be adjusted according to the target ambient light; when the display screen does not display any image, there is no need to adjust the brightness of the display screen according to the target ambient light. Therefore, the AP processor also needs to monitor the screen-on and screen-off events of the display screen. When the screen is on, the ambient light detection method provided by the embodiment of the application is executed. When the screen is off, the AP processor may not perform steps A4 through A6. Similarly, the SCP processor may also control the ambient light sensor to stop collecting the initial ambient light when the screen is off, and the SCP processor may not perform steps E2 to E5.
To provide a clearer understanding of the execution inside the AP processor, a timing diagram between the various modules inside the AP processor is described, taking as an example the obtaining of the image noise at t01 and the backlight noise at t02 in the embodiment shown in fig. 10.
In the embodiment shown in fig. 11, when refreshing the image, the respective modules in the AP processor perform the following steps:
In step 1100, after the display engine service obtains the display parameters of the interface to be displayed from the application in the application layer, the display engine service sends the display parameters of the interface to be displayed to the SurfaceFlinger.
In step 1101, after obtaining the display parameters of the interface to be displayed of application A from the display engine service, the SurfaceFlinger sends the display parameters (e.g., memory address, color, etc.) of each interface (the interface to be displayed of application A, the status bar interface, etc.) to the HWC through the interfaces (e.g., setLayerBuffer, setLayerColor).
In step 1102, after the HWC receives the display parameters of each interface, the hardware on the bottom layer of the HWC obtains a synthesized image according to the display parameters of the interface to be displayed.
In step 1103, the HWC obtains the image synthesized by the underlying hardware and sends the synthesized image to the OLED driver.
In step 1104, after receiving the synthesized image sent by the HWC, the OLED driver sends the synthesized image to the display subsystem.
In step 1105, after the display subsystem receives the synthesized image, it performs secondary processing on the synthesized image to obtain the pre-display image.
In step 1106, the display subsystem stores the pre-display image in the CWB memory.
It should be noted that, since the OLED screen needs to refresh the image, the display subsystem needs to send the pre-display image to the display screen for display.
In the embodiment of the application, the step in which the display subsystem sends the pre-display image to the display screen and the step in which the display subsystem stores the pre-display image in the CWB memory are two independent steps with no strict order between them.
In step 1107, after the display subsystem successfully stores the pre-display image in the CWB memory, it may send a signal to the HWC that the storage was successful.
In step 1108, after receiving the signal that the storage is successful, the HWC performs matting to obtain the target image from the image before display stored in the CWB memory, and the time when the HWC starts to obtain the target image is used as the timestamp of the target image.
In step 1109, the HWC sends the target image and the timestamp to the noise algorithm library after acquiring the target image and the timestamp.
In step 1110, the noise algorithm library calculates the image noise at the refresh time corresponding to the target image (the image noise at t01). The timestamp of the image noise is the timestamp of the target image from which the image noise is obtained. The noise algorithm library stores the image noise and the timestamp of the image noise.
During brightness adjustment, each submodule in the AP processor performs the following steps:
and 1111, after the display engine service obtains the brightness to be adjusted from the application a in the application layer, the display engine service sends the brightness to be adjusted to the kernel node.
In step 1112, after detecting that the data in the kernel node has changed, the HWC acquires the brightness to be adjusted from the kernel node. The time at which the HWC starts to retrieve the brightness to be adjusted from the kernel node is the timestamp of the brightness to be adjusted.
In practical applications, the HWC always listens to the kernel node for data changes.
In step 1113, the HWC sends the brightness to be adjusted and the timestamp of the brightness to be adjusted to the noise algorithm library.
In step 1114, the noise algorithm library calculates the backlight noise at the adjustment time of the brightness to be adjusted (the backlight noise at t02). The timestamp of the backlight noise is the timestamp of the brightness to be adjusted from which the backlight noise is obtained. The noise algorithm library stores the backlight noise and the timestamp of the backlight noise.
After the end of an integration period, the SCP processor sends the initial ambient light collected during the integration period to the HWC in the AP processor.
In step 1115, the HWC of the AP processor receives the initial ambient light sent by the SCP processor and the integration start time and the integration end time of the initial ambient light.
In step 1116, after the HWC receives the initial ambient light sent by the SCP processor and the integration start time and the integration end time of the initial ambient light, the HWC sends the initial ambient light and the integration start time and the integration end time of the initial ambient light to the noise algorithm library.
In step 1117, the noise algorithm library calculates the integral noise according to the image noise and its timestamp, the backlight noise and its timestamp, and the integration start time and integration end time of the initial ambient light. The noise algorithm library then calculates the target ambient light from the integral noise and the initial ambient light.
The embodiment of the application mainly describes a sequential logic diagram among modules when the AP processor obtains target ambient light.
In the above embodiments, the example is that after the AP processor acquires the target image and the luminance information, the AP processor calculates the fusion noise, and after the SCP processor acquires the initial ambient light, the SCP processor sends the initial ambient light to the AP processor, and the AP processor processes the fusion noise to acquire the integral noise of the integral time period of the initial ambient light, and then acquires the target ambient light according to the initial ambient light and the integral noise.
In practical application, the AP processor may also send the target image and the brightness information to the SCP processor after obtaining the target image and the brightness information. The SCP processor fuses the target image and the brightness information to obtain fusion noise and integral noise of an integral time period of the initial ambient light, and then obtains the target ambient light according to the fusion noise and the initial ambient light.
Alternatively, in practical applications, after acquiring the target image and the brightness information, the AP processor may calculate the fusion noise and send the calculated fusion noise to the SCP processor.
Referring to fig. 12, in this embodiment of the present application, the fusion noise is calculated at the AP processor; the integral noise is calculated at the SCP processor, and the target ambient light is obtained from the integral noise and the initial ambient light.
As mentioned above, the process of obtaining the target ambient light can be briefly described as follows:
Step 1, calculating image noise according to the target image.
Step 2, calculating backlight noise according to the brightness.
Step 3, calculating the target ambient light (raw values on the four channels) according to the image noise, the backlight noise, and the initial ambient light.
In the technical architecture shown in fig. 7, step 3, calculating the target ambient light according to the image noise, the backlight noise, and the initial ambient light, is implemented in the noise algorithm library of the AP processor. The noise algorithm library of the AP processor can calculate the image noise and the backlight noise itself, but the initial ambient light comes from the ambient light sensor driver of the SCP processor. Therefore, the noise algorithm library of the AP processor needs to obtain the data of the initial ambient light reported by the SCP processor (steps E3 to E4). The AP processor finally returns the calculated raw values on the four channels of the target ambient light to the SCP processor to obtain the lux value of the target ambient light (step A8, step E5, step E6).
In the technical architecture shown in fig. 12, step 3, calculating the target ambient light according to the image noise, the backlight noise, and the initial ambient light, is implemented in the denoising module of the SCP processor. The image noise and backlight noise are acquired by the AP processor, and the initial ambient light is acquired by the ambient light sensor driver of the SCP processor. Therefore, the denoising module of the SCP processor needs to obtain the image noise and backlight noise transmitted by the AP processor (step A8, step E5, step E6), and also needs the initial ambient light transmitted by the ambient light sensor driver of the SCP processor (step E3).
In view of the above analysis, in the technical architecture shown in fig. 7, the calculations of step 1 to step 3 need to be implemented in the noise algorithm library of the AP processor. In the technical architecture shown in fig. 12, step 1 and step 2 need to be implemented in the noise algorithm library of the AP processor, and step 3 needs to be implemented in the denoising module of the SCP processor.
For a clearer understanding of the process of obtaining the target ambient light corresponding to the technical architecture shown in fig. 12, reference is made to the timing chart shown in fig. 13. In connection with the events at the various times in fig. 9, the image is refreshed starting from time t0. At the same time, the ambient light sensor enters an integration period and begins to collect the initial ambient light.
Accordingly, in fig. 13, in step E1, the ambient light sensor in the co-hardware layer of the SCP processor enters an integration period and collects the initial ambient light starting from time t0 (t01).
Steps A1 through A6 refer to the descriptions of steps A1 through A6 in the embodiment of FIG. 7.
In step A7, the noise algorithm library in the hardware abstraction layer of the AP processor sends the image noise at t01 to the HWC of the same layer.
In step A8, after the AP processor calculates the image noise at t01, the image noise at t01 is sent to the ambient light sensor application of the co-application layer of the SCP processor.
In step A9 (step E5 in the architecture shown in fig. 12), the ambient light sensor application of the co-application layer of the SCP processor sends the image noise at t01 to the noise memory of the co-driver layer of the SCP processor.
Steps B1 through B2 refer to the description of steps B1 through B2 in the embodiment of FIG. 7.
In step B3, the noise algorithm library in the hardware abstraction layer of the AP processor sends the backlight noise at t02 to the HWC of the same layer.
In step B4, after the AP processor calculates the backlight noise at t02, the backlight noise at t02 is sent to the ambient light sensor application of the co-application layer of the SCP processor.
In step B5 (step E5 in the architecture shown in fig. 12), the ambient light sensor application of the co-application layer of the SCP processor sends the backlight noise at t02 to the noise memory of the co-driver layer of the SCP processor.
For steps C1 to C9 and steps D1 to D9, reference may be made to the descriptions of steps A1 to A9, which are not repeated herein.
After the integration of the ambient light sensor ends, the SCP processor is triggered to perform step E2; for step E2, refer to the description in the embodiment shown in fig. 7.
In steps E3 to E6, the denoising module in the co-driver layer of the SCP processor takes the fusion noise out of the noise memory of the same layer and obtains the raw values on the four channels of the initial ambient light from the ambient light sensor of the same layer, then calculates the target ambient light from the raw values on the four channels of the initial ambient light and the image noise and backlight noise that interfere with the initial ambient light. During the non-integration period, the image may also be refreshed (e.g., the image refresh at t11) and the brightness may also change (e.g., the brightness change at t12). Therefore, in the non-integration period, steps F1 to F9 still exist in fig. 13 (steps F1 to F5 are omitted in fig. 13; reference may be made to steps A1 to A5 in fig. 13), so that the image noise at t11 is stored in the noise memory of the SCP processor. In the non-integration period, steps G1 to G5 still exist (step G1 is omitted in fig. 13; reference may be made to step B1 in fig. 13), so that the backlight noise at t12 is stored in the noise memory.
The process of obtaining the target ambient light from the target image, the brightness, and the initial ambient light calculation by the noise algorithm library in the embodiment shown in fig. 7 will be described below.
Step one, when the noise algorithm library obtains a target image, it calculates the image noise at the refresh time of the target image from the target image and the brightness of the display screen at that refresh time; when the noise algorithm library obtains a brightness value, it calculates the backlight noise at the brightness adjustment time from that brightness value and the target image displayed at the brightness adjustment time.
Although image noise and backlight noise have different names, both are calculated from one frame of the target image and one luminance value.
The target image is composed of a plurality of pixel points. First, a weighted sum is computed from the RGB value of each pixel point and the weighting coefficient of each pixel point, giving the weighted RGB value of the target image. The weighting coefficient of each pixel point is determined according to the distance between the coordinates of the pixel point and the reference coordinates of the target image. The coordinates of the center point of the photosensitive area of the ambient light sensor may be used as the reference coordinates of the target image.
Step two, the noise algorithm library obtains the fusion noise from the weighted RGB value of the target image and the brightness. The fusion noise may be obtained by a table lookup (the table stores the fusion noise corresponding to each weighted RGB value and brightness) or through a preset functional relationship (the independent variables are the weighted RGB value of the target image and the brightness, and the dependent variable is the fusion noise). The fusion noise obtained here is a raw value on the four channels.
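As a rough illustration of steps one and two, a minimal sketch follows; the distance-decay weighting, the normalization, and all names (weighted_rgb, fusion_noise, falloff, model) are assumptions for illustration, since the embodiment does not fix the weighting function or the calibration model.

```python
import math

def weighted_rgb(pixels, ref_xy, falloff=0.01):
    """Weighted sum of pixel RGB values; pixels nearer the reference
    point (center of the sensor's photosensitive area) weigh more.
    pixels: non-empty list of (x, y, (r, g, b))."""
    acc = [0.0, 0.0, 0.0]
    total_w = 0.0
    for x, y, rgb in pixels:
        d = math.hypot(x - ref_xy[0], y - ref_xy[1])
        w = 1.0 / (1.0 + falloff * d)   # assumed distance-decay weighting
        for i in range(3):
            acc[i] += w * rgb[i]
        total_w += w
    return [c / total_w for c in acc]   # normalization is an assumption

def fusion_noise(wrgb, brightness, model):
    """Map (weighted RGB, brightness) to per-channel raw noise (R, G, B, C).
    'model' stands in for the calibrated lookup table or fitted function."""
    return model(wrgb, brightness)
```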
Step three, the noise algorithm library calculates the integral noise over the integration time period of the initial ambient light from the fusion noise at each time.
It should be noted that the image noise is not produced by the image refresh action itself. Within the integration time period, in the time before an image refresh, the interference with the initial ambient light is the image noise corresponding to the image before the refresh; in the time after the image refresh, the interference with the initial ambient light is the image noise corresponding to the refreshed image.
Similarly, the backlight noise is not produced by the brightness adjustment action itself. Within the integration time period, in the time before a brightness adjustment, the interference with the initial ambient light is the backlight noise corresponding to the brightness before the adjustment; in the time after the brightness adjustment, the interference with the initial ambient light is the backlight noise corresponding to the adjusted brightness.
As described above, the noise memory stores the image noise and the backlight noise at each time point calculated by the noise algorithm library. The noise stored in the noise memory is collectively referred to as fusion noise or first noise.
Step A1, the first processor fetches the first noise from the exit position of the noise memory through the noise algorithm library, and updates the exit position of the noise memory (or the first noise at the exit position) through the noise algorithm library;
Step B1, if the timestamp of the currently fetched first noise is at or before the first time, the first processor continues to execute step A1 through the noise algorithm library until the timestamp of the currently fetched first noise is after the first time;
Step B2, if the timestamp of the currently fetched first noise is after the first time, the first processor performs the following steps through the noise algorithm library:
Step C1, if the currently fetched first noise is the first one whose timestamp is after the first time, and its timestamp is before the second time, calculate, according to the previously fetched first noise, the integral noise between the first time and the time corresponding to the timestamp of the currently fetched first noise, and continue with step A1;
Step C2, if the currently fetched first noise is the first one whose timestamp is after the first time, and its timestamp is at or after the second time, calculate, according to the previously fetched first noise, the integral noise between the first time and the second time, and continue with step D1;
Step C3, if the currently fetched first noise is not the first one whose timestamp is after the first time, and its timestamp is before the second time, calculate, according to the previously fetched first noise, the integral noise between the time corresponding to the timestamp of the previously fetched first noise and the time corresponding to the timestamp of the currently fetched first noise, and continue with step A1;
Step C4, if the currently fetched first noise is not the first one whose timestamp is after the first time, and its timestamp is at or after the second time, calculate, according to the previously fetched first noise, the integral noise between the time corresponding to the timestamp of the previously fetched first noise and the second time, and continue with step D1;
Step D1, obtain the second value according to the integral noise between the first time and the second time and the first value.
The noise memory may be a FIFO (First In First Out) memory. A FIFO memory is a first-in first-out dual-port buffer: one of its two ports is the input port of the memory, and the other is the output port of the memory. In this memory structure, the data that entered the memory first is shifted out first; correspondingly, the order of the shifted-out data is consistent with the order of the input data. The exit position of the FIFO memory is the storage address corresponding to the output port of the FIFO memory.
When the FIFO memory shifts out a datum, the process is as follows: the fusion noise stored at the exit position (the first position) is removed from the exit position, then the data at the second position from the exit is moved to the exit position, the data at the third position from the exit is moved to the second position, and so on.
Of course, in practical applications, after the fusion noise stored at the first position (a1) is removed from the exit position (the first position, a1), the exit position of the memory may instead be updated to the second position (a2). After the fusion noise stored at the current exit position (a2) is removed in turn, the exit position of the memory is updated to the third position (a3), and so on.
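A minimal sketch of steps A1 through D1 above, under the FIFO layout just described (Python; the entry format and function name are assumptions):

```python
from collections import deque

def integral_noise(fifo, t_start, t_end, prev):
    """Accumulate the integral noise over [t_start, t_end] by consuming
    timestamped fusion-noise entries (ts, noise) from the FIFO's exit.
    'prev' is the fusion noise of the latest change before t_start."""
    total = [0.0] * 4          # integral noise on the RGBC channels
    seg_start = t_start        # left edge of the current sub-period
    period = t_end - t_start
    while fifo:
        ts, noise = fifo.popleft()     # step A1: fetch from the exit position
        if ts <= t_start:              # step B1: at or before the first time
            prev = noise
            continue
        right = min(ts, t_end)         # steps C1-C4: close the sub-period
        for i in range(4):
            total[i] += (right - seg_start) / period * prev[i]
        if ts >= t_end:                # steps C2/C4: reached the second time
            return total
        seg_start, prev = ts, noise    # steps C1/C3: advance and fetch again
    # tail: the last fusion noise interferes until the integration end
    for i in range(4):
        total[i] += (t_end - seg_start) / period * prev[i]
    return total
```

Step D1 would then obtain the second value (the target ambient light) by removing this accumulated total from the first value (the initial ambient light).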
The process of obtaining the second value based on the above calculation may refer to the embodiments shown in fig. 14 to fig. 16.
Referring to fig. 14, fig. 14 shows the process in which the noise algorithm library in the AP processor calculates the integral noise from the image noise and the backlight noise, as provided in the embodiment of the present application. The times in the process correspond to the descriptions of the times in the embodiments shown in fig. 9 and fig. 10: the image is refreshed at t01, giving the image noise at t01; the brightness is adjusted at t02, giving the backlight noise at t02; the image is refreshed at t03, giving the image noise at t03; and the image is refreshed at t04, giving the image noise at t04.
From t01 to t02, the displayed image is the image refreshed at t01, and the brightness of the display screen is the brightness at t01 (the brightness at t01 is the brightness value most recently stored in the noise algorithm library before t01). The image noise at t01 is the noise produced when the image refreshed at t01 is displayed at the brightness of t01. Thus, the initial ambient light includes the image noise with duration "t02 - t01" and timestamp t01.
From t02 to t03, the brightness of the display screen is the brightness adjusted at t02, and the image displayed on the display screen is the image refreshed at t01. The backlight noise at t02 is the noise produced when the image refreshed at t01 is displayed at the brightness adjusted at t02. Thus, the initial ambient light includes the backlight noise with duration "t03 - t02" and timestamp t02.
From t03 to t04, the displayed image is the image refreshed at t03, and the brightness of the display screen is the brightness adjusted at t02. The image noise at t03 is the noise produced when the image refreshed at t03 is displayed at the brightness adjusted at t02. Thus, the initial ambient light includes the image noise with duration "t04 - t03" and timestamp t03.
From t04 to t1, the displayed image is the image refreshed at t04, and the brightness of the display screen is the brightness adjusted at t02. The image noise at t04 is the noise produced when the image refreshed at t04 is displayed at the brightness adjusted at t02. Thus, the initial ambient light includes the image noise with duration "t1 - t04" and timestamp t04.
Based on the above understanding, when calculating the integral noise, the AP processor considers that:
the image noise at t01 interferes with the initial ambient light from t01 to t02;
the backlight noise at t02 interferes with the initial ambient light from t02 to t03;
the image noise at t03 interferes with the initial ambient light from t03 to t04;
the image noise at t04 interferes with the initial ambient light from t04 to t1.
Thus, the integral noise from t01 to t02, from t02 to t03, from t03 to t04, and from t04 to t1 can be calculated separately.
The integral noise from t01 to t02 is: (t02 - t01)/(t1 - t0) × N_t01.
The integral noise from t02 to t03 is: (t03 - t02)/(t1 - t0) × N_t02.
The integral noise from t03 to t04 is: (t04 - t03)/(t1 - t0) × N_t03.
The integral noise from t04 to t1 is: (t1 - t04)/(t1 - t0) × N_t04.
Here, N_t01 denotes the fusion noise at t01, N_t02 the fusion noise at t02, N_t03 the fusion noise at t03, and N_t04 the fusion noise at t04.
The integral noise of the sub-periods within the integration period (t01 to t02, t02 to t03, t03 to t04, and t04 to t1), taken together, is the integral noise of the whole integration period.
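As a worked illustration with assumed numbers (times in milliseconds and per-channel noise values chosen only for the arithmetic; t0 = 0, t01 = 0, t02 = 10, t03 = 25, t04 = 40, t1 = 50), the integral_noise sketch above reduces to:

```python
from collections import deque

fifo = deque([(0, [8.0] * 4),    # N_t01: image noise at t01 = 0 ms
              (10, [5.0] * 4),   # N_t02: backlight noise at t02 = 10 ms
              (25, [9.0] * 4),   # N_t03: image noise at t03 = 25 ms
              (40, [6.0] * 4)])  # N_t04: image noise at t04 = 40 ms

# integration runs from t0 = 0 ms to t1 = 50 ms
total = integral_noise(fifo, 0, 50, prev=[8.0] * 4)
# per channel: (10-0)/50*8 + (25-10)/50*5 + (40-25)/50*9 + (50-40)/50*6 = 7.0
```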
In the above example, the start time of the integration period happens to be an image refresh time, i.e., the image noise at the start time of the integration period is available.
In practical applications, it is possible that the start time of the integration period is not the time of image refresh nor the time of backlight adjustment. In this case, it is necessary to acquire the fusion noise corresponding to the change time (image refresh time or backlight adjustment time) that is the latest before the start of the current integration period.
Referring to fig. 15, the noise algorithm library in the AP processor provided in the embodiment of the present application obtains an integration time period (t0 to t1), where t01 is not the start time of the current integration time period but an image refresh time within the current integration time period. The latest change time (image refresh time or brightness adjustment time) before the start of the current integration time period is t-1, which is an image refresh time.
Referring to fig. 16, if the latest change time before the start of the current integration period is an image refresh time, the image noise corresponding to that image refresh time interferes with the initial ambient light from t0 to t01.
Of course, if the latest change time is a brightness adjustment time, the backlight noise corresponding to that brightness adjustment time interferes with the initial ambient light from t0 to t01.
In the embodiment shown in fig. 16, the integration noise corresponding to each sub-period in the integration period is:
The integral noise from t0 to t01 is: (t01 - t0)/(t1 - t0) × N_t-1.
The integral noise from t01 to t02 is: (t02 - t01)/(t1 - t0) × N_t01.
The integral noise from t02 to t03 is: (t03 - t02)/(t1 - t0) × N_t02.
The integral noise from t03 to t04 is: (t04 - t03)/(t1 - t0) × N_t03.
The integral noise from t04 to t1 is: (t1 - t04)/(t1 - t0) × N_t04.
Here, N_t-1 denotes the fusion noise at t-1, N_t01 the fusion noise at t01, N_t02 the fusion noise at t02, N_t03 the fusion noise at t03, and N_t04 the fusion noise at t04.
As can be understood from the above example, the obtained integral noise is also a raw value on four channels.
In the above examples the timestamps are all different. In practical applications, however, the HWC may perform both the acquisition of the target image and the acquisition of the brightness to be adjusted within one unit of time measurement (e.g., within 1 ms); in that case, the timestamps of the target image and of the brightness to be adjusted are the same.
If a target image and a brightness value have the same timestamp and the noise algorithm library receives the target image first, the noise algorithm library calculates the image noise from the target image and the latest brightness value preceding it; when calculating the backlight noise corresponding to the brightness value, it calculates the backlight noise from the target image and the brightness value with the same timestamp.
If a target image and a brightness value have the same timestamp and the noise algorithm library receives the brightness value first, the noise algorithm library calculates the backlight noise from the brightness value and the latest target image preceding it; when calculating the image noise corresponding to the target image, it calculates the image noise from the target image and the brightness value with the same timestamp.
When the noise algorithm library receives the target image first, it calculates the image noise first and stores the image noise in the noise memory first. The fusion noise stored in the noise memory follows a time order: before a fusion noise is stored, it is judged whether the timestamp of the fusion noise currently to be stored is after the timestamp of the fusion noise stored last time; if it is after, the fusion noise currently to be stored is stored, and if it is at or before, the noise currently to be stored is discarded. Therefore, the backlight noise calculated later (with the same timestamp) is discarded.
In practice, the timestamp of the target image may be the time at which the HWC starts to fetch the target image from the CWB write-back memory, and the timestamp of the brightness value may be the time at which the HWC starts to fetch the brightness value from the kernel node. The HWC may switch to fetching the brightness value while it is fetching the target image. In that case, the HWC starts fetching the target image first and the brightness value later, so the timestamp of the brightness value is later than the timestamp of the target image; yet the HWC may obtain the brightness value first and send it to the noise algorithm library, which calculates and stores the backlight noise, and only afterwards obtain the target image and send it to the noise algorithm library, which calculates the image noise. This results in the timestamp of the image noise currently to be stored being before the timestamp of the backlight noise stored last.
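A minimal sketch of this monotonic-timestamp store rule (the class and method names are assumptions):

```python
class NoiseMemory:
    """Keeps fusion noise in strictly increasing timestamp order; a
    candidate whose timestamp is not newer than the last stored entry
    is discarded, as described above."""

    def __init__(self):
        self.entries = []  # list of (timestamp, fusion noise)

    def store(self, ts, noise):
        if self.entries and ts <= self.entries[-1][0]:
            return False   # discard: not newer than the last stored noise
        self.entries.append((ts, noise))
        return True
```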
Step four, the noise algorithm library removes the integral noise of the whole integration time period from the initial ambient light to obtain the target ambient light.
In the embodiment of the present application, the initial ambient light sent by the SCP processor to the HWC of the AP processor is initial ambient light data in the form of RGBC four-channel raw values, and the HWC sends the initial ambient light data, also in the form of RGBC four-channel raw values, to the noise algorithm library. Step three yields the raw values on the four channels of the integral noise. Therefore, in this step, the raw values on the four channels of the target ambient light can be obtained by an operation on the four-channel raw values of the initial ambient light and the four-channel raw values of the integral noise.
After calculating and obtaining raw values on four channels of the target ambient light, the noise algorithm library can send the raw values on the four channels of the target ambient light to the SCP processor, and the SCP processor calculates and obtains the lux value of the target ambient light according to the raw values on the four channels of the target ambient light.
As an example, the lux value may be obtained as a weighted sum: the raw value of each channel is multiplied by a per-channel coefficient (which may be provided by the manufacturer of the ambient light sensor), and the products are summed.
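A sketch of step four and the lux conversion under these assumptions (the coefficients below are placeholders, not real sensor calibration data; the clamp at zero is also an assumption):

```python
def target_ambient_raw(initial_raw, integral_noise_raw):
    """Step four: remove the integral noise from the initial ambient
    light, per RGBC channel (clamped at zero, an assumed safeguard)."""
    return [max(0.0, a - n) for a, n in zip(initial_raw, integral_noise_raw)]

def lux_value(target_raw, coeffs=(0.21, 0.72, 0.07, 0.0)):
    """Weighted sum of the channel raw values; the coefficients stand in
    for the ambient light sensor manufacturer's calibration."""
    return sum(c * r for c, r in zip(coeffs, target_raw))
```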
As described above, each time (or every other frame, every two frames, etc.) the electronic device refreshes the image, the DSS in the AP processor stores the pre-display image (which may be understood as the image to be refreshed in the current refresh, or the image after the current refresh) in the CWB memory. The HWC in the AP processor performs matting on the pre-display image stored in the CWB memory to obtain a target image, and then sends the target image to the noise algorithm library. For convenience of description, the steps in which the DSS in the AP processor stores the pre-display image in the CWB memory and the HWC obtains the target image from the CWB memory are denoted as the CWB write-back function.
In the enabled state of the CWB write-back function of the AP processor, the DSS stores the pre-display image in the CWB memory each time (or every other frame, every two frames, etc.) the electronic device refreshes the image. After the DSS stores the pre-display image in the CWB memory, the DSS sends a storage-success message to the HWC module. After receiving the storage-success message, the HWC may retrieve the target image from the CWB memory; accordingly, the HWC module may send the target image to the noise algorithm library.
That is, in the enabled state of the CWB write-back function of the AP processor, the AP processor performs steps A1 through A6 in fig. 7 each time (or every other frame, every two frames, etc.) the electronic device refreshes the image.
In the stopped state of the CWB write-back function of the AP processor, each time (or every other frame, every two frames, etc.) the image is refreshed, the electronic device performs steps A1 to A3 in the technical architecture shown in fig. 7 according to the refresh display flow. However, the DSS no longer stores the pre-display image in the CWB memory, and accordingly the AP processor no longer performs the subsequent related steps.
That is, in the stopped state of the CWB write-back function of the AP processor, the AP processor performs steps A1 through A3 in fig. 7 each time (or every other frame, every two frames, etc.) the electronic device refreshes the image, but no longer performs steps A4 through A6 in the technical architecture shown in fig. 7.
As mentioned above, after receiving the target image, the noise algorithm library calculates the image noise from the received target image. Therefore, when the electronic device refreshes the image while the CWB write-back function of the AP processor is enabled, the corresponding image noise is obtained; when the electronic device refreshes the image while the CWB write-back function of the AP processor is stopped, the corresponding image noise is not obtained.
Taking fig. 9 and fig. 14 as an example, after the end of the integration (time t1), when the noise algorithm library calculates the target ambient light from t0 to t1, the fusion noise used is: the image noise at t01, the backlight noise at t02, the image noise at t03, and the image noise at t04. The fusion noise that is not needed includes at least: the image noise at t11 and the backlight noise at t12. That is, the image noise and backlight noise corresponding to the times from the start time (inclusive) to the end time of the integration time period interfere with the initial ambient light collected in the current integration process, whereas the image noise and backlight noise corresponding to the non-integration time period may cause no interference with the initial ambient light collected in the current integration process.
Thus, to reduce power consumption, the AP processor may control, through the HWC, the CWB write-back function to be enabled during the integration period of the ambient light sensor, and the AP processor performs steps A4 through A6. During the non-integration period of the ambient light sensor, the CWB write-back function is controlled through the HWC to stop, and the AP processor no longer performs steps A4 through A6.
Referring to fig. 17, fig. 17 is a schematic diagram of a start-stop method of a CWB write-back function according to an embodiment of the present application.
As shown in fig. 17, during the integration periods of the ambient light sensor (from t0 to t1, from t2 to t3, from t4 to t5), the HWC controls the CWB write-back function to start; during the non-integration periods (from t1 to t2, from t3 to t4, from t5 to t6), the HWC controls the CWB write-back function to stop. In this way, the image noise in each integration process can be obtained, and the power consumption of the AP processor can be reduced.
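A rough sketch of this gating (all names here, such as enable_cwb_writeback, are assumptions; real code would live in the HWC hardware abstraction layer):

```python
import threading

class CwbGate:
    """Starts CWB write-back at each integration start and stops it at
    each integration end, per the schedule reported by the SCP side."""

    def __init__(self, hwc):
        self.hwc = hwc  # object assumed to expose the two calls below

    def schedule(self, start_delay_s, integration_s):
        # enable write-back when the next integration begins ...
        threading.Timer(start_delay_s,
                        self.hwc.enable_cwb_writeback).start()
        # ... and disable it once that integration ends
        threading.Timer(start_delay_s + integration_s,
                        self.hwc.disable_cwb_writeback).start()
```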
The embodiment of the application will focus on a start-stop method of the CWB write-back function. In the start-stop method of the CWB write-back function, the HWC in the AP processor may monitor whether the data in the kernel node has changed during both integration and non-integration periods. When the brightness of the backlight is changed, the HWC can obtain the brightness to be adjusted, and correspondingly, the noise algorithm library can calculate the backlight noise of the whole acquisition period.
In the subsequent embodiments of the present application, for example, the HWC in the AP processor may monitor the change of data in the core node in both the integration period and the non-integration period, and when the data stored in the core node changes, the HWC obtains the brightness to be adjusted from the core node, and transmits the brightness to the noise algorithm library to calculate and obtain the backlight noise.
In addition, since the integration process of the ambient light sensor is controlled by the SCP processor side, the SCP processor needs to send time-related parameters during the process of acquiring the initial ambient light by the ambient light sensor to the AP processor.
As described above, each time an integration ends, the SCP processor may send the initial ambient light and the times related to the current integration process (for example, the current integration end time and the current integration duration, or the current integration start time and the current integration end time, etc.) to the HWC in the AP processor. The SCP processor may also send the integration start time at which the ambient light sensor is next due to collect the initial ambient light (or a time slightly before it) to the HWC in the AP processor as the time at which the CWB write-back function is started, and send the integration end time of the next collection (or a time slightly after it) to the HWC in the AP processor as the time at which the CWB write-back function stops. That is, the SCP processor sends the start time and the stop time of the CWB write-back function to the HWC in the AP processor.
In practical applications, when the acquisition cycle of the ambient light sensor is fixed, the SCP processor may send the integration start time of the next acquisition of the initial ambient light to the AP processor as the time when the CWB write-back function is started, and the AP processor calculates the time when the CWB write-back function stops according to the received start time and the acquisition cycle. Alternatively, the SCP processor reports the start time of the first integration, the integration duration, the sampling period, and the like, and the AP processor determines the start time and the stop time of the CWB write-back function based on these data.
It should be noted that the embodiment of the present application does not limit the time-related parameters of the initial ambient light acquisition process that the SCP processor sends to the AP processor, as long as the AP processor can obtain the next start time of the CWB write-back function from the received time-related parameters.
The start time of the CWB write-back function need not coincide exactly with the integration start time, and the stop time of the CWB write-back function need not coincide exactly with the integration end time. The period during which the CWB write-back function is in the started state only needs to include the integration period of the acquisition cycle. Taking one acquisition cycle as an example, the start time of the CWB write-back function is earlier than or equal to the start time of the integration period of that acquisition cycle, and the stop time of the CWB write-back function is later than or equal to the end time of the integration period of that acquisition cycle.
As another example, after the current integration ends, the SCP processor may further send the initial ambient light, the integration duration corresponding to the initial ambient light (or the current integration start time), the current integration end time, and the sleep duration of the CWB write-back function to the HWC in the AP processor. For convenience of description, the information transmitted after the end of the integration may be collectively referred to as first information. The first information is not limited to the above items; it may include more or less information.
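As an illustration only, the first information can be pictured as a small struct passed over the inter-core channel; the field names are assumptions, and the patent explicitly allows more or fewer fields:

```cpp
#include <cstdint>

// Hedged sketch of the "first information" sent by the SCP processor to
// the HWC after each integration ends (illustrative names only).
struct FirstInformation {
    uint32_t initialAmbientLightRaw;  // first value: raw initial ambient light
    int64_t  integrationStartMs;      // first time (or the integration duration)
    int64_t  integrationEndMs;        // second time
    int64_t  sleepDurationMs;         // first duration: CWB stop ("sleep") time
    int64_t  scpSendTimeMs;           // optional: lets the AP estimate the delay
};
```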
As described above, the noise algorithm library calculates and obtains the target ambient light based on the initial ambient light, the integration duration (or the current integration start time) corresponding to the initial ambient light, the current integration end time, and the corresponding fusion noise. The detailed process refers to the description of the above embodiments and is not repeated.
The HWC in the AP processor needs to determine the time to initiate the CWB writeback function based on the sleep duration of the AP processor.
Of course, the first information may also be split into a plurality of sub-messages that are sent to the HWC of the AP processor separately. The embodiment of the present application does not limit this.
Taking the case where the above information is sent to the AP processor together as an example, the SCP processor may also transmit, together with the first information, the time at which the SCP processor sends it.
After receiving the first information, the HWC in the AP processor first controls the CWB write-back function to stop. Then the HWC obtains, from part of the received first information, the start time of the CWB write-back function, or the duration it still needs to wait before starting the CWB write-back function.
Since the CWB write-back function may be started at any point before the integration of the next cycle begins, its start time does not need to be strictly controlled to a particular instant. Therefore, the start time of the CWB write-back function can be obtained in any of the following ways, or in other ways not shown in the embodiments of the present application.
The HWC in the AP processor obtains the inter-core communication delay from the time when the SCP processor sent the first information and the time when the AP processor received it. Then, from the inter-core communication delay and the sleep duration, the HWC obtains the duration it still needs to wait before starting the CWB write-back function (the sleep duration minus the inter-core communication delay), or the start time of the CWB write-back function itself (that remaining duration added to the time when the HWC received the first information). When that time arrives, the HWC starts the CWB write-back function.
As an example, when the total duration of the non-integration period is 300ms, the sleep duration may be 240ms, 250ms, 260ms, 270ms, 280ms, or the like. Thus, even if there is an inter-core communication delay (e.g., 1ms), the CWB write-back function is guaranteed to start before the next integration starts.
Of course, in practical applications, the AP processor may also calculate, through the HWC, the start time of the CWB write-back function (the integration end time plus the sleep duration) or the duration it should continue to wait before starting the CWB write-back function (the integration end time plus the sleep duration, minus the time when the AP processor received the first information through the HWC) according to the integration end time and the sleep duration sent by the SCP processor. In this example, the start time of the CWB write-back function is recorded as the first moment. The first moment is also the moment at which a second duration has elapsed after the moment the matting flag was set to the first character. The sleep duration in the first information may be recorded as a first duration. The second duration is the sleep duration minus the delay duration. The delay duration is the time when the HWC module received the first information minus the integration end time (which may be recorded as the second time). As described above, the first information received by the AP processor may further include the integration start time (first time), the integration end time (second time), the initial ambient light (first value), the sleep duration (first duration), and the like.
Since the matting flag represents the start and stop of the CWB write-back function, the duration for which the CWB write-back function is stopped can also be understood as the duration for which the matting flag is set to the first character.
As described above, the embodiment of the present application does not strictly fix the start time of the CWB write-back function at a particular instant; therefore, other calculation methods may also be adopted, as long as the start time of the CWB write-back function is ensured to be before the integration start time. For example, the AP processor may ignore the communication delay and directly use the sleep duration in the received first information as the sleep duration of the CWB write-back function.
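The two calculations above can be sketched as follows (names illustrative; either the inter-core delay is subtracted from the sleep duration, or it is ignored):

```cpp
#include <algorithm>
#include <cstdint>

// Remaining wait before the CWB write-back function should start.
int64_t remainingWaitMs(int64_t sleepDurationMs,
                        int64_t scpSendTimeMs,
                        int64_t apReceiveTimeMs,
                        bool ignoreDelay) {
    if (ignoreDelay) return sleepDurationMs;  // simplest variant
    int64_t interCoreDelayMs = apReceiveTimeMs - scpSendTimeMs;  // e.g. ~1 ms
    return std::max<int64_t>(0, sleepDurationMs - interCoreDelayMs);
}

// CWB start time = time the first information was received + remaining wait.
int64_t cwbStartTimeMs(int64_t apReceiveTimeMs, int64_t waitMs) {
    return apReceiveTimeMs + waitMs;
}
```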
The above examples are described for the case where the AP processor and the SCP processor are time-aligned. When the AP processor and the SCP processor are not time-aligned, the time difference between them needs to be taken into account on top of the times or durations obtained by the above calculations.
As described above, the stop time of the CWB write-back function is the time at which (or after) the AP processor receives the first information through the HWC. Therefore, after the CWB write-back function is started, the HWC stops the currently started CWB write-back function at (or after) the time it next receives the first information sent by the SCP processor.
As can be appreciated from the manner described above, the stop time of the CWB write-back function is after the integration ends, and the start time of the CWB write-back function may be determined according to the sleep duration of the AP processor.
Referring to fig. 18, on the AP processor side, each time the electronic device refreshes an image, the SurfaceFlinger sends the display parameters of the interface to the HWC (specifically, refer to step A1 in the embodiment shown in fig. 7), and the HWC obtains a synthesized image based on the display parameters. The HWC needs to query the matting flag; if the matting flag indicates the started state, the HWC starts the CWB write-back function, and the AP processor may perform steps A1 to A6 in the technical architecture shown in fig. 7. That is, with the CWB write-back function enabled, the noise algorithm library can calculate the image noise and the backlight noise during the period in which the CWB write-back function is started.
In this example, the matting flag may be recorded as a write-back flag.
The matting flag in the above embodiments may also take the form of an identifier. After the HWC receives the first information sent by the SCP processor, the HWC sets the identifier to a first character (e.g., 0, False); after the HWC waits for the sleep duration, the HWC sets the identifier to a second character (e.g., 1, True). If the identifier is the first character (e.g., 0, False), the HWC controls the CWB write-back function to stop. If the identifier is the second character (e.g., 1, True), the HWC controls the CWB write-back function to start.
In this example, the first character may be recorded as a first flag, and the second character as a second flag.
The HWC controls the start and stop of the CWB write-back function by querying the identifier.
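A minimal sketch of this identifier mechanism follows (mapping the first/second character to a boolean is an assumption; the patent only requires two distinguishable values):

```cpp
#include <atomic>

// false = first character (CWB stopped), true = second character (started).
std::atomic<bool> g_mattingFlag{false};

// Queried by the HWC before step A2: should the synthesized image be sent
// down together with the matting information?
bool shouldAttachMattingInfo() {
    return g_mattingFlag.load(std::memory_order_acquire);
}

void onFirstInformationReceived() {   // CWB write-back function stops
    g_mattingFlag.store(false, std::memory_order_release);
}

void onSleepFinished() {              // CWB write-back function starts
    g_mattingFlag.store(true, std::memory_order_release);
}
```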
As an example, each time before performing step A2 in the technical architecture shown in fig. 7, the HWC may query whether the identifier is currently the first character or the second character. If it is the second character, indicating that the CWB write-back function is in the started state, the HWC transmits the matting information when executing step A2, and the display subsystem, after receiving the synthesized image and the matting information passed down by the HWC, stores the image to be displayed in the CWB memory. If it is the first character, indicating that the CWB write-back function is in the stopped state, the HWC does not transmit the matting information when executing step A2 (or transmits information indicating that no matting is needed), and the display subsystem, after receiving the synthesized image passed down by the HWC without the matting information (or with information indicating that no matting is needed), does not store the image to be displayed in the CWB memory. The HWC then cannot obtain the target image.
As an example, when the matting flag is the second flag, if the electronic device refreshes the image (which may be referred to as the fifth image), the SurfaceFlinger transmits the display parameters of the interface (which may be referred to as the fourth display parameters) to the HWC, and the HWC may call the underlying hardware to synthesize the image after receiving the fourth display parameters. The HWC may transmit the synthesized image (which may be the fifth image, or an image that is processed to obtain the fifth image) to the display subsystem together with the matting information (which may be referred to as the third information). The display subsystem receives the fifth image and the third information, and may store the fifth image, a partial image of the fifth image (which may be referred to as a sixth image), or a target image on the fifth image (which may be referred to as a third target image) in the CWB memory. The HWC obtains the target image from the CWB memory and sends it to the noise algorithm library. The noise algorithm library may derive image noise (which may be referred to as second image noise) based on the target image.
As another example, when the matting flag is the first flag, if the electronic device refreshes the image (which may be referred to as the first image), the SurfaceFlinger transmits the display parameters of the interface (which may be referred to as the fifth display parameters) to the HWC, and the HWC may call the underlying hardware to synthesize the image after receiving the fifth display parameters. The HWC does not transmit the matting information when transmitting the synthesized image (which may be the first image, or an image that is processed to obtain the first image) to the display subsystem. The display subsystem receives the first image but does not store the first image, a partial image of the first image (which may be referred to as a second image), or a target image on the first image (which may be referred to as a first target image) in the CWB memory. Accordingly, the HWC cannot obtain the target image from the CWB memory and does not send a target image to the noise algorithm library, and the noise algorithm library does not derive image noise based on a target image.
As shown in fig. 18, on the SCP processor side, after the SCP processor is started, the driving of the ambient light sensor is initialized, and then the ambient light integration is started according to a preset acquisition period.
After the ambient light integration is finished, the screen-on/screen-off state can also be monitored: after the HWC on the AP processor side monitors a screen-on/screen-off event, the AP processor sends related information to the SCP processor to trigger the change of the screen-on/screen-off state on the SCP processor side.
In the screen-on state, the SCP processor needs to send the acquired initial ambient light to the HWC of the AP processor; the HWC in the AP processor sends the initial ambient light to the noise algorithm library, and the noise algorithm library calculates the raw value of the target ambient light from the received initial ambient light. The AP processor sends the raw value of the target ambient light to the ambient light memory of the SCP processor. Of course, in practical applications, the AP processor may also calculate the lux value of the target ambient light from the raw value and send the lux value of the target ambient light to the SCP processor.
Referring to fig. 18, in the screen-on state, the SCP processor needs to send the integration end time and the sleep duration to the AP processor.
The AP processor receives the integration end time and the sleep duration reported by the SCP processor. The HWC sets the matting flag to the first character, and while the matting flag is the first character, the HWC stops the CWB write-back function.
After receiving the first information, a matting thread (the thread that executes the obtaining of the target image) in the HWC in the AP processor sets the matting flag to the first character. It then calculates the duration for which it should sleep and calls a sleep function based on that duration; when the matting thread calls the sleep function, it passes in the duration for which it should sleep (for example, 270ms), and the matting thread sleeps for 270ms. After sleeping for 270ms, the matting thread finishes sleeping, sets the matting flag to the second character, and the CWB write-back function is started.
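The matting thread's handling of the first information then reduces to the following sketch (reusing the flag from the earlier sketch; the 270ms value is the example above):

```cpp
#include <atomic>
#include <chrono>
#include <thread>

extern std::atomic<bool> g_mattingFlag;  // see the identifier sketch above

// Clear the flag, sleep for the computed duration, then set the flag so
// the CWB write-back function starts before the next integration begins.
void mattingThreadOnFirstInformation(std::chrono::milliseconds sleepFor) {
    g_mattingFlag.store(false);              // first character: CWB stopped
    std::this_thread::sleep_for(sleepFor);   // e.g. 270 ms
    g_mattingFlag.store(true);               // second character: CWB started
}
```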
The SCP processor also needs to calculate the lux value of the target ambient light from its raw value. In addition, ambient light integration is started again at the beginning of the next integration.
In the screen-off state, the initial ambient light collected by the ambient light sensor is the real ambient light; at this time, the SCP processor no longer needs to report the lux value of the initial ambient light collected in the integration period to the AP processor. Since the CWB write-back function does not need to be enabled to obtain the associated noise in the screen-off state, the SCP processor no longer times the next start of the CWB write-back function.
Of course, in some scenarios, for example face unlocking in the screen-off state, the electronic device needs to know whether the current environment is dark, and the face needs supplementary lighting in dark ambient light. Therefore, the electronic device needs to know the current lux value of the real ambient light in this scenario. Thus, even in the screen-off state, the ambient light sensor needs to collect ambient light, and the SCP processor reports the lux value of the collected ambient light to the AP processor when it receives a face-unlock request issued by the AP processor, so that the AP processor determines whether to supplement light according to the received lux value.
The embodiment of the application focuses on how the HWC controls the start and stop of the CWB write back function in the AP processor. Other details not shown may refer to the description in any of the embodiments above.
As mentioned above, the start time and the stop time of the CWB write-back function in the AP processor are determined by the data reported by the SCP processor. Considering that inter-core communication between the AP processor and the SCP processor may have a data transmission delay, it may be provided that: after the AP processor determines that the display screen is on, the HWC in the AP processor controls the CWB write-back function to be always on; after the HWC receives the first information reported by the SCP processor, the HWC begins to control the start and stop of the CWB write-back function according to the start-stop scheme described in any of the above embodiments.
Taking the embodiment shown in fig. 9 as an example, if the start-stop method of the CWB write-back function shown in fig. 17 is adopted:

the HWC can obtain the target image at time t01, and the noise algorithm library calculates the image noise at time t01;

the HWC can obtain the brightness value to be adjusted at time t02, and the noise algorithm library calculates the backlight noise at time t02;

the HWC can obtain the target image at time t03, and the noise algorithm library calculates the image noise at time t03;

the HWC can obtain the target image at time t04, and the noise algorithm library calculates the image noise at time t04;

the HWC no longer obtains the target image at time t11, and the noise algorithm library does not calculate the image noise at time t11;

the HWC can obtain the brightness value to be adjusted at time t12, but the noise algorithm library does not calculate the backlight noise at time t12.
If the electronic device refreshes the image at a frequency of 60Hz and the CWB write-back function is always on in a scenario where the electronic device plays a video, the HWC would acquire the target image 300ms/(1000ms/60) ≈ 18 times in the non-integration period (taking 300ms as an example) of one acquisition cycle (taking 350ms as an example), and the noise algorithm library would calculate and store the image noise 18 times.
With the CWB write-back function start-stop scheme shown in fig. 17, within one acquisition cycle (350ms), the 18 times the HWC acquires the target image and the 18 times the noise algorithm library calculates the image noise can be eliminated. Obviously, the power consumption can be reduced by adopting the CWB write-back function start-stop scheme shown in fig. 17.
However, in the embodiments shown in figs. 15 and 16, time t-1 may be in a non-integration period of the previous acquisition cycle. Since the CWB write-back function is stopped during the non-integration period (steps A4 to A6 are no longer performed), the display subsystem no longer stores the image to be refreshed in the CWB memory; accordingly, the HWC does not obtain the target image at time t-1, and the noise algorithm library obtains neither the target image at time t-1 nor the image noise at time t-1. Accordingly, there is also no image noise at time t-1 in the noise memory. Then, when calculating the integral noise for each sub-period, the noise algorithm library will lack the fusion noise that interferes with the initial ambient light corresponding to time t0 to time t01. Lacking that fusion noise, the noise algorithm library uses the fusion noise stored in the noise memory from before time t-1 as the fusion noise interfering with the initial ambient light from time t0 to time t01, resulting in the finally calculated target ambient light being inaccurate.
To solve this problem, the CWB write-back function may be controlled to start before the beginning of each integration period, and after the CWB write-back function is started, the image is forcibly refreshed once. This ensures that the display subsystem stores the image to be refreshed in the CWB memory, and the HWC can obtain the target image corresponding to the image to be refreshed from the CWB memory. Correspondingly, the noise algorithm library also calculates the image noise corresponding to the time of the forced image refresh. In the embodiment of the present application, the forcibly refreshed image is referred to as a third image.
As mentioned above, before the image is forcibly refreshed, the CWB write-back function is already enabled, that is, the matting flag is already recorded as the second flag, and the HWC module transmits the matting information (which may be recorded as the second information) together when sending the image to be forcibly refreshed to the display subsystem. Accordingly, what the display subsystem stores in the CWB memory may be a partial image of the forcibly refreshed image (recorded as the fourth image) or a target image (recorded as the second target image). As mentioned above, corresponding image noise (which may be referred to as first image noise) can be obtained from the target image.
An interface for forcing an image refresh exists in the HWC. When determining that a forced refresh is needed, the HWC calls this interface, and the electronic device performs one forced image refresh. When the HWC calls the interface, a first signal is sent to the SurfaceFlinger through the interface; after receiving the first signal, the SurfaceFlinger retrieves the most recently cached display parameters from its cache, recorded as the first display parameters. The SurfaceFlinger sends these display parameters to the HWC module, the HWC calls the underlying hardware based on the display parameters to obtain a synthesized image (this image is the third image), and, seeing that the matting flag is the second flag, the HWC carries the matting information when sending the synthesized image to the display subsystem.
In practical applications, the most recently cached display parameters among those cached by the SurfaceFlinger may be the display parameters corresponding to the previous image refresh. If the image refreshed by the electronic device before the forced refresh is the first image, then the most recently cached display parameters may be the fifth display parameters used to generate the first image. Thus, the third image may be the same as the first image. Therefore, the image for which the electronic device performs the forced refresh may be the image currently displayed on the display screen (after the first image was last refreshed, the display screen keeps displaying the first image). The process of forcibly refreshing the image is the same as the process of normally refreshing an image: the forced image is displayed through the SurfaceFlinger, the HWC, the OLED driver, and the display subsystem. For the specific process, reference may be made to the description of the above embodiments, which is not repeated here.
In the embodiment of the present application, the purpose of forcibly refreshing the image is to display the image currently displayed on the display screen, which is the image most recently refreshed on the display screen. In practical applications, before the display subsystem displays the image to be displayed, a frame of image may be cached; that frame can be understood as the image currently displayed on the display screen, i.e., the image most recently refreshed. The HWC retrieves the image from the cache and then passes it down to the display subsystem together with the matting information. The display subsystem may store the image (or a region image of the image, or the target image corresponding to the image) in the CWB memory, and the HWC performs the step of retrieving the target image from the CWB memory.
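The forced-refresh path can be sketched as below. Every type and method name is hypothetical; this is not the real SurfaceFlinger or HWC API, it only mirrors the signal-and-resend flow described above:

```cpp
// All names are illustrative assumptions.
struct DisplayParams { /* layer list, geometry, blending, ... */ };

class SurfaceFlingerProxy {
public:
    // On receiving the "first signal", return the most recently cached
    // display parameters (the first display parameters).
    DisplayParams latestCachedParams() const { return cached_; }
private:
    DisplayParams cached_;  // updated on every normal image refresh
};

class Hwc {
public:
    // Called when the HWC determines a forced refresh is needed.
    void forceRefresh(const SurfaceFlingerProxy& sf) {
        DisplayParams p = sf.latestCachedParams();  // same as the last refresh
        // The matting flag is already the second flag at this point, so the
        // matting information accompanies the synthesized (third) image.
        composeAndSend(p, /*attachMattingInfo=*/true);
    }
private:
    void composeAndSend(const DisplayParams&, bool attachMattingInfo) {
        (void)attachMattingInfo;
        // ... call the underlying hardware to synthesize the image, then
        //     hand it (plus the matting information) to the display
        //     subsystem, which writes the target image to the CWB memory ...
    }
};
```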
As mentioned above, if the HWC needs to perform matting on a refreshed image to obtain a target image, the HWC carries the matting information when transmitting the synthesized image downward. If the HWC does not need to matte the image currently to be refreshed, the HWC may not transmit the matting information (or may carry information indicating that no matting is needed). The display subsystem uses whether the received image carries the matting information as the basis for storing it in the CWB memory: when the received image carries the matting information, steps A4 through A6 in the technical architecture shown in fig. 7 are performed; when the received image does not carry the matting information (or carries information indicating that no matting is needed), steps A4 through A6 are no longer performed.
The forced image refresh is performed after the CWB write-back function is started, so when the AP processor performs steps A2 to A3 in the technical architecture shown in fig. 7, the transmitted data carries the matting information.
Referring to fig. 19, the embodiment of the present application provides a start-stop scheme in which the image is forcibly refreshed once after the CWB write-back function is started at a first preset time before integration starts. In this embodiment and the following embodiments, for convenience of drawing, the time when the CWB write-back function is started and the time when the image is forcibly refreshed are drawn as the same time. Likewise for convenience of drawing, the stop time of the CWB write-back function and the integration end time are drawn as the same time; in practical applications, the stop time of the CWB write-back function may be later than the integration end time.
As shown in fig. 19, at the time (t1n, t3n, t5n) corresponding to a first preset time (t2−t1n, t4−t3n, t6−t5n) before the integration start of each acquisition cycle, the CWB write-back function is started, after which the image is forcibly refreshed once. This can also be understood as: at the time (t1n, t3n, t5n) corresponding to a second preset time (t1n−t1, t3n−t3, t5n−t5) after the start of the non-integration period of each acquisition cycle, the CWB write-back function is started, after which the image is forcibly refreshed once. The sum of the first preset time and the second preset time is the duration of a non-integration period.
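The timing relationship in fig. 19 can be captured in a few lines (names illustrative); both formulations of t1n agree because the two preset times sum to the non-integration duration:

```cpp
#include <cstdint>

struct NonIntegrationWindow { int64_t t1; int64_t t2; };  // from t1 to t2

// Time at which the CWB write-back function starts and the image is
// forcibly refreshed (t1n in fig. 19), given the first preset time.
int64_t forcedRefreshTimeMs(const NonIntegrationWindow& w,
                            int64_t firstPresetMs) {
    int64_t secondPresetMs = (w.t2 - w.t1) - firstPresetMs;
    return w.t1 + secondPresetMs;  // == w.t2 - firstPresetMs
}
```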
Taking the first acquisition cycle as an example, at time t1n in the non-integration period of the first acquisition cycle (T1), the HWC in the AP processor controls the CWB write-back function to start and, after starting, forces a refresh of the image once. The HWC can obtain the target image corresponding to the image forcibly refreshed at time t1n, the noise algorithm library can calculate the image noise at time t1n, and the noise algorithm library stores the image noise at time t1n in the noise memory. Other acquisition cycles may refer to this example and are not described again here.
To verify that the start-stop scheme of the CWB write-back function shown in fig. 19 does not lose the fusion noise that interferes with the initial ambient light collected during the integration period, see the embodiment shown in fig. 20. In fig. 20, at time t1n in the non-integration period before the integration start time of the second acquisition cycle (T2), the HWC in the AP processor controls the CWB write-back function to start and, after starting, forces a refresh of the image once. The HWC can obtain the target image corresponding to the image forcibly refreshed at time t1n, and the noise algorithm library can cache the target image at time t1n, calculate the image noise at time t1n, and store the image noise at time t1n in the noise memory.
From time t1n to the start of the integration period (t2), there is neither a brightness adjustment nor an image refresh.
If there is only one brightness adjustment during the second acquisition cycle (T2), namely the brightness adjustment at time t21, the noise algorithm library may obtain the backlight noise at time t21 based on the target image corresponding to the image refreshed at time t1n and the adjusted brightness at time t21. The noise algorithm library sends the backlight noise at time t21 to the noise memory.
After the integration period of the second acquisition cycle ends (time t3), the noise memory stores the image noise at time t1n and the backlight noise at time t21.
Referring to fig. 21, the integral noise of the initial ambient light that interferes with the second acquisition cycle is:

the image noise at time t1n, for the duration "time t2 to time t21";

the backlight noise at time t21, for the duration "time t21 to time t3".
As can be understood from the embodiment shown in fig. 20, if a start-stop scheme is used that forces the image to be refreshed before integration begins:
when there is no image refresh between the time of the forced image refresh and the next integration start time, the noise algorithm library can still obtain the fusion noise (the fusion noise at time t1n) affecting the first sub-period of the integration period (time t2 to time t21);
and, when there is a brightness adjustment (the brightness adjustment at time t21) between the time of the forced image refresh and the next image refresh time, the target image corresponding to the brightness adjustment time (the target image corresponding to time t1n) can be obtained, so that the correct backlight noise corresponding to the brightness adjustment time (time t21) is obtained.
When the electronic device plays a video through the display screen, the image displayed on the display screen may be refreshed at a frequency of 60Hz, i.e., every 16.7ms. The acquisition cycle of the ambient light sensor may be set to 350ms, with the integration period set to 50ms and the non-integration period to 300ms. The CWB write-back function is started a first preset time before the integration period begins (e.g., t2−t1n = 20ms), and the image is forcibly refreshed once after the CWB write-back function is started. This corresponds to reducing, in one acquisition cycle, the number of times the HWC acquires the target image and the noise algorithm library computes the image noise by (300−20)/16.7 ≈ 16.8.
In the above embodiment, t2−t1n = 20ms; in practice, t2−t1n may also be equal to other duration values. In the embodiment of the present application, the duration corresponding to t2−t1n is set so as to ensure that the noise algorithm library can obtain the target image once and the image noise once before integration starts. Therefore, the above embodiment can reduce the power consumption of the processor while still obtaining an accurate target ambient light.
In practical applications, when a display screen of an electronic device is on, the display screen may not be in a refresh state all the time, and may also be in an idle state for a long time.
When the display screen is lit, it is in one of the following states: an idle state or a refresh state. In practical applications, the time at which the display screen last refreshed the image can be obtained, and whether the display screen is currently in the refresh state or the idle state is judged from the difference between the current time and the time of the last image refresh. A threshold may be preset: the display screen is currently in the refresh state when the difference between the current time and the time of the last image refresh is smaller than the threshold, and currently in the idle state when the difference is greater than or equal to the threshold.
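A sketch of that decision (the threshold value is left preset by the patent; the comparison itself is all there is to it):

```cpp
#include <cstdint>

enum class ScreenState { kRefresh, kIdle };

// Refresh state if the last image refresh was recent; idle otherwise.
ScreenState screenState(int64_t nowMs, int64_t lastRefreshMs,
                        int64_t thresholdMs /* preset, e.g. a few frames */) {
    return (nowMs - lastRefreshMs) < thresholdMs ? ScreenState::kRefresh
                                                 : ScreenState::kIdle;
}
```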
The embodiments of the present application do not intend to strictly distinguish between the refresh state and the idle state. The point is only that when the image displayed on the display screen does not change for a long time (the idle state), the displayed image is always the image corresponding to the last refresh.
As another example, when a user views a certain interface of the electronic device, performs no operation for a long time, and the current interface contains no animation, the display screen is in the idle state until it is turned off. When the display screen of the electronic device plays a video, the display screen may refresh the image at a frequency of 60Hz, and the display screen is in the refresh state. In the embodiment of the present application, when the display screen is in the refresh state, the displayed image may or may not change. In the refresh state, the displayed content may remain unchanged because the pre-refresh image obtained by the AP processor performing steps A1 through A3 in the technical architecture shown in fig. 7 is exactly the same as the post-refresh image obtained the same way. In the idle state, the displayed image does not change because the AP processor does not perform steps A1 through A3 in the technical architecture shown in fig. 7, and the display subsystem still sends the image most recently obtained by the AP processor performing steps A1 through A3 to the display screen at the preset refresh frequency.
Referring to fig. 22, a schematic diagram of a refresh state and an idle state provided in an embodiment of the present application is shown. In fig. 22, the display screen is always on.
In the TS0 period, the modules in the AP processor cooperate to synthesize image 1, which the display screen is to refresh in the TS1 period.

In the TS1 period, the display subsystem sends image 1 to the display screen, and the display screen displays image 1, which was synthesized by the cooperating modules in the AP processor during the TS0 period; meanwhile, the modules in the AP processor synthesize image 2, which the display screen is to refresh in the TS2 period.

In the TS2 period, the display subsystem sends image 2 to the display screen, and the display screen displays image 2, which was synthesized during the TS1 period; meanwhile, the modules in the AP processor synthesize image 3, which the display screen is to refresh in the TS3 period.

In the TS3 period, the display subsystem sends image 3 to the display screen, and the display screen displays image 3, which was synthesized during the TS2 period; meanwhile, the modules in the AP processor synthesize image 4, which the display screen is to refresh in the TS4 period.
From the start time of the TS4 cycle, the display enters an idle state.
In the TS4 period, the display subsystem sends image 4 to the display screen, and the display screen displays image 4, which was synthesized by the cooperating modules in the AP processor during the TS3 period; the AP processor no longer synthesizes an image to be refreshed.

In the TS5 period, the display subsystem sends image 4 to the display screen, the display screen continues to display image 4, and the AP processor no longer synthesizes an image to be refreshed.

In the TS6 period, the display subsystem sends image 4 to the display screen, the display screen continues to display image 4, and the AP processor no longer synthesizes an image to be refreshed.
In the above process, the TS0 to TS3 periods are the refresh state of the display screen, and the TS4 to TS6 periods are the idle state. From the TS4 period onward, the electronic device performs no image refresh operation and the display screen enters the idle state; after the display screen enters the idle state, the display subsystem still sends image 4, the image last synthesized by the AP processor, to the display screen for display at the preset frequency (this frequency being the refresh frequency of the display screen). The displayed image (image 4) is the last image refreshed before the display screen switched to the idle state. Although the display subsystem still sends image 4 to the display screen at the preset frequency, the AP processor does not perform steps A1 to A3 of the technical architecture described in fig. 7.
Of course, in practical applications, the periods TS0 to TS4 may be recorded as the refresh state of the display screen, and the periods TS5 to TS6 may be recorded as the idle state of the display screen.
With the CWB write-back function in the started state, if the display screen is in the refresh state, the HWC may extract the target image corresponding to the currently refreshed image and, similarly, the corresponding image noise may be obtained. If the display screen is idle, even if the CWB write-back function is started, the AP processor does not perform the image-synthesis cooperation of steps A1 through A3 in the embodiment of fig. 7. Accordingly, the AP processor does not perform steps A4 through A6 either, so the noise algorithm library receives no target image during the idle state of the display screen, nor does it obtain image noise during that state.
The display screen may be idle for a long time, e.g., 1 minute, during which it does not need to refresh the image. With a start-stop scheme that forces an image refresh once before each integration start of the ambient light sensor, this would result in a forced refresh every 350ms during that minute, i.e., about 60000ms/350ms ≈ 171.4 additional image refreshes within 1 minute. Therefore, when the display screen is in the idle state for a long time, power consumption is undoubtedly increased again.
To understand more clearly why the start-stop scheme of forcing an image refresh before each integration start may increase the power consumption of the processor when the display screen is in the idle state for a long time, this is illustrated by way of example in fig. 23.
Referring to fig. 23, the image is refreshed once at time t01 in the integration period of the 1st acquisition cycle; correspondingly, the noise algorithm library stores the image at time t01 and the image noise at time t01.
Referring to fig. 23, after time t01, the display screen refreshes the image again only at time t(2M)1 in the integration period of the (M+1)th acquisition cycle; this example ignores brightness adjustment.
Referring to fig. 23, after the CWB write-back function is started at time t1n before the integration period of the 2nd acquisition cycle, the image is forcibly refreshed once (for convenience of description, the time of the forced image refresh and the time of starting the CWB write-back function are within the same time metric unit, for example, both within 1ms). The AP processor performs steps A4 to A6, and the noise algorithm library obtains the target image at time t1n and the image noise at time t1n.
Referring to fig. 24, the integral noise for the integration period of the 2nd acquisition cycle is: the image noise at time t1n, for the whole integration duration.
Referring to fig. 23, after the CWB write-back function is started at time t3n before the integration period of the 3rd acquisition cycle, the image is forcibly refreshed once, the AP processor performs steps A4 to A6 once, and the noise algorithm library obtains the target image at time t3n and the image noise at time t3n.
Referring to fig. 24, the integral noise for the integration period of the 3rd acquisition cycle is: the image noise at time t3n, for the whole integration duration.
……
Referring to fig. 23, after the CWB write-back function is started at time t(2M-1)n before the integration period of the (M+1)th acquisition cycle, the image is forcibly refreshed once, the AP processor performs steps A4 to A6 once, and the noise algorithm library obtains the target image at time t(2M-1)n and the image noise at time t(2M-1)n.
Referring to fig. 23, the image is refreshed at time t(2M)1 in the integration period of the (M+1)th acquisition cycle, the AP processor performs steps A1 to A6 once, and the noise algorithm library obtains the target image at time t(2M)1 and the image noise at time t(2M)1.
Referring to fig. 24, the integral noise for the integration period of the (M+1)th acquisition cycle is: the image noise at time t(2M-1)n for the duration from time t2M to time t(2M)1, and the image noise at time t(2M)1 for the duration from time t(2M)1 to time t2M+1.
Following the start-stop scheme of the embodiment shown in fig. 19, in the embodiments shown in figs. 23 and 24, from time t0 to time t2M (M acquisition cycles), the image is forcibly refreshed M times in total.
If the image is not forcibly refreshed after the CWB write-back function is started (at t1n, t3n, ..., t(2M-1)n), then, referring to fig. 25, the integral noise for the 2nd acquisition cycle is: the image noise at time t01, for the integration duration; the integral noise for the 3rd acquisition cycle is: the image noise at time t01, for the integration duration; ...; and the integral noise for the (M+1)th acquisition cycle is: the image noise at time t01 for the duration from time t2M to time t(2M)1, and the image noise at time t(2M)1 for the duration from time t(2M)1 to time t2M+1.
As mentioned above, the forced refresh process does not change the image displayed on the display screen; that is, the image refreshed at time t01 and the image refreshed at the forced refresh time are the same, and correspondingly, the target image at time t01 and the target image at the forced refresh time are also the same. If brightness adjustment is ignored, the image noise at time t01 and the image noise at the forced refresh time are also the same. If there is a brightness adjustment, the target image used at the brightness adjustment time is likewise unchanged. Therefore, it is not necessary to forcibly refresh the image in some scenarios.
As can be understood from the above analysis, when the display screen is in an idle state for a long time, image noise that may interfere with the integration period may not be lost even if the image is not forcibly refreshed.
Of course, in the above embodiment, if time t01 were in the non-integration period of the first acquisition cycle, the noise algorithm library might not obtain the target image and the image noise at time t01, and the image would need to be forcibly refreshed at time t1n.
In combination with the above various embodiments, the embodiments of the present application provide the technical solution shown in fig. 26. The embodiment shown in fig. 26 comprises the following steps:
In step 2601, the HWC in the AP processor starts the CWB write-back function at a first preset time before integration starts and checks the time at which the image was last refreshed on the display screen.
In the embodiment of the present application, whether or not the display screen needs a forced refresh, the CWB write-back function needs to be started at the first preset time before integration starts; other factors are then combined to determine whether the image needs to be forcibly refreshed.
For convenience of description, referring to fig. 27, the time corresponding to the first preset time before the start of integration in one acquisition cycle (T2) is selected as a reference time, denoted t3n. The embodiment of the present application starts the CWB write-back function at the time corresponding to the first preset time before integration starts (time t3n) and checks the time at which the image was last refreshed on the display screen.
For convenience of description, the time at which the image was last refreshed on the display screen may be recorded as tk.
As an example, when the last image refresh is no longer within the current non-integration period, the image is not forcibly refreshed; the electronic device waits for an upper-layer application to transmit the display parameters of the interface to the display engine service, after which they pass through the display engine service, the SurfaceFlinger, the HWC, and so on. For the HWC, this means waiting for display parameters sent by the SurfaceFlinger (which may be recorded as second display parameters).
The HWC module receives the display parameters sent by the SurfaceFlinger module of the electronic device, and stores the time at which the display parameters were received together with the display parameters (the sixth display parameters);
the HWC module may obtain the time at which the electronic device last refreshed the image, which may be the time at which the display parameters most recently sent by the SurfaceFlinger module were received. The sixth display parameters may be set as the last display parameters the HWC module acquired before obtaining the time of the last image refresh. Accordingly, the time of the last image refresh is the time at which the HWC module received the sixth display parameters.
In the embodiment of the present application, a time difference between the time of the last image refresh and the current time that is greater than the difference threshold may be understood as the display screen having entered the idle state, and a time difference less than or equal to the difference threshold may be understood as the display screen not having entered the idle state. The difference threshold may be determined based on empirical values.
The focus of the embodiment of the present application is to obtain the time of the last image refresh (when the HWC performs matting, the time at which the HWC started matting the last refreshed image) in order to determine, from that time, whether the image needs to be forcibly refreshed.
The focus of the embodiment of the present application is not to determine the current state of the display screen; the current state of the display screen merely helps to understand why power consumption increases in the idle state in the above embodiments.
Referring to the embodiment shown in fig. 27, when the time tk at which the image was last refreshed on the display screen is within the current non-integration period (between t3 and t3n), the image displayed on the display screen at time t3n is the image refreshed at time tk. Time tk lies between time t3 and time t3n, a period during which the CWB write-back function is stopped. That is, the HWC has not obtained the target image at time tk, and correspondingly, the noise algorithm library has obtained neither the target image at time tk nor the image noise at time tk. If the image is not forcibly refreshed at this point, the following may occur:
(1) After time t3n, the first change (image refresh or brightness adjustment) of the content displayed on the display screen is the brightness adjustment at time tb.
When time tb is between time t3n and time t4, and there is neither an image refresh nor a brightness adjustment between time tb and time t4: the backlight noise at time tb interferes with the initial ambient light of the integration period from time t4 to time t5. When the backlight noise at time tb is calculated, the latest target image cached by the HWC is not the target image at time tk but a target image from before time tk, so the calculated backlight noise at time tb is wrong. As a result, the target ambient light calculated by the noise algorithm library for the integration period from time t4 to time t5 is inaccurate.
When time tb is time t4: the backlight noise at time tb interferes with the initial ambient light of the integration period from time t4 to time t5. When the backlight noise at time tb is calculated, the latest target image cached by the HWC is not the target image at time tk, so the calculated backlight noise at time tb is wrong, and the target ambient light calculated by the noise algorithm library for the integration period from time t4 to time t5 is inaccurate.
When time tb is between time t4 and time t5: the initial ambient light between time t4 and time tb is subject to the interference of the image noise at time tk. Since the HWC did not obtain the image noise at time tk, a fusion noise from before time tk (possibly backlight noise, possibly image noise) is used in the integration process as the fusion noise interfering with the initial ambient light between time t4 and time tb, so the target ambient light calculated by the noise algorithm library for the integration period from time t4 to time t5 is inaccurate. In addition, the backlight noise at time tb interferes with the initial ambient light of the integration period from time t4 to time t5; when the backlight noise at time tb is calculated, the latest target image cached by the HWC is not the target image at time tk, so the calculated backlight noise is wrong. For this reason as well, the target ambient light calculated by the noise algorithm library for the integration period from time t4 to time t5 is inaccurate.
(2) After time t3n, the first change (image refresh or brightness adjustment) of the content displayed on the display screen is the image refresh at time tb. When time tb is between time t4 and time t5, the initial ambient light between time t4 and time tb is subject to the interference of the image noise at time tk, yet the noise algorithm library has not calculated the image noise at time tk. When calculating the target ambient light from time t4 to time t5, the noise algorithm library uses a fusion noise stored in the noise memory from before time tk as the fusion noise of the first sub-period of the integration period, so the target ambient light calculated by the noise algorithm library for the integration period from time t4 to time t5 is inaccurate.
From the above analysis, it can be understood that if the time at which the image was last refreshed on the display screen is within the current non-integration period, the image needs to be forcibly refreshed to obtain the target image corresponding to the image currently displayed on the display screen and the image noise corresponding to that target image; of course, these can be understood as the target image and the image noise at time t3n.
As another embodiment of the present application, if the time when the image is refreshed on the display screen last time is not within the current non-integration time period, the image is not forced to be refreshed.
In the embodiment of the present application, if the time tk at which the image was last refreshed on the display screen is not between time t3 and time t3n, then tk may be in the integration period of the current acquisition cycle, or in the previous or an earlier acquisition cycle.
If tk is in the integration period of the current acquisition cycle, the CWB write-back function was started during that integration period, so the HWC can obtain the target image at time tk, and the noise algorithm library can also obtain the target image and the image noise at time tk; therefore, the image does not need to be forcibly refreshed.
If tk is in the previous acquisition cycle or an earlier one, it is not necessary to consider whether to force an image refresh, since the embodiment shown in fig. 26 was already executed during that acquisition cycle. This application will later verify, by means of figs. 28 to 30, whether a forced image refresh is indeed unnecessary in this case (tk in the previous acquisition cycle or an earlier acquisition cycle). Reference is made in particular to the description of figs. 28 to 30.
In the embodiment of the present application, the method of judging whether the time tk at which the image was last refreshed on the display screen is within the non-integration period of the current cycle may refer to the manner shown in fig. 27.
Mode one: compare T22 (t3n − tk) with T21 (t3n − t3). If T22 (t3n − tk) is less than T21 (t3n − t3), the time at which the image was last refreshed on the display screen is within the current non-integration period; otherwise, it is not. In this embodiment, T22 may be recorded as a first difference and T21 as a second difference.
Mode two: compare tk with t3. If tk is greater than t3 and less than t3n, the time at which the image was last refreshed on the display screen is within the current non-integration period; otherwise, it is not. In the embodiment of the present application, if an image refresh occurs at time t3, the CWB write-back function does not stop until after time t3. Thus, when there is an image refresh at time t3, the HWC can obtain the target image at time t3, and the noise algorithm library can also obtain the target image and the image noise at time t3. Therefore, the case where T22 (t3n − tk) equals T21 (t3n − t3) may be treated as the last refresh falling within the current integration period; likewise, the case where tk equals t3 may be treated as the last refresh falling within the current integration period.
The third method: check whether the difference between the time of the last image refresh and the time of the last matting is less than a threshold (because there may be a gap between the image refresh time and the time at which the HWC starts executing matting to fetch the target image). If the difference is less than the threshold, the last refreshed image has already been matted by the HWC, and the last refresh is not in the current non-integration period. If the difference is greater than or equal to the threshold, the last refreshed image has not been matted by the HWC, and the last refresh is in the current non-integration period. The threshold is set according to actual conditions; in this embodiment it may be denoted as a first threshold.
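As a concrete illustration of the three methods above, the following is a minimal C++ sketch. It is not the patent's implementation: the function names, the millisecond timestamps, and the 2 ms default for the first threshold are assumptions made for the example.

```cpp
#include <cmath>

// First method: compare the first difference T22 (t_3n - t_k) with the
// second difference T21 (t_3n - t_3); equality is treated as falling
// within the current integration period, as described above.
bool inNonIntegrationByDifference(double t_k, double t_3, double t_3n) {
    double T22 = t_3n - t_k;  // first difference value
    double T21 = t_3n - t_3;  // second difference value
    return T22 < T21;
}

// Second method: check whether t_k lies strictly between t_3 and t_3n;
// t_k equal to t_3 counts as the integration period, as described above.
bool inNonIntegrationByInterval(double t_k, double t_3, double t_3n) {
    return t_k > t_3 && t_k < t_3n;
}

// Third method: compare the gap between the last refresh time and the last
// matting time against a small first threshold (assumed 2 ms, well below a
// refresh period of 8.3-16.7 ms). A gap below the threshold means the last
// refreshed image was already matted, so the last refresh is NOT in the
// current non-integration period.
bool inNonIntegrationByMattingGap(double lastRefreshMs, double lastMattingMs,
                                  double firstThresholdMs = 2.0) {
    return std::fabs(lastMattingMs - lastRefreshMs) >= firstThresholdMs;
}
```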
In this embodiment of the application, the time at which the HWC acquires the display parameters of the interface from SurfaceFlinger may be used as the time of the current image refresh; the time at which the HWC obtains the synthesized image through the underlying hardware may also be used; or the time at which the display subsystem sends the image for display may be used. Whichever time is chosen, there may be a slight difference between the time of the current image refresh and the time at which the HWC starts executing matting on the refreshed image, for example 0.5 ms, 0.8 ms, or 1 ms; the two times may also be equal. Adjacent image refreshes are one refresh period apart: at a refresh rate of 60 Hz the refresh period is 1000 ms / 60 ≈ 16.7 ms, and at 120 Hz it is 1000 ms / 120 ≈ 8.3 ms. Thus, the threshold in this example may be a relatively small value compared with the refresh period, for example 2 ms.
When the electronic device refreshes an image, the HWC receives the display parameters sent by SurfaceFlinger (denoted as the fourth display parameters). The currently refreshed image may be denoted as the fifth image.
In step 2603', if the electronic device does not refresh the image during the wait of the second preset time period, the image is forcibly refreshed.
In this embodiment of the application, if the electronic device keeps refreshing images, a refresh action will occur within the second preset time period (for example, 17 ms) and the HWC will already have acquired the latest target image. Delaying the decision on whether to force a refresh by the second preset time period therefore avoids an unnecessary forced-refresh action and further reduces power consumption.
The HWC module waits for the second preset time period; if the display parameters sent by SurfaceFlinger (which may be referred to as the third display parameters) are not received within that period, the image needs to be forcibly refreshed.
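The following is a minimal C++ sketch of this deferred forced-refresh decision; the class and member names are hypothetical, and the 17 ms value is the example second preset time period mentioned above.

```cpp
#include <chrono>
#include <condition_variable>
#include <mutex>

class HwcRefreshGate {
public:
    // Called when SurfaceFlinger delivers display parameters, i.e. when the
    // electronic device refreshes an image.
    void onDisplayParams() {
        std::lock_guard<std::mutex> lock(mutex_);
        refreshed_ = true;
        cv_.notify_all();
    }

    // Waits for the second preset time period (e.g. 17 ms) and returns true
    // if a forced refresh is needed, i.e. no refresh arrived while waiting.
    bool waitThenDecide(std::chrono::milliseconds secondPresetDuration) {
        std::unique_lock<std::mutex> lock(mutex_);
        refreshed_ = false;
        return !cv_.wait_for(lock, secondPresetDuration,
                             [this] { return refreshed_; });
    }

private:
    std::mutex mutex_;
    std::condition_variable cv_;
    bool refreshed_ = false;
};
```

The design point is that an actively refreshing display delivers display parameters well within one frame period, so the forced refresh only ever fires when the screen is genuinely idle.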
In the following, three examples (all assuming the HWC performs matting every time the display refreshes an image) are used to verify whether, when the display stays in an idle state for a long time (the case above where t_k falls within the last acquisition cycle or an earlier one), all of the image noise and backlight noise interfering with the initial ambient light collected during each integration period can still be obtained.
Referring to fig. 28, the last image refresh on the display screen occurs within the integration period of the last acquisition cycle, and the image is never refreshed again. The last acquisition cycle is the first acquisition cycle, and the current acquisition cycle is the second acquisition cycle. In this example, the CWB write-back function is enabled at times t_1n, t_2n, and t_3n.
In the first acquisition cycle, the ambient light sensor begins collecting initial ambient light at time t_0, at which point the CWB write-back function is already enabled. The last image refresh on the display screen occurs at time t_k, and the noise algorithm library obtains the image noise at time t_k.
At time t_1n, the time of the last image refresh on the display screen is not within the current non-integration period. The image displayed on the display screen at time t_1n is the image displayed at time t_k, so, ignoring backlight adjustment, the image noise at time t_1n is the image noise at time t_k. Since the noise algorithm library has already acquired the image noise at time t_k, the image is not forcibly refreshed at time t_1n.
In the second acquisition cycle, there is no image refresh and brightness adjustment is ignored. After integration in the second acquisition cycle ends, the fusion noise interfering with its integration period is entirely the image noise at time t_k.
At time t_3n, the time of the last image refresh on the display screen is not within the current non-integration period. The image displayed on the display screen at time t_3n is the image displayed at time t_k, so, ignoring backlight adjustment, the image noise at time t_3n is the image noise at time t_k. Since the HWC has already obtained the image noise at time t_k, the image is not forcibly refreshed at time t_3n.
In the third acquisition cycle, there is no image refresh and brightness adjustment is ignored. Processing continues according to the processing procedure in the second acquisition cycle.
As can be understood from this example, if the image is refreshed last time in the integration period of the last acquisition cycle and the image is not refreshed again, the image noise interfering with each integration period can be obtained without forcibly refreshing the image.
Referring to fig. 29, the last image refresh on the display screen occurs in the non-integration period of the previous acquisition cycle, before time t_1n, and the image is never refreshed again. The last acquisition cycle is the first acquisition cycle, and the current acquisition cycle is the second acquisition cycle. In this example, the CWB write-back function is enabled at times t_1n, t_2n, and t_3n.
In the first acquisition cycle, the ambient light sensor starts collecting initial ambient light at time t_0, at which point the CWB write-back function is already enabled. At time t_k, when the display screen refreshes the image for the last time, the CWB write-back function is in a stopped state, so the HWC does not obtain the image noise at time t_k.
At time t_1n, the time of the last image refresh on the display screen falls within the current non-integration period, so the image must be forcibly refreshed (the forced refresh becomes the new last refresh time, t_k') to obtain the image noise at time t_1n.
Of course, in practical applications, even after a decision is made to force a refresh, the method may wait for a certain period: if the display screen refreshes the image at its refresh rate during that wait, the image does not need to be forcibly refreshed; if no refresh is observed after the wait, the image may then be forcibly refreshed. This example assumes a forced refresh.
In the second acquisition cycle, there is no image refresh and brightness adjustment is ignored. After integration in the second acquisition cycle ends, the fusion noise interfering with its integration period is entirely the image noise at time t_1n.
At time t_3n, the time of the last image refresh on the display screen is not within the current non-integration period. The image displayed on the display screen at time t_3n is the image displayed at time t_1n, so, ignoring backlight adjustment, the image noise at time t_3n is the image noise at time t_1n. Since the noise algorithm library has already obtained the noise at time t_1n, the image is not forcibly refreshed at time t_3n.
In the third acquisition cycle, there is no image refresh and brightness adjustment is ignored. Processing continues according to the processing procedure in the second acquisition cycle.
As this example shows, if the last image refresh on the display screen occurs before the CWB write-back function is enabled in the non-integration period of the last acquisition cycle, and the image is never refreshed again, then after the single forced refresh at t_1n the image noise interfering with the initial ambient light of each integration period can be obtained without any further forced refreshes.
Referring to fig. 30, the last image refresh on the display screen occurs in the non-integration period of the previous acquisition cycle, after time t_1n and before time t_2, and the image is never refreshed again. The last acquisition cycle is the first acquisition cycle, and the current acquisition cycle is the second acquisition cycle. In this example, the CWB write-back function is enabled at times t_1n, t_2n, and t_3n.
In the first acquisition cycle, the CWB write-back function is enabled at time t_1n.
At time t_k, the display screen refreshes the image, and the noise algorithm library can then acquire the image noise of the image displayed at time t_k.
In the second acquisition cycle, there is no image refresh and brightness adjustment is ignored. After integration in the second acquisition cycle ends, the fusion noise interfering with the integration period is the image noise at time t_k.
At time t_3n, the time of the last image refresh on the display screen is not within the current non-integration period. The image displayed on the display screen at time t_3n is the image displayed at time t_k, so, ignoring backlight adjustment, the image noise at time t_3n is the image noise at time t_k. Since the noise algorithm library has already acquired the noise at time t_k, the image is not forcibly refreshed at time t_3n.
In the third acquisition cycle, there is no image refresh and brightness adjustment is ignored. Processing continues according to the processing procedure in the second acquisition cycle.
As this example shows, if the last image refresh on the display screen occurs after the CWB write-back function has been enabled in the non-integration period of the last acquisition cycle, and the image is never refreshed again, the image noise interfering with the initial ambient light of each integration period can be obtained without forcibly refreshing the image.
As the examples of fig. 28 to 30 show: if the time of the last image refresh on the display screen is not within the current non-integration period, the image does not need to be forcibly refreshed.
In this embodiment of the application, with the flow shown in fig. 26, processor power consumption is reduced while avoiding three problems: failing to obtain the image noise that interferes with the integration period, failing to obtain the target image used when calculating the backlight noise that interferes with the integration period, and the negative gain that can occur when the display screen stays in an idle state for a long time.
As described above, the HWC in the AP processor may monitor changes of the data in the kernel node during both the integration period and the non-integration period; when the data stored in the kernel node changes, the HWC obtains the brightness to be adjusted from the kernel node and transmits it to the noise algorithm library, which calculates the backlight noise.
In practice, the method can be carried out as follows.
While the CWB write-back function is stopped, when the HWC detects a change in the data in the kernel node, it can still obtain the brightness to be adjusted; however, it no longer transmits that brightness value to the noise algorithm library, and accordingly the noise algorithm library does not calculate the corresponding backlight noise.
When the CWB write-back function is about to be enabled (for example, 1 ms before it is enabled) or is enabled, if the HWC has not detected any change in the data stored in the kernel node while the write-back function was stopped, the brightness value of the display screen has not changed, and the HWC performs the start-stop method of the CWB write-back function provided in any of the embodiments above.
When the CWB write-back function is about to be enabled or is enabled, if the HWC has detected a change in the data stored in the kernel node while the write-back function was stopped, the brightness value of the display screen has changed. The HWC must send the latest monitored brightness value to the noise algorithm library, and then performs the start-stop method of the CWB write-back function provided in any of the embodiments above. If the brightness was adjusted multiple times while the write-back function was stopped, the noise algorithm library only needs the value after the last adjustment; that is, the HWC sends only the brightness to be adjusted corresponding to the most recent brightness change of the display screen. This prevents the noise algorithm library from frequently calculating the backlight noise for every intermediate brightness value, thereby reducing power consumption.
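The following is a minimal C++ sketch of this coalescing behavior; the class names, the callback interface, and the int brightness type are illustrative assumptions, not the patent's API.

```cpp
#include <optional>

class NoiseAlgorithmLibrary {
public:
    void onBrightness(int value) { /* calculate backlight noise from value */ }
};

class HwcBrightnessMonitor {
public:
    explicit HwcBrightnessMonitor(NoiseAlgorithmLibrary& lib) : lib_(lib) {}

    // Invoked whenever the brightness value stored in the kernel node changes.
    void onKernelNodeChanged(int brightness) {
        if (cwbEnabled_) {
            lib_.onBrightness(brightness);  // forward immediately
        } else {
            pending_ = brightness;          // overwrite earlier adjustments
        }
    }

    // Invoked when the CWB write-back function is about to be (re)enabled.
    void onCwbAboutToStart() {
        if (pending_) {
            // Only the value after the last adjustment is sent, so the
            // library does not compute backlight noise for every change.
            lib_.onBrightness(*pending_);
            pending_.reset();
        }
        cwbEnabled_ = true;
    }

    void onCwbStopped() { cwbEnabled_ = false; }

private:
    NoiseAlgorithmLibrary& lib_;
    bool cwbEnabled_ = false;
    std::optional<int> pending_;
};
```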
As one example, after the HWC module sets the matting flag to the first character, the HWC module monitors whether the data in a kernel node of the electronic device changes, the kernel node storing a brightness value;
in response to a change in the data in the kernel node of the electronic device, the HWC module obtains a first brightness value from the kernel node;
after the HWC module obtains the first brightness value from the kernel node, in response to a further change in the data in the kernel node of the electronic device, the HWC module obtains a second brightness value from the kernel node;
in response to reaching a first time, the HWC module sends the second brightness value to the noise algorithm library.
When the noise algorithm library calculates the image noise corresponding to the forcibly refreshed image, it calculates first image noise based on the target image corresponding to the forcibly refreshed image and the second brightness value.
After the HWC module sets the matting flag to the second character, the HWC module monitors whether the data in a kernel node of the electronic device changes, the kernel node storing a brightness value;
in response to a change in the data in the kernel node of the electronic device, the HWC module obtains a third brightness value from the kernel node;
the HWC module sends the third brightness value to the noise algorithm library;
after the HWC module sends the third brightness value to the noise algorithm library, in response to a further change in the data in the kernel node of the electronic device, the HWC module obtains a fourth brightness value from the kernel node;
the HWC module sends the fourth brightness value to the noise algorithm library.
In the above embodiments, the HWC may or may not force the image to be refreshed.
If the HWC forcibly refreshes the image, the calculation uses the most recently transmitted brightness value and the target image corresponding to the forcibly refreshed image. The backlight noise corresponding to the brightness-adjustment time does not interfere with the initial ambient light collected in the next integration period; the fusion noise interfering with that period may be the image noise corresponding to the forcibly refreshed image, and since the value of that image noise is correct, the target ambient light of the next integration period is not in error.
If the HWC does not force a refresh, the target image of the image currently displayed on the display screen is already stored in the noise algorithm library (as the latest frame of target image stored there).
If the brightness-adjustment time is earlier than the refresh time of the image currently displayed on the display screen, the backlight noise corresponding to the brightness-adjustment time does not interfere with the initial ambient light collected in the next integration period. The fusion noise interfering with that period may be the image noise corresponding to the image currently displayed on the display screen; since the value of that image noise is correct, the target ambient light of the next integration period is not in error.
If the brightness-adjustment time is later than the refresh time of the image currently displayed on the display screen, the image noise corresponding to that refresh time does not interfere with the initial ambient light collected in the next integration period. The fusion noise interfering with that period may be the backlight noise at the latest brightness-adjustment time, which is generated from the latest target image acquired from the display screen and the most recently adjusted brightness value.
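A minimal C++ sketch of this case analysis follows; the helper functions are hypothetical stand-ins for noise results the noise algorithm library would already hold, and the millisecond timestamps are assumptions.

```cpp
struct Noise { double value; };

// Hypothetical stand-ins for results the noise algorithm library already holds.
Noise imageNoiseOf(double /*refreshTimeMs*/) { return {0.0}; }
Noise backlightNoiseOf(double /*adjustTimeMs*/, int /*brightness*/) { return {0.0}; }

// Selects the fusion noise that interferes with the next integration period,
// based on the ordering of the last brightness adjustment and the last refresh.
Noise fusionNoiseForNextIntegration(double lastRefreshMs,
                                    double lastBrightnessAdjustMs,
                                    int latestBrightness) {
    if (lastBrightnessAdjustMs <= lastRefreshMs) {
        // Brightness changed before the current image was shown: the image
        // noise of the currently displayed image is what interferes.
        return imageNoiseOf(lastRefreshMs);
    }
    // Brightness changed after the current image was shown: the backlight
    // noise from the latest target image and adjusted brightness interferes.
    return backlightNoiseOf(lastBrightnessAdjustMs, latestBrightness);
}
```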
Thus, whenever the CWB write-back function is about to be enabled, and whether or not the image is forcibly refreshed, if the HWC detected a brightness change while the write-back function was disabled, the HWC sends the latest monitored brightness value to the noise algorithm library when the write-back function is about to start, and the noise algorithm library calculates the backlight noise. The HWC then continues with the start-stop method of the CWB write-back function provided in any of the embodiments above.
As another embodiment of the present application, while the CWB write-back function is enabled, the HWC may retrieve the target image from the CWB write-back memory once every other frame.
As an example, when the display screen refreshes at 90 Hz, the image is refreshed every 1000 ms / 90 ≈ 11.11 ms. The HWC fetching the target image from the CWB write-back memory every other frame then works as follows:
When the electronic device refreshes the image for the i-th time (taking matting on the i-th refresh as an example), after the HWC obtains the synthesized image, it checks the matting flag, finds it to be the second character, determines that this refreshed image is a matting frame, and continues with the subsequent steps according to the embodiment above.
When the image is refreshed for the (i+1)-th time, after the HWC obtains the synthesized image it checks the matting flag and finds it to be the second character (while the CWB write-back function is enabled, the matting flag is the second character). The HWC then obtains the time difference between the moment it last determined a matting frame (the i-th refresh) and the current moment. If the difference is smaller than the matting-frame difference threshold (which may be 11.11 ms, or another value such as 11.5 ms, 11.8 ms, or 12 ms), the image preceding the (i+1)-th refresh was already a matting frame, and the (i+1)-th refreshed image is not matted.
When the image is refreshed for the (i+2)-th time, after the HWC obtains the synthesized image it checks the matting flag and finds it to be the second character (while the CWB write-back function is enabled, the matting flag is the second character). The HWC obtains the time difference between the moment it last determined a matting frame (the i-th refresh) and the current moment. If the difference is greater than or equal to the matting-frame difference threshold (which may be 11.11 ms, or another value such as 11.5 ms, 11.8 ms, or 12 ms), the (i+2)-th refreshed image is a matting frame.
In the example above, the time difference is measured from the moment the HWC last determined a matting frame to the current moment. In practice, the difference between the moment the HWC last sent the synthesized image (carrying the matting information) to the OLED driver and the current moment may be used instead, or the difference between the moment the HWC last started performing matting and the current moment. These ways of obtaining the time difference are only examples; other ways may be used in practice, and different ways of measuring the time difference call for correspondingly different matting-frame difference thresholds.
As an example, the interval between two image refreshes is theoretically 11.1 ms, and the current moment is the moment at which the HWC reads the matting flag as the second character. If the time difference is measured from the moment the last matting frame was determined, it is theoretically 11.1 ms (the previous frame was a matting frame) or 22.2 ms (the previous frame was not), so the matting-frame difference threshold may be any value between 11.1 and 22.2 ms. If the time difference is measured from the moment the synthesized image was last sent to the OLED driver, it is theoretically (11.1 - t) ms or (22.2 - t) ms, where t is the delay between the HWC determining a matting frame and sending the synthesized image to the OLED driver, so the threshold may be any value between (11.1 - t) and (22.2 - t) ms.
If the display refreshes at 120 Hz, the 11.1 ms in the example above would need to be changed to 8.3 ms based on that rate. Thus, in inter-frame matting, the matting-frame difference threshold also depends on the current refresh rate of the display screen.
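A minimal C++ sketch of inter-frame matting follows; the policy class, the 1.05 factor used to place the threshold just above one frame period (anywhere inside the open interval between one and two frame periods works, per the analysis above), and the millisecond clock are all assumptions for illustration.

```cpp
class InterFrameMattingPolicy {
public:
    explicit InterFrameMattingPolicy(double refreshHz)
        : framePeriodMs_(1000.0 / refreshHz),      // 90 Hz -> ~11.1 ms
          thresholdMs_(framePeriodMs_ * 1.05) {}   // inside (T, 2T)

    // Called after the HWC obtains a synthesized image and reads the matting
    // flag as the second character; nowMs is the current time.
    bool shouldMat(double nowMs) {
        if (nowMs - lastMattingMs_ < thresholdMs_) {
            return false;  // the previous frame was already a matting frame
        }
        lastMattingMs_ = nowMs;  // this refresh becomes the matting frame
        return true;
    }

    // The threshold tracks the current refresh rate of the display screen.
    void onRefreshRateChanged(double refreshHz) {
        framePeriodMs_ = 1000.0 / refreshHz;
        thresholdMs_ = framePeriodMs_ * 1.05;
    }

private:
    double framePeriodMs_;
    double thresholdMs_;
    double lastMattingMs_ = -1e9;  // far past: the first refresh is matted
};
```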
Of course, the above-mentioned time points are only used as examples and do not limit the present application in any way.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the above method example, for example, each functional unit may be divided for each function, or two or more functions may be integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation. The following description will take the example of dividing each functional unit corresponding to each function:
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the steps in the above-mentioned method embodiments may be implemented.
Embodiments of the present application further provide a computer program product, which when run on an electronic device, enables the electronic device to implement the steps in the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and used by a processor to implement the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to a first device, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random-Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, computer-readable media may not be electrical carrier signals or telecommunications signals in accordance with legislation and patent practice.
The embodiments of the present application further provide a chip system, where the chip system includes a processor, the processor is coupled to a memory, and the processor executes a computer program stored in the memory to implement the steps of any of the method embodiments of the present application. The chip system may be a single chip or a chip module composed of a plurality of chips.
In the above embodiments, the description of each embodiment has its own emphasis, and reference may be made to the related description of other embodiments for parts that are not described or recited in any embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present application, and they should be construed as being included in the present application.
Claims (34)
1. A method for monitoring noise, applied to an electronic device, the electronic device comprising: a HWC module, a display subsystem, and a noise algorithm library, the method comprising:
in response to receiving first information instructing the HWC module to set a write-back flag to a first flag, the HWC module sets the write-back flag to the first flag;
in response to receiving a first image, the HWC module querying the write-back flag as a first flag, the first image being an image currently to be refreshed by the electronic device;
the HWC module sends the first image to the display subsystem based on the first flag;
the display subsystem stops storing a second image to a write-back memory of the electronic device, wherein the second image is an image corresponding to an area, which contains a first target image, on the first image, the first target image is an image in a first area, and the first area is an area, which is located above an ambient light sensor of the electronic device, on a display screen of the electronic device;
in response to reaching a first time, the HWC module sets the write back flag to a second flag;
The HWC module acquires a third image, wherein the third image is an image which is currently forced to be refreshed by the electronic equipment;
the HWC module queries the write-back flag as the second flag;
the HWC module sends the third image and second information to the display subsystem based on the second flag, the second information instructing the display subsystem to store, in a write-back memory of the electronic device, a fourth image corresponding to an area on the third image that contains a second target image;
in response to receiving the third image and the second information, the display subsystem stores a fourth image to a write-back memory of the electronic device, where the fourth image is an image corresponding to an area on the third image that includes a second target image, and the second target image is an image in the first area;
the HWC module acquires the second target image from the write-back memory;
the HWC module sends the second target image to a noise algorithm library;
the noise algorithm library calculates and obtains first image noise based on the second target image.
2. The method of claim 1, wherein the first information comprises a first duration, the first duration being the duration for which the display subsystem stops storing images to the write-back memory; the first time is: the time at which the first duration has elapsed after the time when the write-back flag is set to the first flag;
or the first information comprises a first duration, a first value and a second time, the first duration being the duration for which the display subsystem stops storing images to the write-back memory, and the second time being the end time at which an ambient light sensor of the electronic device collects the first value; the first time is the time at which a second duration has elapsed after the write-back flag is set to the first flag, the second duration being the first duration minus a delay duration, the delay duration being the time at which the HWC module receives the first information minus the second time, and the first value being an ambient light intensity value collected by the ambient light sensor.
3. The method of claim 1 or 2, wherein the HWC module acquiring the third image comprises:
the HWC module sends a first signal to a SurfaceFlinger of the electronic device;
in response to receiving the first signal, the SurfaceFlinger acquires cached first display parameters and sends the first display parameters to the HWC module, wherein the first display parameters are the most recently cached among the display parameters cached by the SurfaceFlinger;
the HWC module derives the third image based on the first display parameters.
4. The method of claim 2, wherein after the HWC module sets the write back flag to the second flag, and before the HWC module acquires a third image, further comprising:
the HWC module acquires the last image refreshing time of the electronic equipment;
and if the moment when the electronic equipment refreshes the image last time meets a first preset condition, the HWC module acquires the third image.
5. The method of claim 4, wherein the HWC module, after obtaining the time the image was last refreshed by the electronic device, further comprises:
and if the moment when the electronic device last refreshed the image does not meet a first preset condition, the HWC module waits for a SurfaceFlinger module of the electronic device to send a second display parameter.
6. The method of claim 4, wherein the HWC module obtaining the first image if a time of last image refresh of the electronic device satisfies a first preset condition comprises:
if the moment of last image refreshing of the electronic equipment meets a first preset condition, the HWC module waits for a second time length;
And if the HWC module does not receive the third display parameter sent by the Surface flag within the second duration, the HWC module acquires the first image.
7. The method of claim 6, wherein the method further comprises:
if the HWC module receives a fourth display parameter sent by the SurfaceFlinger within the second duration, the HWC module acquires a fifth image based on the fourth display parameter, wherein the fifth image is the image currently to be refreshed by the electronic device;
the HWC module queries the write-back flag as the second flag;
the HWC module sends the fifth image and third information to the display subsystem based on the second mark, wherein the third information is used for instructing the display subsystem to store a sixth image in a write-back memory of the electronic device, and the sixth image is an image corresponding to an area containing a third target image on the fifth image;
in response to receiving the fifth image and the third information, the display subsystem stores a sixth image in a writeback memory of the electronic device, a third target image being an image within the first region;
the HWC module obtains the third target image from the write-back memory;
The HWC module sends the third target image to a noise algorithm library;
and the noise algorithm library calculates and obtains second image noise based on the third target image.
8. The method of claim 4, wherein the first information includes a first value and a second time, the second time being an end time when an ambient light sensor of the electronic device collected the first value;
the electronic equipment meeting a first preset condition at the moment of last image refreshing comprises the following steps:
the moment when the electronic equipment refreshes the image last time is later than the second time;
the moment when the electronic equipment refreshes the image last time does not meet a first preset condition comprises the following steps:
the moment when the electronic equipment refreshes the image last time is earlier than or equal to the second time.
9. The method of claim 4, wherein the first information further comprises a first value and a second time, the second time is an end time when an ambient light sensor of the electronic device collects the first value, and the time when the electronic device last refreshes an image meets a first preset condition comprises:
a first difference value between the last image refreshing time and the current time of the electronic equipment is smaller than a second difference value between the second time and the current time;
The moment when the electronic equipment refreshes the image last time does not meet a first preset condition comprises the following steps:
and a first difference value between the last image refreshing time and the current time of the electronic equipment is greater than or equal to a second difference value between the second time and the current time.
10. The method of claim 4, wherein the electronic device further comprises a display screen, and the first preset condition being satisfied by the moment when the image is last refreshed by the electronic device comprises:
a difference between the moment when the electronic device last refreshed the image and the moment when the HWC module last obtained the target image is smaller than a first threshold; the target image is an image displayed in an area of the display screen above an ambient light sensor of the electronic device;
the moment when the electronic equipment refreshes the image last time does not meet a first preset condition comprises the following steps:
a difference between the time when the electronic device last refreshed the image and the time when the HWC module last obtained the target image is greater than or equal to the first threshold.
11. The method of claim 1, wherein the method further comprises:
after the HWC module sets the write-back flag to the first flag, the HWC module monitors whether data in a kernel node of the electronic device changes, the kernel node storing a brightness value;
in response to monitoring that the data in the kernel node of the electronic device has changed, the HWC module obtains a first brightness value from the kernel node;
after the HWC module obtains the first brightness value from the kernel node, in response to monitoring that the data in the kernel node of the electronic device has changed, the HWC module obtains a second brightness value from the kernel node;
in response to reaching a first time, the HWC module sends the second brightness value to the noise algorithm library.
12. The method of claim 11, wherein the method further comprises:
after the HWC module sets the write-back flag to a second flag, the HWC module monitors whether data in the kernel node of the electronic device changes, the kernel node storing a brightness value;
in response to monitoring that the data in the kernel node of the electronic device has changed, the HWC module obtains a third brightness value from the kernel node;
the HWC module sends the third brightness value to the noise algorithm library;
after the HWC module sends the third brightness value to the noise algorithm library, in response to monitoring that the data in the kernel node has changed, the HWC module obtains a fourth brightness value from the kernel node;
the HWC module sends the fourth brightness value to the noise algorithm library.
13. The method of claim 11 or 12, wherein the noise algorithm library calculating and obtaining the first image noise based on the second target image comprises:
the noise algorithm library calculates and obtains the first image noise based on the second target image and the second brightness value.
14. The method of claim 1, wherein the HWC module receiving a first image comprises:
the HWC module receives a fifth display parameter sent by a SurfaceFlinger module of the electronic device;
the HWC module derives the first image based on the fifth display parameter.
15. The method of claim 4, wherein before the HWC module acquires the time of the last image refresh of the electronic device, the method comprises:
the HWC module receives a sixth display parameter sent by a SurfaceFlinger module of the electronic device;
the HWC module stores the time at which the HWC module received the sixth display parameter;
the HWC module acquiring the time of the last image refresh of the electronic device comprises:
the HWC module obtains the stored time of receiving the sixth display parameter, wherein the time of receiving the sixth display parameter is the most recent reception time of a display parameter stored by the HWC module before the time of the last image refresh of the electronic device is acquired.
16. The method of claim 3, wherein the first display parameters comprise: one or more of the position, size, color, and storage address of the interface used to synthesize the third image on the display screen of the electronic device.
17. A method for monitoring noise, applied to an electronic device including a first processor, the method comprising:
a first processor receives first information, the first information being sent to the first processor by a second processor in the electronic device after an ambient light sensor in the electronic device finishes integration, and being used to determine a first time; the second processor is a processor configured to assist the first processor in processing ambient light sensor events;
after the first processor receives the first information, in response to receiving a first image, the first processor stops obtaining a first target image from the first image, wherein the first target image is an image in a first area, the first image is the image currently to be refreshed by the electronic device, and the first area is an area, which is located above an ambient light sensor of the electronic device, on a display screen of the electronic device;
After a first moment, the first processor acquires a third image, wherein the third image is an image which is currently and forcibly refreshed by the electronic equipment, and the first moment is the moment when the first processor starts to acquire a target image from the refreshed image;
the first processor acquires a second target image from the third image, wherein the second target image is an image in the first area;
the first processor computes a first image noise based on the second target image.
18. The method of claim 17, wherein the method further comprises:
in response to receiving the first information, the first processor setting a write-back flag as a first flag by a HWC module of the electronic device;
the first processor, in response to receiving a first image, ceasing to acquire a first target image from the first image comprises:
in response to receiving the first image, the first processor querying, via the HWC module, the write-back flag as a first flag;
the first processor sending, by the HWC module, the first image to a display subsystem of the electronic device based on the first flag;
The first processor stops storing a second image to a write-back memory of the electronic equipment through the display subsystem, wherein the second image is an image corresponding to an area, which contains a first target image, on the first image, and the first target image is an image in a first area;
the method further comprises the following steps:
in response to reaching a first time, the first processor setting, by the HWC module, the write-back flag to a second flag;
wherein the first processor acquiring a third image and acquiring a second target image from the third image, the second target image being an image in the first area, comprises:
the first processor obtaining a third image through the HWC module;
the first processor querying, by the HWC module, the write-back flag as the second flag;
the first processor sending, by the HWC module, the third image and second information to the display subsystem based on the second flag, the second information instructing the display subsystem to store, in a write-back memory of the electronic device, a fourth image corresponding to an area on the third image that contains a second target image;
In response to receiving the third image and the second information, the first processor stores a fourth image to a write-back memory of the electronic device through the display subsystem, wherein the fourth image is an image corresponding to an area, which includes a second target image, on the third image, and the second target image is an image in the first area;
the first processor retrieves the second target image from the write-back memory through the HWC module;
the method further comprises the following steps:
the first processor sending, by the HWC module, the second target image to a noise algorithm library;
the first processor calculates and obtains first image noise, through the noise algorithm library, based on the second target image.
19. The method of claim 18, wherein the first information comprises a first duration, the first duration being the duration for which the display subsystem stops storing images to the write-back memory; the first time is: the time at which the first duration has elapsed after the time when the write-back flag is set to the first flag;
or the first information comprises a first duration, a first value and a second time, the first duration being the duration for which the display subsystem stops storing images to the write-back memory, and the second time being the end time at which an ambient light sensor of the electronic device collects the first value; the first time is the time at which a second duration has elapsed after the write-back flag is set to the first flag, the second duration being the first duration minus a delay duration, the delay duration being the time at which the HWC module receives the first information minus the second time, and the first value being an ambient light intensity value collected by the ambient light sensor.
20. The method of claim 18 or 19, wherein the first processor obtaining the third image via the HWC module comprises:
the first processor sends a first signal to a SurfaceFlinger of the electronic device through the HWC module;
in response to receiving the first signal, the SurfaceFlinger acquires cached first display parameters and sends the first display parameters to the HWC module, wherein the first display parameters are the most recently cached among the display parameters cached by the SurfaceFlinger;
the HWC module derives the third image based on the first display parameters.
21. The method of claim 19, wherein after the first processor sets the write back flag to the second flag via the HWC module and before the first processor acquires a third image via the HWC module, further comprising:
the first processor acquires the moment when the image is refreshed on the electronic device last time through the HWC module;
and if the moment when the electronic equipment refreshes the image last time meets a first preset condition, the first processor acquires the third image through the HWC module.
22. The method of claim 21, wherein after the first processor obtains, via the HWC module, a time at which the electronic device last refreshed an image, further comprising:
and if the moment when the electronic device last refreshed the image does not meet a first preset condition, the first processor waits, through the HWC module, for a SurfaceFlinger module of the electronic device to send a second display parameter.
23. The method of claim 21, wherein the first processor obtaining the first image through the HWC module if the time of the last image refresh of the electronic device meets a first preset condition comprises:
if the time of the last image refresh of the electronic device meets the first preset condition, the first processor waits for a second time length through the HWC module;
and if the HWC module does not receive the third display parameter sent by the SurfaceFlinger within the second time length, the first processor acquires the first image through the HWC module.
24. The method of claim 23, wherein the method further comprises:
if the HWC module receives a fourth display parameter sent by the SurfaceFlinger within the second duration, the first processor acquires, through the HWC module, a fifth image based on the fourth display parameter, wherein the fifth image is the image currently to be refreshed by the electronic device;
The first processor querying, by the HWC module, the write-back flag as the second flag;
the first processor sends, by the HWC module, the fifth image and third information to the display subsystem based on the second flag, where the third information is used to instruct the display subsystem to store a sixth image in a write-back memory of the electronic device, and the sixth image is an image corresponding to a region of the fifth image that includes a third target image;
in response to receiving the fifth image and the third information, the first processor stores a sixth image in a write-back memory of the electronic device through the display subsystem, the third target image being an image within the first region;
the first processor retrieves the third target image from the write-back memory through the HWC module;
the first processor sending, by the HWC module, the third target image to a noise algorithm library of the electronic device;
and the first processor calculates and obtains second image noise based on the third target image through the noise algorithm library.
25. The method of any of claims 21 to 24, wherein the electronic device further comprises a display screen, the first information comprises a first value and a second time, the second time being an end time when an ambient light sensor of the electronic device collects the first value;
The electronic equipment meeting a first preset condition at the moment of last image refreshing comprises the following steps:
the moment when the electronic equipment refreshes the image last time is later than the second time;
the moment when the electronic equipment refreshes the image last time does not meet a first preset condition comprises the following steps:
the moment when the electronic equipment refreshes the image last time is earlier than or equal to the second time;
or, the electronic device meeting the first preset condition at the moment of last image refreshing includes:
a first difference value between the last image refreshing time and the current time of the electronic equipment is smaller than a second difference value between the second time and the current time;
the moment when the electronic equipment refreshes the image last time does not meet a first preset condition comprises the following steps:
a first difference value between the last image refreshing time and the current time of the electronic equipment is greater than or equal to a second difference value between the second time and the current time;
or,
the electronic equipment meeting a first preset condition at the moment of last image refreshing comprises the following steps:
a difference between the moment when the electronic device last refreshed the image and the moment when the HWC module last obtained the target image is smaller than a first threshold; the target image is an image displayed in an area of the display screen above an ambient light sensor of the electronic device;
The moment when the electronic equipment refreshes the image last time does not meet a first preset condition comprises the following steps:
a difference between the time when the electronic device last refreshed the image and the time when the HWC module last obtained the target image is greater than or equal to the first threshold.
26. The method of claim 18, wherein the method further comprises:
after the first processor sets the write-back flag to the first flag via the HWC module, the first processor monitors, by the HWC module, whether data in a kernel node of the electronic device has changed, the kernel node storing a brightness value;
in response to monitoring that the data in the kernel node of the electronic device has changed, the first processor obtains, by the HWC module, a first brightness value from the kernel node;
after the first processor obtains the first brightness value from the kernel node through the HWC module, in response to monitoring that the data in the kernel node of the electronic device has changed, the first processor obtains a second brightness value from the kernel node through the HWC module;
in response to reaching a first time, the first processor sends the second brightness value to the noise algorithm library through the HWC module.
27. The method of claim 26, wherein the method further comprises:
after the first processor sets the write-back flag to a second flag via the HWC module, the first processor monitors, via the HWC module, whether data in the kernel node of the electronic device has changed, the kernel node storing a brightness value;
in response to monitoring that the data in the kernel node of the electronic device has changed, the first processor obtains, by the HWC module, a third brightness value from the kernel node;
the first processor sends, by the HWC module, the third brightness value to the noise algorithm library;
after the first processor sends the third brightness value to the noise algorithm library through the HWC module, in response to monitoring that the data in the kernel node of the electronic device has changed, the first processor obtains a fourth brightness value from the kernel node through the HWC module;
the first processor sends the fourth brightness value to the noise algorithm library through the HWC module.
28. The method of claim 26, wherein the first processor calculating and obtaining the first image noise based on the second target image through the noise algorithm library comprises:
the first processor calculates and obtains the first image noise based on the second target image and the second brightness value through the noise algorithm library.
29. The method of claim 18, wherein the first processor receiving, by the HWC module, a first image comprises:
the first processor receives, through the HWC module, a fifth display parameter sent by a SurfaceFlinger module of the electronic device;
the first processor obtains, by the HWC module, the first image based on the fifth display parameter.
30. The method of claim 21, wherein before the first processor acquires, by the HWC module, the time of the last image refresh of the electronic device, the method comprises:
the first processor receives, through the HWC module, a sixth display parameter sent by a SurfaceFlinger module of the electronic device;
the first processor stores, by the HWC module, the time at which the HWC module received the sixth display parameter;
the first processor acquiring, by the HWC module, the time of the last image refresh of the electronic device comprises:
the first processor obtains, by the HWC module, the stored time of receiving the sixth display parameter, wherein the time of receiving the sixth display parameter is the most recent reception time of a display parameter stored by the HWC module before the time of the last image refresh of the electronic device is acquired.
31. The method of claim 20, wherein the first display parameters comprise: the position, size, color, and storage address of the interface used to synthesize the third image on the display screen of the electronic device.
32. An electronic device, characterized in that the electronic device comprises a first processor for executing a computer program stored in a memory, to cause the electronic device to implement the method of any of claims 1 to 16 or the method of any of claims 17 to 31.
33. A chip system comprising a first processor coupled to a memory, the first processor executing a computer program stored in the memory to implement the method of any of claims 17 to 31.
34. A computer-readable storage medium, in which a computer program is stored which, when run on a processor, implements the method of any one of claims 1 to 16 or the method of any one of claims 17 to 31.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211137769.9A CN115564668A (en) | 2021-05-31 | 2021-05-31 | Noise monitoring method, electronic equipment and chip system |
CN202110606261.8A CN113808030B (en) | 2021-05-31 | 2021-05-31 | Noise monitoring method, electronic equipment and chip system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110606261.8A CN113808030B (en) | 2021-05-31 | 2021-05-31 | Noise monitoring method, electronic equipment and chip system |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211137769.9A Division CN115564668A (en) | 2021-05-31 | 2021-05-31 | Noise monitoring method, electronic equipment and chip system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113808030A CN113808030A (en) | 2021-12-17 |
CN113808030B true CN113808030B (en) | 2022-09-30 |
Family
ID=78942437
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211137769.9A Pending CN115564668A (en) | 2021-05-31 | 2021-05-31 | Noise monitoring method, electronic equipment and chip system |
CN202110606261.8A Active CN113808030B (en) | 2021-05-31 | 2021-05-31 | Noise monitoring method, electronic equipment and chip system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211137769.9A Pending CN115564668A (en) | 2021-05-31 | 2021-05-31 | Noise monitoring method, electronic equipment and chip system |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN115564668A (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1326166A (en) * | 2000-05-31 | 2001-12-12 | Samsung Electronics Co., Ltd. | Method for describing pattern repetitiveness of an image
CN106610879A (en) * | 2016-12-23 | 2017-05-03 | Centec Networks (Suzhou) Co., Ltd. | Method for improving the CPU (Central Processing Unit) noise test efficiency of a chip
CN207165238U (en) * | 2017-05-17 | 2018-03-30 | Xi'an UniIC Semiconductors Co., Ltd. | A memory that performs write-back during a read operation
CN107945747A (en) * | 2017-11-22 | 2018-04-20 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Ambient light detection method and device, storage medium, and electronic device
CN108885775A (en) * | 2016-04-05 | 2018-11-23 | Huawei Technologies Co., Ltd. | A display method and terminal
CN111754954A (en) * | 2020-07-10 | 2020-10-09 | OPPO (Chongqing) Intelligent Technology Co., Ltd. | Screen brightness adjusting method and device, storage medium and electronic equipment
CN112229507A (en) * | 2020-10-15 | 2021-01-15 | TCL Communication (Ningbo) Co., Ltd. | Ambient light detection method and device, storage medium and mobile terminal
CN112312031A (en) * | 2019-07-30 | 2021-02-02 | NVIDIA Corporation | Enhanced high dynamic range imaging and tone mapping
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8076628B2 (en) * | 2008-09-25 | 2011-12-13 | Apple Inc. | Ambient light sensor with reduced sensitivity to noise from infrared sources |
US8987652B2 (en) * | 2012-12-13 | 2015-03-24 | Apple Inc. | Electronic device with display and low-noise ambient light sensor with a control circuitry that periodically disables the display |
US10475148B2 (en) * | 2017-04-24 | 2019-11-12 | Intel Corporation | Fragmented graphic cores for deep learning using LED displays |
US20200294468A1 (en) * | 2019-03-13 | 2020-09-17 | Apple Inc. | Electronic Devices With Ambient Light Sensor Systems |
US20210086364A1 (en) * | 2019-09-20 | 2021-03-25 | Nvidia Corporation | Vision-based teleoperation of dexterous robotic system |
CN110677596A (en) * | 2019-11-04 | 2020-01-10 | Shenzhen Lingming Photonics Technology Co., Ltd. | Ambient light adjusting device, ambient light adjusting method, image sensor and electronic device
Non-Patent Citations (2)
Title |
---|
Kobayashi K et al. Correlation between noise-after-write and magnetic domain structure conversions in thin-film heads by electron microscopy; IEEE Translation Journal on Magnetics in Japan; 2002-12-31; full text *
Ruan Yuanzhong et al. Design and Implementation of a Video Image Processing System Based on ZYNQ-7000; Software Guide (软件导刊); 2018-12-31; Vol. 17, No. 09; full text *
Also Published As
Publication number | Publication date |
---|---|
CN115564668A (en) | 2023-01-03 |
CN113808030A (en) | 2021-12-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113475057B (en) | | Video frame rate control method and related device |
CN113804290B (en) | | Ambient light detection method, electronic device and chip system |
CN112119641B (en) | | Method and device for realizing automatic translation through multiple TWS (True Wireless Stereo) earphones connected in forwarding mode |
CN110750772A (en) | | Electronic equipment and sensor control method |
CN111182140B (en) | | Motor control method and device, computer readable medium and terminal equipment |
CN114095666A (en) | | Photographing method, electronic device and computer-readable storage medium |
CN116991354A (en) | | Data processing method and related device |
CN111526407B (en) | | Screen content display method and device |
CN111741283A (en) | | Image processing apparatus and method |
CN112469012A (en) | | Bluetooth communication method and related device |
CN114257920B (en) | | Audio playing method and system and electronic equipment |
CN113808030B (en) | | Noise monitoring method, electronic equipment and chip system |
WO2022199613A1 (en) | | Method and apparatus for synchronous playback |
CN113923351B (en) | | Method, device and storage medium for exiting multi-channel video shooting |
CN113837990B (en) | | Noise monitoring method, electronic equipment, chip system and storage medium |
CN113820008B (en) | | Ambient light detection method, electronic device and chip system |
CN115762108A (en) | | Remote control method, remote control device and controlled device |
CN113596320A (en) | | Video shooting variable speed recording method, device, storage medium and program product |
CN113467904A (en) | | Method and device for determining collaboration mode, electronic equipment and readable storage medium |
CN114740986A (en) | | Handwriting input display method and related equipment |
CN113391735A (en) | | Display form adjusting method and device, electronic equipment and storage medium |
CN113672454B (en) | | Screen freezing monitoring method, electronic equipment and computer readable storage medium |
CN116069223B (en) | | Anti-shake method, anti-shake device and wearable equipment |
CN115931115A (en) | | Detection method of ambient light, electronic equipment, chip system and storage medium |
CN115712368A (en) | | Volume display method, electronic device and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||