CN115564668A - Noise monitoring method, electronic equipment and chip system - Google Patents

Noise monitoring method, electronic equipment and chip system

Info

Publication number
CN115564668A
Authority
CN
China
Prior art keywords
image
time
hwc
module
processor
Prior art date
Legal status
Pending
Application number
CN202211137769.9A
Other languages
Chinese (zh)
Inventor
张文礼
汤中峰
黄邦邦
王思文
张佳祥
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202211137769.9A
Publication of CN115564668A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 1/00 Photometry, e.g. photographic exposure meter
    • G01J 1/42 Photometry, e.g. photographic exposure meter using electric radiation detectors
    • G01J 1/4204 Photometry, e.g. photographic exposure meter using electric radiation detectors with determination of ambient light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image

Abstract

Embodiments of this application provide a noise monitoring method, an electronic device, and a chip system, relate to the technical field of ambient light sensors, and can solve the problem of excessive power consumption of the electronic device. The method comprises the following steps: an ambient light sensor of the electronic device collects ambient light in an acquisition period; before each time the ambient light sensor collects ambient light, a memory write-back function is started so as to obtain image noise during the collection of the ambient light; after each time the ambient light sensor finishes collecting ambient light, the memory write-back function is stopped, so that the electronic device does not compute image noise outside the collection period; cyclically starting and stopping the memory write-back function in this way reduces power consumption. Because the noise interfering with the ambient light may be related to the image displayed on the display screen at the time collection starts, the image can be forcibly refreshed after the memory write-back function is started, so as to obtain the image currently displayed on the display screen and, from it, the noise interfering with the ambient light.
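The cycle described above can be summarized in a few lines of control flow. The sketch below is illustrative only: every identifier (enableWriteBack, forceRefresh, and so on) is a hypothetical stand-in for the patent's memory write-back control, not an API the patent or Android names.

```cpp
#include <chrono>
#include <thread>

// Hypothetical stand-ins for the patent's controls (assumptions, not real APIs).
void enableWriteBack()     { /* start copying composed frames to write-back memory */ }
void disableWriteBack()    { /* stop copying; no image noise is computed meanwhile */ }
void forceRefresh()        { /* re-compose so the currently displayed frame is captured */ }
void collectAmbientLight() { /* sensor integrates ambient light for one window */ }

int main() {
    const auto period = std::chrono::milliseconds(350);  // assumed acquisition period
    for (int cycle = 0; cycle < 3; ++cycle) {            // a few cycles for illustration
        enableWriteBack();      // before each collection: capture image noise
        forceRefresh();         // the noise depends on the image shown at the start
        collectAmbientLight();  // collect ambient light for this window
        disableWriteBack();     // after each collection: save power
        std::this_thread::sleep_for(period);
    }
}
```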

Description

Noise monitoring method, electronic equipment and chip system
This application is a divisional application of the Chinese patent application with application number 202110606261.8, entitled "A noise monitoring method, electronic device and chip system", filed with the China National Intellectual Property Administration on May 31, 2021.
Technical Field
Embodiments of this application relate to the field of ambient light sensors, and in particular to a control method for an electronic device, an electronic device, and a chip system.
Background
With the development of electronic devices, the screen-to-body ratio of their display screens has become higher and higher. In pursuit of an excellent screen-to-body ratio, the ambient light sensor of an electronic device may be disposed below its OLED (Organic Light-Emitting Diode) screen. The OLED screen itself emits light, so the ambient light collected by an ambient light sensor disposed below the OLED screen includes the light emitted by the OLED screen itself, making the collected ambient light inaccurate.
Currently, to measure ambient light accurately, both the ambient light collected by the ambient light sensor and the noise generated by the display screen of the electronic device may be obtained. The real ambient light is then derived from the collected ambient light and the display-screen noise. In this method, the noise generated by the display screen is related to the image the display screen is showing, so the image displayed by the display screen of the electronic device needs to be acquired.
Disclosure of Invention
Embodiments of this application provide a control method for an electronic device, an electronic device, and a chip system, which solve the problem of excessive power consumption when the electronic device acquires noise.
To this end, the following technical solutions are adopted:
in a first aspect, an embodiment of the present application provides a noise monitoring method, which is applied to an electronic device, where the electronic device includes: a HWC module, a display subsystem, and a library of noise algorithms, the method comprising:
in response to receiving first information, the HWC module sets a write-back flag to a first flag;
in response to receiving a first image, the HWC module queries the write-back flag and determines that it is the first flag;
the HWC module sends the first image to the display subsystem based on the first flag;
the display subsystem stops storing, to a write-back memory of the electronic device, a second image that includes a first target image on the first image, the first target image being an image within a first region;
in response to reaching a first time, the HWC module sets the write-back flag to a second flag;
the HWC module obtains a third image;
the HWC module queries the write-back flag and determines that it is the second flag;
the HWC module sends the third image and second information to the display subsystem based on the second flag, the second information instructing the display subsystem to store, in a write-back memory of the electronic device, a fourth image that includes a second target image on the third image;
in response to receiving the third image and the second information, the display subsystem stores, to a write-back memory of the electronic device, a fourth image that includes a second target image on the third image, the second target image being an image within the first region;
the HWC module acquires the second target image from the write-back memory;
the HWC module sends the second target image to a noise algorithm library;
and the noise algorithm library calculates and obtains first image noise based on the second target image.
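Taken together, the steps of the first aspect amount to a small state machine keyed on the write-back flag: the first flag suppresses the copy to write-back memory, while the second flag enables it and routes the target image to the noise library. Below is a minimal C++ sketch under assumed names (WriteBackFlag, DisplaySubsystem, NoiseLibrary and the rest are illustrative, not the patent's or Android's API).

```cpp
#include <optional>

enum class WriteBackFlag { First, Second };  // First: write-back stopped; Second: enabled

struct Image {};

struct DisplaySubsystem {
    // Compose the frame for display; when writeBack is true, also copy the
    // region above the ambient light sensor (the target image) to memory.
    std::optional<Image> compose(const Image& frame, bool writeBack) {
        return writeBack ? std::optional<Image>(frame) : std::nullopt;
    }
};

struct NoiseLibrary {
    double imageNoise(const Image&) { return 0.0; }  // placeholder computation
};

class Hwc {
public:
    void onFirstInfo() { flag_ = WriteBackFlag::First; }   // a collection just ended
    void onFirstTime() { flag_ = WriteBackFlag::Second; }  // next collection imminent
    void onFrame(const Image& frame, DisplaySubsystem& dss, NoiseLibrary& lib) {
        const bool writeBack = (flag_ == WriteBackFlag::Second);
        if (auto target = dss.compose(frame, writeBack))
            lib.imageNoise(*target);  // noise is computed only while collecting
    }
private:
    WriteBackFlag flag_ = WriteBackFlag::First;
};
```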
In the embodiment of this application, after the ambient light sensor finishes collecting ambient light each time, the SCP processor may send the first information to the AP processor; on the AP processor side, the HWC module sets the write-back flag to the first flag, and the memory write-back function is stopped. Before the ambient light sensor next collects ambient light, for example at the first time, the HWC module may set the write-back flag to the second flag and may also force a refresh of an image, such as the third image. When the write-back flag is the second flag, the memory write-back function is started: if an image is refreshed, the target image of the refreshed image is obtained, i.e., the target image on the third image; the HWC sends the target image obtained from the third image to the noise algorithm library, and the noise algorithm library calculates the image noise. In this way, the HWC can be controlled so that it obtains the target image of the currently refreshed image only when an image is refreshed while the ambient light sensor is collecting ambient light, and no longer obtains it when an image is refreshed outside that period. In practice, whether the HWC crops the target image out of the currently refreshed image is controlled by the write-back flag. By cyclically starting and stopping the memory write-back function, the embodiment of this application reduces power consumption.
In a possible implementation manner of the first aspect, the first information includes a first duration, where the first duration is the duration for which the display subsystem stops storing images to the write-back memory; the first time is the time at which the first duration has elapsed after the write-back flag was set to the first flag.
Alternatively, the first information includes the first duration, a first value, and a second time, where the first duration is the duration for which the display subsystem stops storing images to the write-back memory, and the second time is the end time at which the ambient light sensor of the electronic device acquired the first value; the first time is the time at which a second duration has elapsed after the write-back flag was set to the first flag, the second duration being the first duration minus a delay, and the delay being the time from the second time until the HWC module receives the first information.
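Both ways of fixing the first time reduce to simple clock arithmetic; the second variant subtracts the delivery delay of the first information so that the write-back window still lines up with the sensor's next collection. A hedged sketch (all names are illustrative):

```cpp
#include <chrono>

using Clock = std::chrono::steady_clock;
using Ms = std::chrono::milliseconds;

// Variant 1: the first information carries only the first duration.
Clock::time_point firstTimeSimple(Clock::time_point flagSetTime, Ms firstDuration) {
    return flagSetTime + firstDuration;
}

// Variant 2: the first information also carries the second time (the end of
// the previous collection). The delay already consumed in delivery is
//   delay = receiveTime - secondTime,
// so only firstDuration - delay remains to wait.
Clock::time_point firstTimeCompensated(Clock::time_point flagSetTime, Ms firstDuration,
                                       Clock::time_point secondTime,
                                       Clock::time_point receiveTime) {
    const auto delay = std::chrono::duration_cast<Ms>(receiveTime - secondTime);
    return flagSetTime + (firstDuration - delay);
}
```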
In one possible implementation of the first aspect, the HWC module obtaining the third image includes:
the HWC module sends a first signal to a SurfaceFlinger of the electronic device;
in response to receiving the first signal, the SurfaceFlinger acquires a cached first display parameter and sends the first display parameter to the HWC module, where the first display parameter is the most recently cached display parameter among the display parameters cached by the SurfaceFlinger;
the HWC module derives the third image based on the first display parameter.
In one possible implementation manner of the first aspect, after the HWC module sets the write-back flag to be the second flag and before the HWC module acquires the third image, the method further includes:
the HWC module acquires the time at which an image was last refreshed on the electronic device;
and if the time at which the electronic device last refreshed an image meets a first preset condition, the HWC module acquires the third image.
In one possible implementation manner of the first aspect, after the HWC module obtains the time when the electronic device last refreshes the image, the method further includes:
and if the time at which the electronic device last refreshed an image does not meet the first preset condition, the HWC module waits for the SurfaceFlinger module of the electronic device to send a second display parameter.
In a possible implementation manner of the first aspect, that the HWC module acquires the third image if the time at which the electronic device last refreshed an image meets the first preset condition includes:
if the time at which the electronic device last refreshed an image meets the first preset condition, the HWC module waits for a second duration;
and if the HWC module does not receive a third display parameter sent by the SurfaceFlinger within the second duration, the HWC module acquires the third image.
In a possible implementation manner of the first aspect, the method further includes:
if the HWC module receives a fourth display parameter sent by the SurfaceFlinger within the second duration, the HWC module acquires a fifth image based on the fourth display parameter;
the HWC module queries the write-back flag and determines that it is the second flag;
the HWC module sends the fifth image and third information to the display subsystem based on the second flag;
in response to receiving the fifth image and the third information, the display subsystem stores, to a write-back memory of the electronic device, a sixth image that includes a third target image on the fifth image, the third target image being an image within the first region;
the HWC module acquires the third target image from the write-back memory;
the HWC module sends the third target image to a noise algorithm library;
and the noise algorithm library calculates and obtains second image noise based on the third target image.
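The wait-then-fallback behavior in the two implementations above is a classic timed wait: give SurfaceFlinger the second duration to deliver a fresh display parameter, consume it if it arrives (the fourth-display-parameter branch), and otherwise force a refresh (the timeout branch). A sketch under assumed names, not the patent's actual implementation:

```cpp
#include <chrono>
#include <condition_variable>
#include <mutex>
#include <optional>

struct DisplayParams {};  // stand-in for a display parameter

class FrameWaiter {
public:
    // Called on SurfaceFlinger's delivery path when a new frame arrives.
    void deliver(DisplayParams p) {
        std::lock_guard<std::mutex> lk(m_);
        pending_ = p;
        cv_.notify_one();
    }
    // Wait up to secondDuration; nullopt means "timed out, force a refresh".
    std::optional<DisplayParams> waitForFrame(std::chrono::milliseconds secondDuration) {
        std::unique_lock<std::mutex> lk(m_);
        if (!cv_.wait_for(lk, secondDuration, [this] { return pending_.has_value(); }))
            return std::nullopt;  // timeout branch: obtain the image ourselves
        auto p = pending_;        // received branch: use the delivered parameter
        pending_.reset();
        return p;
    }
private:
    std::mutex m_;
    std::condition_variable cv_;
    std::optional<DisplayParams> pending_;
};
```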
In a possible implementation manner of the first aspect, the first information includes a first value and a second time, where the second time is an end time when an ambient light sensor of the electronic device acquires the first value;
That the time at which the electronic device last refreshed an image meets the first preset condition includes:
the time at which the electronic device last refreshed an image is later than the second time;
that the time at which the electronic device last refreshed an image does not meet the first preset condition includes:
the time at which the electronic device last refreshed an image is earlier than or equal to the second time.
In a possible implementation manner of the first aspect, the first information further includes a first value and a second time, where the second time is the end time at which the ambient light sensor of the electronic device acquired the first value, and that the time at which the electronic device last refreshed an image meets the first preset condition includes:
a first difference between the time at which the electronic device last refreshed an image and the current time is smaller than a second difference between the second time and the current time;
that the time at which the electronic device last refreshed an image does not meet the first preset condition includes:
the first difference between the time at which the electronic device last refreshed an image and the current time is greater than or equal to the second difference between the second time and the current time.
In a possible implementation manner of the first aspect, that the time at which the electronic device last refreshed an image meets the first preset condition includes:
a difference between the time at which the electronic device last refreshed an image and the time at which the HWC module last obtained the target image is smaller than a first threshold, where the target image is the image displayed in the region of the display screen above the ambient light sensor of the electronic device;
that the time at which the electronic device last refreshed an image does not meet the first preset condition includes:
the difference between the time at which the electronic device last refreshed an image and the time at which the HWC module last obtained the target image is greater than or equal to the first threshold.
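Each formulation of the first preset condition is a one-line comparison of timestamps; note that the first two are algebraically equivalent whenever both times lie in the past, since now - lastRefresh < now - secondTime exactly when lastRefresh > secondTime. A sketch with illustrative names:

```cpp
#include <chrono>

using Clock = std::chrono::steady_clock;

// Variant A: the last refresh happened after the previous collection ended.
bool refreshedAfterCollection(Clock::time_point lastRefresh, Clock::time_point secondTime) {
    return lastRefresh > secondTime;
}

// Variant B: the refresh is closer to now than the collection end is.
bool refreshCloserThanCollectionEnd(Clock::time_point lastRefresh,
                                    Clock::time_point secondTime, Clock::time_point now) {
    return (now - lastRefresh) < (now - secondTime);
}

// Variant C: refresh and last target-image capture lie within a threshold.
bool refreshNearLastCapture(Clock::time_point lastRefresh,
                            Clock::time_point lastTargetCapture,
                            Clock::duration firstThreshold) {
    auto gap = lastRefresh - lastTargetCapture;
    if (gap < Clock::duration::zero()) gap = -gap;  // absolute difference
    return gap < firstThreshold;
}
```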
In a possible implementation manner of the first aspect, the method further includes:
after the HWC module sets the write-back flag to the first flag, the HWC module monitors whether data in a kernel node of the electronic device has changed, where the kernel node stores a brightness value;
in response to monitoring that the data in the kernel node of the electronic device has changed, the HWC module obtains a first brightness value from the kernel node;
after the HWC module obtains the first brightness value from the kernel node, in response to monitoring that the data in the kernel node of the electronic device has changed, the HWC module obtains a second brightness value from the kernel node;
in response to reaching the first time, the HWC module sends the second brightness value to the noise algorithm library.
In a possible implementation manner of the first aspect, the method further includes:
after the HWC module sets the write-back flag to the second flag, the HWC module monitors whether data in a kernel node of the electronic device has changed, where the kernel node stores a brightness value;
in response to monitoring that the data in the kernel node of the electronic device has changed, the HWC module obtains a third brightness value from the kernel node;
the HWC module sends the third brightness value to the noise algorithm library;
after the HWC module sends the third brightness value to the noise algorithm library, in response to monitoring that the data in the kernel node of the electronic device has changed, the HWC module obtains a fourth brightness value from the kernel node;
the HWC module sends the fourth brightness value to the noise algorithm library.
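On Linux-based systems such as Android, "monitoring a kernel node" of this kind is commonly a poll on a sysfs attribute, for example a backlight brightness node. The sketch below assumes a node like /sys/class/backlight/panel0-backlight/brightness and poll-based change notification, neither of which the patent specifies; nodes that do not support sysfs_notify would have to be read periodically instead.

```cpp
#include <cstdio>
#include <cstdlib>
#include <fcntl.h>
#include <poll.h>
#include <unistd.h>

// Block until the kernel signals the attribute changed, then re-read it.
// Returns the new brightness value, or -1 on error.
int watchBrightness(const char* node) {
    int fd = open(node, O_RDONLY);
    if (fd < 0) return -1;
    char buf[16] = {0};
    read(fd, buf, sizeof(buf) - 1);  // initial read arms the poll
    struct pollfd pfd;
    pfd.fd = fd;
    pfd.events = POLLPRI | POLLERR;  // sysfs change notification
    pfd.revents = 0;
    poll(&pfd, 1, -1);               // wait for a change
    lseek(fd, 0, SEEK_SET);
    ssize_t n = read(fd, buf, sizeof(buf) - 1);
    close(fd);
    if (n <= 0) return -1;
    buf[n] = '\0';
    return atoi(buf);                // the new brightness value
}
```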
In a possible implementation manner of the first aspect, that the noise algorithm library calculates the first image noise based on the second target image includes:
the noise algorithm library calculates the first image noise based on the second target image and the second brightness value.
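The patent does not disclose the noise formula itself; a plausible stand-in, consistent with the inputs named here, averages the emitted intensity of the target region and scales it by the current brightness value. All coefficients below are assumptions that a real implementation would replace with panel calibration data.

```cpp
#include <cstdint>
#include <vector>

struct TargetImage {
    std::vector<uint8_t> r, g, b;  // pixels of the region above the sensor (same length)
};

// Hypothetical noise model: mean weighted channel intensity of the target
// region, scaled by the current brightness relative to full brightness.
double imageNoise(const TargetImage& img, int brightness, int maxBrightness) {
    if (img.r.empty() || maxBrightness <= 0) return 0.0;
    double sum = 0.0;
    for (std::size_t i = 0; i < img.r.size(); ++i) {
        // Luminance-style channel weights; calibration would refine these.
        sum += 0.2126 * img.r[i] + 0.7152 * img.g[i] + 0.0722 * img.b[i];
    }
    const double meanIntensity = sum / img.r.size();
    return meanIntensity * (static_cast<double>(brightness) / maxBrightness);
}
```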
In a possible implementation manner of the first aspect, that the HWC module receives the first image includes:
the HWC module receives a fifth display parameter sent by the SurfaceFlinger module of the electronic device;
the HWC module derives the first image based on the fifth display parameter.
In a possible implementation manner of the first aspect, the first area is an area on a display screen of the electronic device, which is located above an ambient light sensor of the electronic device.
In a possible implementation manner of the first aspect, the method further includes:
the HWC module receives a sixth display parameter sent by the SurfaceFlinger module of the electronic device;
the HWC module stores the time at which the HWC module received the sixth display parameter;
and that the HWC module acquires the time at which the electronic device last refreshed an image includes:
the HWC module obtains the stored time at which the sixth display parameter was received, where the time at which the sixth display parameter was received is the most recent display-parameter receipt time stored by the HWC module before it acquires the time at which the electronic device last refreshed an image.
In a possible implementation manner of the first aspect, the first display parameter includes: the position, size, and color on the display screen of the electronic device of the interfaces to be composited into the third image, and their storage addresses.
In a second aspect, an embodiment of the present application provides a noise monitoring method, which is applied to an electronic device, where the electronic device includes: a first processor, the method comprising:
the first processor receives first information;
after the first processor receives the first information, in response to receiving a first image, the first processor stops acquiring a first target image from the first image, where the first target image is an image within a first region;
after the first time is reached, the first processor acquires a third image;
the first processor acquires a second target image from the third image, wherein the second target image is an image in the first area.
In the embodiment of this application, the ambient light sensor of the electronic device collects ambient light in an acquisition period. Before each time the ambient light sensor collects ambient light, the memory write-back function is started to obtain the target image of the refreshed image, so as to obtain the image noise during the collection of the ambient light; after each time the ambient light sensor finishes collecting ambient light, the memory write-back function is stopped, so that the electronic device does not compute image noise outside the collection period; cyclically starting and stopping the memory write-back function reduces power consumption. Because the noise interfering with the ambient light may be related to the image displayed on the display screen at the time collection starts, after the memory write-back function is started, an image, that is, the third image, may be forcibly refreshed to obtain the image currently displayed on the display screen, so as to obtain the noise interfering with the ambient light.
In one possible implementation manner of the second aspect, the method further includes:
in response to receiving the first information, the first processor sets a write-back flag to a first flag through an HWC module of the electronic device;
that, in response to receiving the first image, the first processor stops acquiring the first target image from the first image includes:
in response to receiving the first image, the first processor queries, through the HWC module, the write-back flag and determines that it is the first flag;
the first processor sends, through the HWC module, the first image to a display subsystem of the electronic device based on the first flag;
the first processor stops storing, through the display subsystem, a second image that includes the first target image on the first image to a write-back memory of the electronic device, where the first target image is an image within the first region;
the method further comprises the following steps:
in response to reaching a first time, the first processor setting, by the HWC module, the write-back flag to a second flag;
that the first processor acquires a third image and acquires a second target image from the third image, the second target image being an image within the first region, includes:
the first processor obtains the third image through the HWC module;
the first processor queries, through the HWC module, the write-back flag and determines that it is the second flag;
the first processor sending, by the HWC module, the third image and second information to the display subsystem based on the second flag, the second information instructing the display subsystem to store a fourth image on the third image that includes a second target image in a write-back memory of the electronic device;
in response to receiving the third image and the second information, the first processor stores a fourth image including a second target image on the third image to a write-back memory of the electronic device through the display subsystem, wherein the second target image is an image in the first area;
the first processor retrieves the second target image from the write-back memory through the HWC module;
the method further comprises the following steps:
the first processor sending, by the HWC module, the second target image to a noise algorithm library;
the first processor calculates, through the noise algorithm library, first image noise based on the second target image.
In a possible implementation manner of the second aspect, the first information includes a first duration, where the first duration is the duration for which the display subsystem stops storing images to the write-back memory; the first time is the time at which the first duration has elapsed after the write-back flag was set to the first flag.
Alternatively, the first information includes the first duration, a first value, and a second time, where the first duration is the duration for which the display subsystem stops storing images to the write-back memory, and the second time is the end time at which the ambient light sensor of the electronic device acquired the first value; the first time is the time at which a second duration has elapsed after the write-back flag was set to the first flag, the second duration being the first duration minus a delay, and the delay being the time from the second time until the HWC module receives the first information.
In a possible implementation manner of the second aspect, that the first processor obtains the third image through the HWC module includes:
the first processor sends, through the HWC module, a first signal to a SurfaceFlinger of the electronic device;
in response to receiving the first signal, the SurfaceFlinger acquires a cached first display parameter and sends the first display parameter to the HWC module, where the first display parameter is the most recently cached display parameter among the display parameters cached by the SurfaceFlinger;
the HWC module derives the third image based on the first display parameter.
In one possible implementation manner of the second aspect, after the first processor sets the write-back flag to be the second flag through the HWC module, and before the first processor acquires a third image through the HWC module, the method further includes:
the first processor acquires, through the HWC module, the time at which an image was last refreshed on the electronic device;
and if the time at which the electronic device last refreshed an image meets a first preset condition, the first processor acquires the third image through the HWC module.
In one possible implementation manner of the second aspect, after the first processor obtains, by the HWC module, a time when the electronic device last refreshes an image, the method further includes:
and if the time at which the electronic device last refreshed an image does not meet the first preset condition, the first processor waits, through the HWC module, for the SurfaceFlinger module of the electronic device to send a second display parameter.
In a possible implementation manner of the second aspect, that the first processor acquires the third image through the HWC module if the time at which the electronic device last refreshed an image meets the first preset condition includes:
if the time at which the electronic device last refreshed an image meets the first preset condition, the first processor waits for a second duration through the HWC module;
and if the HWC module does not receive a third display parameter sent by the SurfaceFlinger within the second duration, the first processor acquires the third image through the HWC module.
In one possible implementation manner of the second aspect, the method further includes:
if the HWC module receives a fourth display parameter sent by the SurfaceFlinger within the second duration, the first processor acquires, through the HWC module, a fifth image based on the fourth display parameter;
the first processor queries, through the HWC module, the write-back flag and determines that it is the second flag;
the first processor sending, by the HWC module, the fifth image and third information to the display subsystem based on the second flag;
in response to receiving the fifth image and the third information, the first processor stores, through the display subsystem, to a write-back memory of the electronic device, a sixth image that includes a third target image on the fifth image, the third target image being an image within the first region;
The first processor retrieves the third target image from the write-back memory through the HWC module;
the first processor sending, by the HWC module, the third target image to a noise algorithm library of the electronic device;
and the first processor calculates and obtains second image noise based on the third target image through the noise algorithm library.
In a possible implementation manner of the second aspect, the first information includes a first value and a second time, where the second time is an end time when an ambient light sensor of the electronic device acquires the first value;
that the time at which the electronic device last refreshed an image meets the first preset condition includes:
the time at which the electronic device last refreshed an image is later than the second time;
that the time at which the electronic device last refreshed an image does not meet the first preset condition includes:
the time at which the electronic device last refreshed an image is earlier than or equal to the second time;
or, that the time at which the electronic device last refreshed an image meets the first preset condition includes:
a first difference between the time at which the electronic device last refreshed an image and the current time is smaller than a second difference between the second time and the current time;
that the time at which the electronic device last refreshed an image does not meet the first preset condition includes:
the first difference between the time at which the electronic device last refreshed an image and the current time is greater than or equal to the second difference between the second time and the current time;
or,
that the time at which the electronic device last refreshed an image meets the first preset condition includes:
a difference between the time at which the electronic device last refreshed an image and the time at which the HWC module last obtained the target image is smaller than a first threshold, where the target image is the image displayed in the region of the display screen above the ambient light sensor of the electronic device;
that the time at which the electronic device last refreshed an image does not meet the first preset condition includes:
the difference between the time at which the electronic device last refreshed an image and the time at which the HWC module last obtained the target image is greater than or equal to the first threshold.
In one possible implementation manner of the second aspect, the method further includes:
after the first processor sets the write-back flag to the first flag through the HWC module, the first processor monitors, through the HWC module, whether data in a kernel node of the electronic device has changed, where the kernel node stores a brightness value;
in response to monitoring that the data in the kernel node of the electronic device has changed, the first processor obtains, through the HWC module, a first brightness value from the kernel node;
after the first processor obtains the first brightness value from the kernel node through the HWC module, in response to monitoring that the data in the kernel node of the electronic device has changed, the first processor obtains a second brightness value from the kernel node through the HWC module;
in response to reaching the first time, the first processor sends the second brightness value to the noise algorithm library through the HWC module.
In one possible implementation manner of the second aspect, the method further includes:
after the first processor sets the write-back flag to the second flag through the HWC module, the first processor monitors, through the HWC module, whether data in a kernel node of the electronic device has changed, where the kernel node stores a brightness value;
in response to monitoring that the data in the kernel node of the electronic device has changed, the first processor obtains, through the HWC module, a third brightness value from the kernel node;
the first processor sends the third brightness value to the noise algorithm library through the HWC module;
after the first processor sends the third brightness value to the noise algorithm library through the HWC module, in response to monitoring that the data in the kernel node of the electronic device has changed, the first processor obtains a fourth brightness value from the kernel node through the HWC module;
the first processor sends the fourth brightness value to the noise algorithm library through the HWC module.
In a possible implementation manner of the second aspect, that the first processor calculates, through the noise algorithm library, the first image noise based on the second target image includes:
the first processor calculates, through the noise algorithm library, the first image noise based on the second target image and the second brightness value.
In a possible implementation manner of the second aspect, that the first processor receives the first image through the HWC module includes:
the first processor receives, through the HWC module, a fifth display parameter sent by the SurfaceFlinger module of the electronic device;
the first processor derives, through the HWC module, the first image based on the fifth display parameter.
In a possible implementation manner of the second aspect, the first area is an area on a display screen of the electronic device, which is located above an ambient light sensor of the electronic device.
In a possible implementation manner of the second aspect, the method further includes:
the first processor receives, through the HWC module, a sixth display parameter sent by the SurfaceFlinger module of the electronic device;
the first processor stores, through the HWC module, the time at which the HWC module received the sixth display parameter;
and that the first processor acquires, through the HWC module, the time at which the electronic device last refreshed an image includes:
the first processor obtains, through the HWC module, the stored time at which the sixth display parameter was received, where the time at which the sixth display parameter was received is the most recent display-parameter receipt time stored by the HWC module before it acquires the time at which the electronic device last refreshed an image.
In a possible implementation manner of the second aspect, the first display parameter includes: the position, size, and color on the display screen of the electronic device of the interfaces to be composited into the third image, and their storage addresses.
In a third aspect, an electronic device is provided, comprising a processor configured to execute a computer program stored in a memory, to implement the method of any of the first aspect or the method of any of the second aspect of the present application.
In a fourth aspect, a chip system is provided, which comprises a processor coupled to a memory, the processor executing a computer program stored in the memory to implement the method of any one of the second aspects of the present application.
In a fifth aspect, there is provided a computer readable storage medium storing a computer program which, when executed by one or more processors, performs the method of any one of the first or second aspects of the present application.
In a sixth aspect, embodiments of the present application provide a computer program product, which, when run on an apparatus, causes the apparatus to perform the method of any one of the first aspect or the second aspect of the present application.
It is understood that the beneficial effects of the second to sixth aspects can be seen from the description of the first aspect, and are not described herein again.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of this application;
Fig. 2 is a diagram of a positional relationship between an ambient light sensor and a display screen in an electronic device according to an embodiment of this application;
Fig. 3 is a diagram of another positional relationship between an ambient light sensor and a display screen in an electronic device according to an embodiment of this application;
Fig. 4 is a diagram of another positional relationship between an ambient light sensor and a display screen in an electronic device according to an embodiment of this application;
Fig. 5 is a diagram of the position of a target area on a display screen according to an embodiment of this application;
Fig. 6 is a diagram of a positional relationship between an ambient light sensor and a target area on a display screen according to an embodiment of this application;
Fig. 7 is a diagram of a technical architecture on which the ambient light detection method provided by embodiments of this application relies;
Fig. 8 is a schematic diagram of an acquisition period in which the ambient light sensor collects ambient light according to an embodiment of this application;
Fig. 9 is a schematic diagram of the time nodes of image refresh and backlight adjustment within an acquisition period in the embodiment shown in Fig. 8;
Fig. 10 is a timing flow diagram of an ambient light detection method based on the technical architecture shown in Fig. 7;
Fig. 11 is a timing flow diagram of the modules in the AP processor provided by an embodiment of this application in the embodiment shown in Fig. 10;
Fig. 12 is a diagram of another technical architecture on which the ambient light detection method provided by embodiments of this application relies;
Fig. 13 is another timing flow diagram of an ambient light detection method based on the technical architecture shown in Fig. 12;
Fig. 14 is a schematic diagram of calculating integral noise based on the image noise and backlight noise at each time node provided by the embodiment shown in Fig. 9;
Fig. 15 is a schematic diagram, along the time axis, of the time nodes of image refresh and backlight adjustment within an acquisition period according to an embodiment of this application;
Fig. 16 is a schematic diagram of calculating integral noise based on the image noise and backlight noise at each time node provided by the embodiment shown in Fig. 15;
Fig. 17 is a schematic diagram of a start-stop scheme of the CWB write-back function according to an embodiment of this application;
Fig. 18 is a schematic flowchart of the SCP processor transmitting the first information to the AP processor according to an embodiment of this application;
Fig. 19 is a schematic diagram of a start-stop scheme of the CWB write-back function with forced image refresh according to an embodiment of this application;
Fig. 20 is a schematic diagram of the events at various times in the start-stop scheme of the CWB write-back function provided by the embodiment shown in Fig. 19;
Fig. 21 is a schematic diagram of obtaining integral noise using the start-stop scheme of the CWB write-back function provided by the embodiments shown in Figs. 19 and 20;
Fig. 22 is a diagram of the refresh state and idle state of a display screen according to an embodiment of this application;
Fig. 23 is a schematic diagram of a start-stop scheme of the CWB write-back function that forcibly refreshes an image when the display screen is idle for a long time according to an embodiment of this application;
Fig. 24 is a schematic diagram of a start-stop scheme using the CWB write-back function with forced image refresh shown in Fig. 23 according to an embodiment of this application;
Fig. 25 is a schematic diagram of a start-stop scheme using the CWB write-back function shown in Fig. 17 according to an embodiment of this application;
Fig. 26 is a schematic flowchart of another start-stop scheme of the CWB write-back function provided by an embodiment of this application;
Fig. 27 is a schematic diagram of determining whether to forcibly refresh an image according to the embodiment shown in Fig. 26;
Fig. 28 is a schematic diagram of obtaining integral noise using the start-stop scheme shown in Fig. 26 according to an embodiment of this application;
Fig. 29 is another schematic diagram of obtaining integral noise using the start-stop scheme shown in Fig. 26 according to an embodiment of this application;
Fig. 30 is another schematic diagram of obtaining integral noise using the start-stop scheme shown in Fig. 26 according to an embodiment of this application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that in the embodiments of this application, "one or more" means one, two, or more; "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent: A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," "fourth," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless otherwise specifically stated.
The noise monitoring method provided by the embodiment of the application can be suitable for electronic equipment provided with an OLED screen. The electronic device may be a tablet computer, a wearable device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), or other electronic devices. The embodiment of the present application does not limit the specific type of the electronic device.
Fig. 1 shows a schematic structural diagram of an electronic device. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a touch sensor 180K, an ambient light sensor 180L, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. For example, the processor 110 is configured to execute the noise monitoring method in the embodiment of the present application.
Wherein the controller may be a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area can store an operating system and an application program required by at least one function. The storage data area may store data created during use of the electronic device 100.
In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc.
In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave.
In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160, so that electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio signals into analog audio signals for output and also to convert analog audio inputs into digital audio signals. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a hands-free call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it is possible to receive voice by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking near the microphone 170C through the mouth. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to implement a noise reduction function in addition to listening to voice information. In other embodiments, the electronic device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on. For example, the microphone 170C may be used to collect voice information related to embodiments of the present application.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a variety of types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A.
In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation with an intensity less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation with an intensity greater than or equal to the first pressure threshold acts on the icon, an instruction to create a new short message is executed.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance the lens module needs to compensate according to the shake angle, and lets the lens counteract the shake of the electronic device 100 through reverse movement, thereby achieving anti-shake. The gyro sensor 180B may also be used for navigation and motion-sensing gaming scenarios.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Features such as automatic unlocking upon flip opening can then be set according to the detected open or closed state of the holster or flip.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. It can also be used to recognize the attitude of the electronic device, for applications such as landscape/portrait switching and pedometers.
A distance sensor 180F is used for measuring distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may employ an organic light-emitting diode (OLED). In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and passes it to the ISP, which converts it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a standard SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card; the eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
The embodiment of the present application does not particularly limit the specific structure of the execution body of the noise monitoring method, as long as it can run a program recording the noise monitoring method of the embodiment of the present application so as to perform processing according to that method. For example, the execution body of the noise monitoring method provided by the embodiment of the present application may be a functional module in the electronic device capable of calling and executing a program, or a communication apparatus, such as a chip, applied to the electronic device.
Fig. 2 is a front position relationship diagram of a display screen and an ambient light sensor in an electronic device according to an embodiment of the present application.
As shown in fig. 2, the projection of the ambient light sensor on the display screen of the electronic device is located in the upper half of the display screen. When a user holds the electronic device, the ambient light sensor located in the upper half of the electronic device can detect the light intensity and color temperature of the environment on the front side of the electronic device (the side the display screen faces), which are used to adjust the brightness and color temperature of the display screen for a better visual effect. For example, the display screen should not be so bright in a dark environment as to cause glare, nor so dim in a bright environment as to be hard to see.
Fig. 3 is a side view of the display screen and the ambient light sensor in the electronic device. From top to bottom, the display screen of the electronic device includes: a glass cover plate (light-transmitting), a display module, and a protective film, where "top" and "bottom" describe the positional relationship when the display screen of the electronic device is placed facing upward. Because the ambient light sensor needs to collect ambient light from above the display screen, a portion of the display module in the display screen may be cut out and the ambient light sensor placed in that portion; this is equivalent to placing the ambient light sensor below the glass cover plate, in the same layer as the display module. Note that the detection direction of the ambient light sensor coincides with the orientation of the display screen in the electronic device (upward in fig. 3). Obviously, this arrangement of the ambient light sensor sacrifices a portion of the display area. When a high screen-to-body ratio is pursued, this arrangement of the ambient light sensor is not applicable.
As shown in fig. 4, another arrangement of the ambient light sensor is provided in the embodiment of the present application: the ambient light sensor is moved from below the glass cover plate to below the display module. For example, the ambient light sensor is located below the active area (AA) of the OLED display module, the AA area being the area of the display module where image content can be displayed. This arrangement of the ambient light sensor does not sacrifice display area. However, the OLED screen is a self-luminous display screen: when the OLED screen displays an image, a user can see the image from above the display screen, and likewise, the ambient light sensor located below the OLED screen also collects light corresponding to the image displayed on the OLED screen. Therefore, the ambient light collected by the ambient light sensor includes both light emitted by the display screen and the actual ambient light from the outside. To accurately obtain the external real ambient light, the light emitted by the display screen needs to be obtained in addition to the ambient light collected by the ambient light sensor.
As can be understood from fig. 4, since the ambient light sensor is located below the AA area, the AA area in the display module is not sacrificed due to the arrangement of the ambient light sensor. Therefore, the projection of the ambient light sensor on the display screen can be located in any area of the front of the display screen, and is not limited to the following arrangement: the projection of the ambient light sensor on the display screen is located at the top of the front of the display screen.
Regardless of which region below the AA area the ambient light sensor is located in, the projected area of the ambient light sensor on the display screen is much smaller than the area of the display screen itself. Thus, not all of the light emitted by the display screen interferes with the ambient light collected by the ambient light sensor; only the light emitted from the display area directly above the ambient light sensor, and from the display area within a certain range around it, will interfere with the ambient light collected by the ambient light sensor.
As an example, the light sensing area of the ambient light sensor has a light sensing angle, and the ambient light sensor may receive light within the light sensing angle but not light outside the light sensing angle. In fig. 5, the light emitted from point a above the ambient light sensor (within the photosensitive angle) and the light emitted from point B above a certain range around the ambient light sensor (within the photosensitive angle) both interfere with the ambient light collected by the ambient light sensor. While the light emitted from point C (located outside the light sensing angle) farther away from the ambient light sensor in fig. 5 will not interfere with the ambient light collected by the ambient light sensor. For convenience of description, a display area of the display screen that interferes with the ambient light collected by the ambient light sensor may be referred to as a target area (the target area may be referred to as a first area). The location of the target area in the display screen is determined by the specific location of the ambient light sensor under the AA area. As an example, the target area may be a square area centered at a center point of the ambient light sensor with a side length of a certain length (e.g., 80 microns, 90 microns, 100 microns). Of course, the target area may also be an area of other shape obtained by measurement that interferes with the light collected by the ambient light sensor.
As another example, fig. 6 is a schematic front view of an OLED screen of an electronic device provided in an embodiment of the present application, and as shown in fig. 6, the electronic device includes a housing, an OLED screen of the electronic device displays an interface, a corresponding area of the display interface in the display screen is an AA area, and an ambient light sensor is located behind the AA area. The center point of the target area coincides with the center point of the ambient light sensor.
It should be noted that the ambient light sensor is a discrete device; sensors from different manufacturers may differ in external shape. The center point of the ambient light sensor in the embodiment of the present application is the center point of the photosensitive area where the ambient light sensor collects ambient light. In addition, the target area shown in fig. 6 is larger than the projected area of the ambient light sensor on the OLED screen. In practical applications, the target area may also be less than or equal to the projection area of the ambient light sensor on the OLED screen; however, the target area is typically larger than the photosensitive area of the ambient light sensor. As mentioned above, the actual ambient light from the outside equals the ambient light collected by the ambient light sensor minus the light emitted by the display screen, and the light emitted by the display screen has been determined to be the light emitted by the target area. The emitted light of the target area is generated by the display content of the target area, and the interference of the display content with the ambient light collected by the ambient light sensor comes from two parts: the RGB pixel information of the displayed image and the brightness of the displayed image. From the above analysis, the interference with the ambient light collected by the ambient light sensor is: the RGB pixel information of the image displayed in the target area and the luminance information of the target area. As an example, if the pixel value of a pixel is (r, g, b) and the luminance is L, the normalized luminance of the pixel is: L×(r/255)^2.2, L×(g/255)^2.2, L×(b/255)^2.2.
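As an illustration of the above normalization, the following is a minimal C++ sketch that computes the normalized luminance triple of one pixel, assuming 8-bit RGB channels and a display gamma of 2.2 as in the formula; the struct and function names are illustrative and not from the patent.

```cpp
#include <cmath>
#include <cstdint>

// Minimal sketch: per-pixel normalized luminance as in the formula above,
// assuming 8-bit RGB channels and a display gamma of 2.2.
struct NormalizedPixel {
    double r, g, b;  // luminance-weighted, gamma-decoded channel values
};

NormalizedPixel NormalizePixel(uint8_t r, uint8_t g, uint8_t b, double luminance) {
    auto decode = [luminance](uint8_t c) {
        return luminance * std::pow(static_cast<double>(c) / 255.0, 2.2);
    };
    return {decode(r), decode(g), decode(b)};
}
```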
For convenience of description, an image corresponding to the target area may be denoted as a target image, and interference of RGB pixel information and luminance information of the target image on ambient light collected by the ambient light sensor may be denoted as fusion noise. The ambient light collected by the ambient light sensor can be recorded as initial ambient light, and the external real ambient light can be recorded as target ambient light.
From the above description it can be derived: the target ambient light is equal to the initial ambient light minus the fusion noise at each instant in the time period in which the initial ambient light was collected. In the embodiment of the present application, a process of calculating the fusion noise together according to the RGB pixel information and the luminance information is referred to as a noise fusion process.
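Expressed as a hedged formula (the notation is introduced here, not taken from the original): with N(t) denoting the fusion noise at time t and [t_s, t_e] the integration window of the initial ambient light,

```latex
E_{\text{target}} = E_{\text{initial}} - \int_{t_s}^{t_e} N(t)\,\mathrm{d}t
```

Since the fusion noise only changes at image refresh or brightness adjustment moments, the integral reduces in practice to a time-weighted sum over those events, as described later for the integral noise.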
When the display screen is in a display state, the RGB pixel information of the image displayed in the target area may change, and the brightness information of the displayed image may also change. The fusion noise may change whether the RGB pixel information or the luminance information changes; therefore, the fusion noise needs to be recalculated from the changed information (RGB pixel information or luminance information). If the image of the target area is unchanged for a long time, the fusion noise is calculated only when the brightness of the display screen changes. Therefore, in order to reduce the frequency of calculating the fusion noise, the target region may be chosen as a region in which the image displayed on the display screen changes infrequently, for example, the status bar area at the top of the front of the electronic device. The projection of the ambient light sensor on the display screen is located toward the right in the status bar area of the display screen. Of course, the position of the ambient light sensor may also be toward the left or in the middle of the status bar area; the embodiment of the present application does not limit the specific position of the ambient light sensor.
A technical architecture corresponding to the method for obtaining the target ambient light through the initial ambient light and the content displayed on the display screen provided by the embodiment of the present application will be described below by using fig. 7.
As shown in fig. 7, the processor in the electronic device is a multi-core processor, which at least includes: an AP (application processor) processor and an SCP (sensor processor) processor. The AP processor is the application processor in the electronic device, on which the operating system, the user interface and application programs run. The SCP processor is a co-processor that can assist the AP processor in handling events related to images, sensors (e.g., the ambient light sensor), and the like.
Only the AP processor and SCP processor are shown in fig. 7. In practical applications, the multi-core processor may also include other processors. For example, when the electronic device is a mobile phone, the multi-core processor may further include a Baseband (BP) processor that runs mobile phone radio frequency communication control software and is responsible for sending and receiving data.
The AP processor in fig. 7 only shows the content related to the embodiment of the present application, and the implementation of the embodiment of the present application needs to rely on: an Application Layer (Application), a Java Framework Layer (Framework Java), a native Framework Layer (Framework native), a Hardware Abstraction Layer (Hardware Abstraction Layer, HAL), a kernel Layer (kernel), and a Hardware Layer (Hardware).
The SCP processor in fig. 7 may be understood as a sensor control center (sensor hub) which can control the sensors and process data related to the sensors. The implementation of the embodiment of the present application needs to rely on: a cooperative application layer (Hub APK), a cooperative framework layer (Hub FWK), a cooperative driving layer (Hub DRV) and a cooperative hardware layer (Hub hardware).
Various applications exist in the application layer of the AP processor, and application a and application B are shown in fig. 7. Taking application a as an example, after the user starts application a, the display screen will display the interface of application a. Specifically, the application a sends the display parameters (for example, the memory address, the color, and the like of the interface to be displayed) of the interface to be displayed by the application a to the display engine service.
The display engine service in the AP processor sends the received display parameters of the interface to be displayed to the SurfaceFlinger of the native framework layer (Framework native) of the AP processor.
The SurfaceFlinger in the native framework layer (Framework native) of the AP processor is responsible for controlling interface (surface) fusion. As an example, it calculates the overlap region of at least two overlapping interfaces. The interface here may be an interface presented by the status bar, the system bar, the application itself (the interface to be displayed by application A), wallpaper, background, and so on. Therefore, the SurfaceFlinger can obtain the display parameters of the interface to be displayed by application A and the display parameters of the other interfaces.
The hardware abstraction layer of the AP processor is provided with an HWC (Hardware Composer HAL), a module for synthesizing and displaying interfaces in the system that provides hardware support for the SurfaceFlinger service. In step A1, the SurfaceFlinger sends the display parameters (e.g., memory address, color, etc.) of each interface to the HWC through interfaces (e.g., setLayerBuffer, setLayerColor, etc.) for interface fusion. In practical applications, the display parameters may include: the position, size, color, memory address, etc. of the interface of the composite image on the display screen of the electronic device.
Generally, in image synthesis (for example, when an electronic device displays an image, it is necessary to synthesize a status bar, a system bar, an application itself, and a wallpaper background), the HWC obtains a synthesized image according to display parameters of each interface through hardware (for example, a hardware synthesizer) underlying the HWC. The HWC in the hardware abstraction layer of the AP processor sends the underlying hardware-synthesized image to the OLED driver, see step A2.
In practical applications, the HWC module may obtain the synthesized image based on the display parameters sent by the SurfaceFlinger in any manner.
The OLED driver of the kernel layer of the AP processor passes the synthesized image to the display subsystem (DSS) of the hardware layer of the AP processor, see step A3. The display subsystem (DSS) in the hardware layer of the AP processor may perform secondary processing on the synthesized image (e.g., HDR10 processing for enhancing image quality) and send the secondarily processed image for display. In practical applications, the secondary processing may be omitted. Taking the case of no secondary processing as an example, the display subsystem of the AP processor hardware layer sends the synthesized image to the OLED screen for display.
Taking the startup of application A as an example, the synthesized image displayed on the OLED screen is the interface synthesized from the interface to be displayed by application A and the interface corresponding to the status bar.
In this manner, the OLED screen completes one image refresh and display.
In the embodiment of the present application, before the secondarily processed image (or the synthesized image) is sent for display, the display subsystem (DSS) may be controlled to store the whole frame image (or an image larger than the target area within the whole frame image, or the image corresponding to the target area within the whole frame image) in a memory of the kernel layer of the AP processor. Since this process concurrently writes back image frame data, the memory may be recorded as a Concurrent Write-Back (CWB) memory, see step A4.
In the embodiment of the present application, for example, the display subsystem stores the entire frame image in the CWB memory of the AP processor, and after the display subsystem successfully stores the entire frame image in the CWB memory, the display subsystem may send a signal indicating that the storage is successful to the HWC. The whole frame image corresponding to the image stored in the CWB memory by the display subsystem may be recorded as an image to be refreshed (the image to be refreshed may also be understood as an image after the current refresh).
The AP processor may also be configured to allow the HWC to access the CWB memory. The HWC may obtain the target image from the CWB memory after receiving the storage-success signal sent by the display subsystem, see step A5.
It should be noted that, regardless of whether the image of the whole frame image or the image of the partial region in the whole frame image is stored in the CWB memory, the HWC can obtain the target image from the CWB memory. The process of the HWC obtaining the target image from the CWB memory may be denoted as HWC matting from the CWB memory.
The range of the target image can be understood as the size delimited by its length and width; the range of the image to be refreshed is the range of the whole frame image, which can likewise be delimited by its length and width.
As an example, the size of the image to be refreshed is X1 (pixels) × Y1 (pixels), the size of the target image is X2 (pixels) × Y2 (pixels), and the size of the image stored in the CWB memory is X3 (pixels) × Y3 (pixels). X3 satisfies X2 ≤ X3 ≤ X1, and Y3 satisfies Y2 ≤ Y3 ≤ Y1.
Of course, when X3= X1 and Y3= Y1, the image stored in the CWB memory is an entire frame image. When X3= X2 and Y3= Y2, the image stored in the CWB memory is the target image.
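The matting step (step A5) can be pictured with the following hedged C++ sketch, which crops the target region out of the image that the HWC reads from the CWB memory. The tightly packed RGBA8888 buffer layout and all names are assumptions, and the target region is assumed to lie fully inside the stored image (consistent with X2 ≤ X3 and Y2 ≤ Y3 above), so no bounds clamping is done.

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Hedged sketch of "matting": copy the target region (x2 x y2 pixels,
// centered on the ambient light sensor's projection at centerX/centerY)
// out of the image stored in the CWB memory (x3 x y3 pixels).
std::vector<uint8_t> CropTargetImage(const uint8_t* cwb, int x3, int y3,
                                     int centerX, int centerY, int x2, int y2) {
    constexpr int kBpp = 4;  // bytes per RGBA8888 pixel (assumed layout)
    int left = centerX - x2 / 2;
    int top  = centerY - y2 / 2;
    std::vector<uint8_t> target(static_cast<size_t>(x2) * y2 * kBpp);
    for (int row = 0; row < y2; ++row) {
        const uint8_t* src =
            cwb + (static_cast<size_t>(top + row) * x3 + left) * kBpp;
        std::memcpy(target.data() + static_cast<size_t>(row) * x2 * kBpp,
                    src, static_cast<size_t>(x2) * kBpp);
    }
    return target;
}
```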
Continuing to take application a as an example, when application a has a brightness adjustment requirement due to switching of the interface, application a sends the brightness to be adjusted to the display engine service.
And the display engine service in the AP processor sends the brightness to be adjusted to the kernel node in the kernel layer of the AP processor so as to adjust the brightness of the OLED screen by related hardware according to the brightness to be adjusted stored in the kernel node.
In this manner, the OLED screen completes one brightness adjustment.
In the embodiment of the present application, the HWC may be further configured to obtain the brightness to be adjusted from the kernel node, and the brightness to be adjusted may also be recorded as the brightness after the current adjustment, which is specifically referred to in step A5'.
In a specific implementation, the HWC may monitor, based on the uevent mechanism, whether the data stored in the kernel node changes, and after detecting a change, obtain the currently stored data, i.e., the brightness value to be adjusted (since this value is used to adjust the brightness of the display screen, it may also be recorded as the brightness value of the display screen), from the kernel node. After obtaining the target image or the brightness to be adjusted, the HWC may send it to the noise algorithm library of the hardware abstraction layer of the AP processor, see step A6. Each time the noise algorithm library obtains a target image, it calculates the fusion noise at the refresh time of that target image; each time it obtains a brightness, it calculates the fusion noise at the brightness adjustment time. The noise algorithm library stores the calculated fusion noise in its noise memory.
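The monitoring step can be pictured with the following hedged C++ sketch, which watches a brightness kernel node and reads the new value on each change. The node path is hypothetical, and sysfs poll() notification stands in here for the uevent-based notification named above.

```cpp
#include <fcntl.h>
#include <poll.h>
#include <unistd.h>
#include <cstdio>
#include <cstdlib>

// Hedged sketch: watch the kernel node holding the brightness to be
// adjusted. The node path is an assumption; real devices differ.
int WatchBrightnessNode() {
    const char* kNode = "/sys/class/leds/lcd-backlight/brightness";  // assumed path
    int fd = open(kNode, O_RDONLY);
    if (fd < 0) return -1;
    char buf[16];
    (void)pread(fd, buf, sizeof(buf) - 1, 0);  // initial read arms the notification
    struct pollfd pfd;
    pfd.fd = fd;
    pfd.events = POLLPRI | POLLERR;  // sysfs signals attribute changes via POLLPRI
    while (poll(&pfd, 1, -1) > 0) {
        ssize_t n = pread(fd, buf, sizeof(buf) - 1, 0);
        if (n <= 0) break;
        buf[n] = '\0';
        int brightness = atoi(buf);  // value to hand to the noise algorithm library
        printf("brightness to be adjusted: %d\n", brightness);
    }
    close(fd);
    return 0;
}
```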
In practical applications, after the HWC obtains the target image, the HWC may store the target image, and the HWC may send the storage address of the target image to the noise algorithm library, and the noise algorithm library may buffer the target image of a frame at the latest moment in a manner of recording the address. After the HWC obtains the brightness to be adjusted, the HWC may send the brightness to be adjusted to a noise algorithm library, which may buffer a brightness at the latest moment. For convenience of description, the subsequent embodiments of the present application are described in terms of the HWC sending the target image to the noise algorithm library, and in practical applications, the HWC may obtain the target image and then store the target image, and send the storage address of the target image to the noise algorithm library.
As an example, after receiving the storage address of the first frame target image, the noise algorithm library buffers the storage address of the first frame target image. And each time a new storage address of the target image is received, the new storage address of the target image is used as the latest storage address of the cached target image. Correspondingly, the noise algorithm library buffers the first brightness after receiving the first brightness, and the new brightness is taken as the latest brightness buffered every time a new brightness is received. In the embodiment of the application, the noise algorithm library caches the acquired target image and the acquired brightness value in the data storage library. The target image and the luminance value stored in the data store may be recorded as screen data, i.e. the screen data stored in the data store includes: a target image and a luminance value.
In addition, in order to describe the transfer relationship of parameters such as the target image and the brightness to be adjusted, the embodiment of the present application describes the HWC as sending these parameters to the noise algorithm library. In practice, the relationship between the HWC and the noise algorithm library is that the HWC calls the noise algorithm library: when the HWC calls the noise algorithm library, it inputs parameters such as the target image (the memory address of the target image) and the brightness to be adjusted as arguments of a calculation model in the noise algorithm library. Other parameters are not exemplified here.
Because brightness adjustment and image refreshing are two completely independent processes, the image may be refreshed at a certain time, and the brightness remains unchanged, then the target image and the current brightness corresponding to the refreshed image are adopted when the fusion noise at the time is calculated (the brightness value stored in the noise algorithm library and stored latest before the time represented by the timestamp of the target image). For convenience of description, the fusion noise at the image refresh time calculated due to the image refresh may be regarded as the image noise at the image refresh time. Similarly, if the image is not refreshed at a certain time and the brightness is adjusted, the adjusted brightness and the current target image (the target image stored in the noise algorithm library and newly stored before the time indicated by the timestamp of the brightness value) are used for calculating the fusion noise at the certain time. For convenience of description, the fusion noise at the luminance adjustment timing calculated due to the luminance adjustment may be written as the backlight noise at the luminance adjustment timing.
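A hedged C++ sketch of this pairing rule follows: each image refresh is fused with the most recently cached brightness, and each brightness adjustment with the most recently cached target image. The types and the FuseNoise stub are placeholders for the patent's noise fusion model, not its actual implementation.

```cpp
#include <cstdint>

// Placeholder types; the real target image carries RGB pixel data.
struct TargetImage { int64_t timestampMs = 0; /* RGB pixel data elided */ };
struct NoiseSample { int64_t timestampMs = 0; double noise = 0.0; };

// Stub for the fusion model: the real model combines the target image's
// RGB pixels with the brightness (see the gamma formula earlier).
double FuseNoise(const TargetImage& image, int brightness) {
    (void)image;
    return static_cast<double>(brightness);  // placeholder value only
}

class NoiseState {
 public:
  // Image refreshed at image.timestampMs: fuse with the latest brightness.
  NoiseSample OnImageRefresh(const TargetImage& image) {
    latestImage_ = image;
    return {image.timestampMs, FuseNoise(image, latestBrightness_)};
  }
  // Brightness adjusted at timestampMs: fuse with the latest target image.
  NoiseSample OnBrightnessChange(int64_t timestampMs, int brightness) {
    latestBrightness_ = brightness;
    return {timestampMs, FuseNoise(latestImage_, brightness)};
  }
 private:
  TargetImage latestImage_{};
  int latestBrightness_ = 0;
};
```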
The target image and the brightness sent by the HWC to the noise algorithm library both carry timestamps; accordingly, the image noise and the backlight noise calculated by the noise algorithm library also carry timestamps. The timestamp of the image noise is the same as the timestamp of the target image, and the timestamp of the backlight noise is the same as the timestamp of the brightness to be adjusted. Strictly speaking, the timestamp of the image noise should be the image refresh time; in practical applications, another time node close to the image refresh time may be used instead, for example, the start time (or the end time, or any time between them) at which the HWC performs matting to obtain the target image from the CWB memory. Likewise, the timestamp of the backlight noise should strictly be the backlight adjustment time; in practice, another time node close to it may be used, for example, the start time (or the end time, or any time between them) at which the HWC obtains the brightness to be adjusted from the kernel node. The timestamps of the image noise and the backlight noise facilitate subsequently denoising, over the corresponding time span, the initial ambient light collected by the ambient light sensor to obtain the target ambient light. The noise algorithm library stores the image noise and the backlight noise in the noise memory, storing the timestamp of the image noise together with the image noise and the timestamp of the backlight noise together with the backlight noise.
An Ambient Light Sensor (ALS) in the co-hardware layer of the SCP processor collects initial ambient light at a certain collection period after start-up (typically, after the electronic device is powered on, the ambient light sensor is started up). The ambient light sensor of the SCP processor transmits the initial ambient light information to the ambient light sensor driver (ALS DRV) of the co-driver layer (Hub DRV) layer of the SCP processor, see step E2.
The initial ambient light information transmitted by the SCP processor to the AP processor includes a first value, a first time and a second time, where the first value can be understood as a raw value of the initial ambient light, the first time is an integration start time at which the ambient light sensor acquires the first value, and the second time is an integration end time at which the ambient light sensor acquires the first value.
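Stated as a data structure (an illustrative sketch; the field names are assumptions, not from the patent):

```cpp
#include <cstdint>

// Illustrative sketch of the initial ambient light information reported by
// the SCP processor to the AP processor.
struct InitialAmbientLight {
    uint32_t firstValue;          // raw value of the initial ambient light
    int64_t  integrationStartMs;  // first time: integration start
    int64_t  integrationEndMs;    // second time: integration end
};
```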
And in a cooperative driving (Hub DRV) layer of an SCP processor, an ambient light sensor driving (ALS DRV) carries out preprocessing on initial ambient light information to obtain raw values on four channels of the RGBC. The co-driver layer of the SCP processor transmits the raw values on the RGBC four channels to the ambient light sensor application of the co-application layer of the SCP processor, see step E3.
The ambient light sensor application of the co-application layer of the SCP processor sends the raw values on the four RGBC channels and other related data (e.g., the start time and end time of each collection of initial ambient light by the ambient light sensor) to the HWC of the AP processor through a first inter-core communication (communication between the ambient light sensor application of the SCP processor and the HWC of the AP processor), see step E4.
After the HWC in the AP processor obtains the initial ambient light data reported by the SCP processor, the HWC in the AP processor may send the initial ambient light data to the noise algorithm library. See step A6.
As described above, the noise algorithm library can calculate the image noise at each image refresh time and the backlight noise at each brightness adjustment time, and store the calculated image noise and backlight noise in the noise memory in the noise algorithm library. After obtaining the acquisition start time and acquisition end time of the initial ambient light, the noise algorithm library can also obtain, from the image noise and backlight noise stored in the noise memory, the integral noise between the acquisition start time and the acquisition end time. The noise algorithm library then deducts this integral noise from the initial ambient light to obtain the target ambient light.
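A hedged C++ sketch of this step follows: fusion-noise samples are treated as piecewise constant from each timestamp until the next, integrated over the acquisition window, and deducted from the initial ambient light. The piecewise-constant treatment and all names are assumptions consistent with, but not quoted from, the description.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

struct NoiseSample { int64_t timestampMs = 0; double noise = 0.0; };

// Time-averaged noise over [startMs, endMs]. Assumptions: endMs > startMs,
// samples are sorted by timestamp, include the last event at or before
// startMs, and each sample's noise holds until the next sample.
double IntegralNoise(const std::vector<NoiseSample>& samples,
                     int64_t startMs, int64_t endMs) {
    double weighted = 0.0;
    for (size_t i = 0; i < samples.size(); ++i) {
        int64_t from = std::max(samples[i].timestampMs, startMs);
        int64_t to = (i + 1 < samples.size())
                         ? std::min(samples[i + 1].timestampMs, endMs)
                         : endMs;
        if (to > from) weighted += samples[i].noise * static_cast<double>(to - from);
    }
    return weighted / static_cast<double>(endMs - startMs);
}

// Third model: deduct the integral noise from the initial ambient light.
double TargetAmbientLight(double initialRaw, double integralNoise) {
    return initialRaw - integralNoise;
}
```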
As can be understood from the above description, the noise algorithm library includes a plurality of calculation models: a first algorithm model for calculating the fusion noise from the target image and the brightness; a second algorithm model for obtaining, from the fusion noise at each moment, the integral noise between the acquisition start time and the acquisition end time of the initial ambient light; and a third algorithm model for obtaining the target ambient light from the initial ambient light and the integral noise. In practical applications, the noise algorithm library may further include other calculation models. For example, if the raw values on the four channels of the initial ambient light are filtered in the process of obtaining the target ambient light based on the target image, the brightness and the initial ambient light, there is also a model for filtering those raw values. Other models are not exemplified in the embodiment of the present application.
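For the filtering model, the patent only says that the raw values on the four channels may be filtered, without specifying the filter; the following C++ sketch uses a simple moving average purely as a stand-in.

```cpp
#include <array>
#include <cstddef>
#include <deque>

// Stand-in filter for the RGBC raw values; the patent does not specify
// the actual filter, so a moving average over a small window is assumed.
class RgbcFilter {
 public:
  explicit RgbcFilter(size_t window) : window_(window) {}
  std::array<double, 4> Filter(const std::array<double, 4>& rgbc) {
    history_.push_back(rgbc);
    if (history_.size() > window_) history_.pop_front();
    std::array<double, 4> avg{};
    for (const auto& sample : history_)
      for (int c = 0; c < 4; ++c) avg[c] += sample[c] / history_.size();
    return avg;
  }
 private:
  size_t window_;
  std::deque<std::array<double, 4>> history_;
};
```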
The inputs to the noise algorithm library include: the target image and brightness acquired by the HWC at various times, and the initial ambient light related data acquired by the HWC from the SCP processor. The output of the noise algorithm library is the raw value of the target ambient light, which can be recorded as a second value. In the embodiment of the present application, the processes of sending the target image, the brightness and the initial ambient light from the HWC to the noise algorithm library are all denoted as step A6.
The noise computation library also needs to return the target data to the HWC after obtaining the target ambient light, which is denoted as step A7. In practical applications, the output of the noise algorithm library is raw values on four channels of the target ambient light.
The HWC in the AP processor sends the raw values on the four channels of the target ambient light returned by the noise algorithm library to the ambient light sensor application in the cooperative application layer of the SCP processor through first inter-core communication, see step A8.
After the ambient light sensor application of the co-application layer of the SCP processor obtains the raw values on the four channels of the target ambient light, it stores them in the ambient light memory of the co-driver layer. See step E5.
The co-driver layer of the SCP processor is provided with a calculation module that obtains the raw values on the four channels of the target ambient light from the memory, see step E6. Each time the integration of the ambient light sensor ends, an integration interrupt signal is generated; the ambient light sensor sends this signal to the ambient light sensor driver, and the driver calls the calculation module, triggering it to obtain the raw values of the target ambient light on the four channels from the memory.
Since the ambient light sensor driver triggers the calculation module to acquire the raw value of the target ambient light right after the integration ends, the raw value acquired at this moment is that of the previous integration period.
Taking the embodiment shown in fig. 8 as an example, after the integration ends at time t1, the ambient light sensor obtains the initial ambient light from time t0 to time t1. The SCP processor sends the initial ambient light from t0 to t1 to the AP processor, and the AP processor calculates the raw value of the target ambient light from t0 to t1. The AP processor sends the raw value of the target ambient light from t0 to t1 to the SCP processor, and the SCP processor stores it into the memory of the SCP processor.
After the integration ends at time t3, the ambient light sensor obtains the initial ambient light from time t2 to time t3, and the SCP processor sends it to the AP processor. Each time the integration of the ambient light sensor ends, an integration interrupt signal is generated; the ambient light sensor sends the integration interrupt signal to the ambient light sensor driver, and the driver calls the calculation module, triggering it to obtain the currently stored raw value of the target ambient light from t0 to t1 from the memory. Since this happens after time t3, the calculation module calculates, after time t3, the lux value of the target ambient light from the obtained raw value of the target ambient light from t0 to t1. That is, the lux value of the target ambient light that the SCP processor calculates during the T2 period is taken as the lux value of the real ambient light of the T1 period.
As mentioned above, the ambient light sensor in the SCP processor generates an integration interrupt signal after the integration ends (time t3) and gives it to the ambient light sensor driver, and after time t3 the initial ambient light of the T2 period is sent to the AP processor. After the AP processor calculates the target ambient light, it sends the target ambient light to the SCP processor, and the SCP processor stores the target ambient light of the T2 period in the memory. If the SCP processor were to calculate the lux value using the raw value of the target ambient light of the T2 period, it would have to wait, from the moment the ambient light sensor driver receives the integration interrupt signal, until the AP processor has transmitted the target ambient light to the memory of the SCP processor and the ambient light sensor driver has called the calculation module to obtain it. This waiting time includes at least the time for the SCP processor to transmit the initial ambient light to the AP processor, the time for the AP processor to calculate the target ambient light based on the initial ambient light and other related data, and the time for the AP processor to transmit the target ambient light to the memory in the SCP processor; this is relatively long and not fixed. Therefore, the ambient light sensor driver in the SCP processor can be set to call the calculation module after receiving the integration interrupt signal of the next acquisition period, to fetch from the memory the raw value of the target ambient light of the previous period, and calculate the lux value from that raw value. The lux value of the target ambient light can be recorded as a third value; the third value and the second value are the lux value and the raw value of the same target ambient light.
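A hedged C++ sketch of this one-period-delayed handling on the SCP side follows; the interrupt hook, the memory variable and the raw-to-lux conversion stub are illustrative assumptions, not the patent's implementation.

```cpp
#include <cstdint>
#include <optional>

// Illustrative stand-in for the ambient light memory of the co-driver
// layer: raw value of the most recently returned target ambient light.
std::optional<double> g_targetAmbientRaw;

// Raw-to-lux conversion stub; the real coefficient is device-specific.
double RawToLux(double raw) { return raw * 0.25; }  // placeholder coefficient

// Called by the ambient light sensor driver on each integration interrupt.
// It computes lux from the *previous* period's target ambient light, so it
// never waits on the AP processor's round trip described above.
void OnIntegrationInterrupt() {
    if (g_targetAmbientRaw) {
        double lux = RawToLux(*g_targetAmbientRaw);  // the third value
        (void)lux;  // ... report to the co-application layer (steps E7/E8) ...
    }
    // The initial ambient light of the period just finished is reported to
    // the AP processor; its target raw value is expected to arrive in
    // g_targetAmbientRaw before the next interrupt.
}
```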
Taking the acquisition periods shown in fig. 8 as an example: the raw value of the initial ambient light collected in acquisition period T1 is the first value; the raw value of the target ambient light corresponding to acquisition period T1, obtained from that first value, is the second value; and the lux value of the target ambient light corresponding to acquisition period T1, obtained from the second value, is the third value. The raw value of the initial ambient light collected in acquisition period T2 may be recorded as a fourth value. The fourth value is the initial ambient light collected in the acquisition period following the acquisition period corresponding to the first value (or the second value, or the third value).
The calculation module in the co-driver layer of the SCP processor obtains the lux value of the target ambient light from the raw values on the four channels of the target ambient light, and sends the calculated lux value to the ambient light sensor application of the co-application layer through the interface of the co-framework layer, see steps E7 and E8.
The ambient light sensor application of the co-application layer in the SCP processor transmits the lux value of the target ambient light to the light service of the native framework layer in the AP processor through a second inter-core communication (communication from the SCP processor to the AP processor for the light service), see step E9.
A light service (light service) may send the lux value of the target ambient light to the display engine service. The display engine service may send the lux value of the target ambient light to the upper layer to facilitate an application in the application layer to determine whether to adjust the brightness. The display engine service can also send the lux value of the target ambient light to the kernel node so as to enable related hardware to adjust the brightness of the display screen according to the lux value of the target ambient light stored by the kernel node.
After describing the technical architecture on which the method of obtaining the target ambient light depends, the process of obtaining the target ambient light from the target image, the brightness, and the initial ambient light collected by the ambient light sensor will be described from the perspective of the collection period of the ambient light sensor.
As can be understood from the above examples, the target image and the brightness to be adjusted are both obtained by the HWC, so there is a sequential order in the processes by which the HWC obtains them. After the HWC acquires the target image or the brightness to be adjusted, it sends the acquired item to the noise algorithm library, so there is also a sequential order in this sending process. Correspondingly, the times at which the noise algorithm library receives the target image and the brightness to be adjusted have a sequential order as well. Even so, the timestamps of the target image and the brightness to be adjusted may be the same, because the HWC may acquire both within the same unit of time measurement. As an example, within the same millisecond (the 5th millisecond), the HWC first acquires the brightness to be adjusted and then acquires the target image. Although there is a precedence in the HWC's execution, the timestamps of the target image and the brightness to be adjusted are both the 5th millisecond.
Referring to fig. 8, the ambient light sensor collects ambient light periodically: from t0 to t2 (acquisition period T1), from t2 to t4 (acquisition period T2), and from t4 to t6 (acquisition period T3), each being one acquisition period. In acquisition period T1, the real acquisition time of the ambient light sensor is t0 to t1; from t1 to t2 the ambient light sensor may be in a sleep state. The embodiment of the present application is described taking the example that the acquisition period of the ambient light is fixed (i.e., T1, T2 and T3 have the same value) and the duration of the integration period is fixed.
As an example, 350 ms (t2 − t0) may be taken as one acquisition period. If the actual acquisition time of the ambient light sensor in one acquisition period is 50 ms (t1 − t0), the ambient light sensor is in a dormant state for 300 ms (t2 − t1) of each acquisition period. The above values of 350 ms, 50 ms and 300 ms are examples only and are not limiting.
For ease of description, the time period during which the ambient light sensor actually collects (e.g., t0 to t1) may be recorded as an integration period, and the time period during which the ambient light sensor does not collect (e.g., t1 to t2) as a non-integration period.
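A hedged C++ sketch of this duty-cycled loop, using the example timings above (350 ms period, 50 ms integration); the sensor calls are left as comments because the patent does not specify the sensor interface.

```cpp
#include <chrono>
#include <thread>

// Hedged sketch of the acquisition duty cycle with the example timings
// above. StartIntegration/ReadRawValue stand in for the real sensor
// interface, which is not specified in the text.
void AmbientLightLoop() {
    using namespace std::chrono;
    constexpr milliseconds kPeriod{350};      // t2 - t0
    constexpr milliseconds kIntegration{50};  // t1 - t0
    while (true) {
        // StartIntegration();                // t0: integration begins
        std::this_thread::sleep_for(kIntegration);
        // uint32_t raw = ReadRawValue();     // t1: integration interrupt
        std::this_thread::sleep_for(kPeriod - kIntegration);  // sleep until t2
    }
}
```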
The image displayed on the display screen of the electronic device is refreshed at a certain frequency. Taking 60 Hz as an example, the display screen of the electronic device is refreshed 60 times per second, i.e., the image is refreshed every 16.7 ms. When the display screen of the electronic device displays images, image refreshes occur during the acquisition period of the ambient light sensor. When the image displayed on the display screen is refreshed, the AP processor performs steps A1 to A6 (sending the target image) in the technical architecture shown in fig. 7. Starting from time t0, the HWC in the AP processor controls the CWB to write back all the time; that is, the above steps are repeated as long as there is an image refresh.
Note that a refresh rate of 60 Hz is taken as an example in this embodiment; in practice, the refresh rate may be 120 Hz or another rate. In the embodiment of the present application, the above steps A1 to A6 (sending the target image) need to be repeated every time one frame is refreshed; in practical applications, they may instead be repeated every other frame (or every two frames, etc.).
The brightness adjustment does not have a fixed periodicity, so the brightness adjustment may also occur during the acquisition period of the ambient light sensor. When the brightness is adjusted, the HWC also performs steps A5' to A6 (sending the brightness to be adjusted) in the technical architecture shown in fig. 7.
After each integration of the ambient light sensor ends (i.e., after t1, after t3, after t5, and so on), the SCP processor reports the data of the initial ambient light collected by the current integration process (for example, the raw values on the four channels of the initial ambient light and the integration start time and integration end time of the current integration process) to the HWC of the AP processor; the HWC of the AP processor sends the relevant data of the initial ambient light to the noise algorithm library, and the target ambient light is obtained through calculation by the noise algorithm library.
Referring to fig. 9, taking one acquisition period as an example, t01 (the same as t0), t03, t04 and t11 are all image refresh times, and t02 and t12 are brightness adjustment times. Thus, the AP processor can calculate in real time the image noise at t01, the backlight noise at t02, the image noise at t03, the image noise at t04, the image noise at t11 and the backlight noise at t12. At the end of this integration (time t1), the noise memory of the AP processor stores: the image noise at t01, the backlight noise at t02, the image noise at t03 and the image noise at t04.
At the end of this integration (time t1), the ambient light sensor obtains the initial ambient light of this integration and the integration time period. The SCP processor reports the data of the initial ambient light to the AP processor, and the noise calculation module in the AP processor obtains, from the noise memory according to the start time and end time of the current integration time period, the image noise at t01, the backlight noise at t02, the image noise at t03 and the image noise at t04. The noise calculation library then calculates the target ambient light from the initial ambient light collected in the integration time period and the image noise and backlight noise affecting that time period.
During the non-integration period (t1 to t2), since the HWC keeps controlling the CWB to write back, the HWC also performs matting on the image refreshed at t11 to obtain a target image, and the noise algorithm library also calculates the image noise at t11. The brightness changes at t12 in the non-integration period, so the noise algorithm library also calculates the backlight noise at t12. However, when the target ambient light is calculated, the required fusion noise is the fusion noise that interferes with the initial ambient light obtained in the current integration period; therefore, the target ambient light of the current integration time period can be obtained without the image noise at t11 and the backlight noise at t12. In practical applications, after the noise algorithm library calculates the image noise at t11 and the backlight noise at t12, it still needs to store the image noise at t11 and the backlight noise at t12 in the noise memory.
The above examples describe the process of acquiring the target ambient light from the perspective of the technical architecture in fig. 7 and from the perspective of the acquisition period of the ambient light sensor in fig. 9, respectively. The time sequence diagram for acquiring the target ambient light provided by the embodiment shown in fig. 10 will be described below with reference to the technical architecture shown in fig. 7 and one acquisition cycle of the ambient light sensor shown in fig. 9.
It can be understood from the above description that the process in which an image refresh triggers the AP processor to calculate image noise, the process in which a brightness adjustment triggers the AP processor to calculate backlight noise, and the process in which the SCP processor controls the underlying ambient light sensor hardware to collect the initial ambient light are performed independently, with no fixed chronological order among them. The noise calculation library of the AP processor processes the target image, the brightness and the initial ambient light obtained in these three independent processes to obtain the target ambient light.
The same step numbers in the embodiment shown in fig. 10 and in the technical architecture shown in fig. 7 indicate that the same steps are performed. To avoid repetition, contents detailed in the embodiment shown in fig. 7 are only briefly described in the embodiment shown in fig. 10.
In connection with fig. 9, the image is refreshed starting from time t0. At the same time, the ambient light sensor enters an integration period and begins collecting the initial ambient light.

Accordingly, in fig. 10, step E1: the ambient light sensor in the co-hardware layer of the SCP processor enters an integration period and collects the initial ambient light starting from time t0 (t01).

Step A1: the image is refreshed at time t0 (t01), and the SurfaceFlinger in the native framework layer of the AP processor sends the display parameters of the interfaces to the HWC in the hardware abstraction layer of the AP processor. The HWC may send the received display parameters of each layer interface to the hardware at the bottom of the HWC, and this hardware synthesizes the image of the layer interfaces according to the display parameters of each layer interface. The hardware underlying the HWC returns the synthesized image to the HWC.

In step A2, the HWC in the hardware abstraction layer of the AP processor sends the synthesized image to the OLED driver in the kernel layer of the AP processor. In step A3, the OLED driver in the kernel layer of the AP processor sends the synthesized image to the display subsystem of the hardware layer of the AP processor.

In step A4, the display subsystem in the hardware layer of the AP processor stores the pre-display image in the CWB memory in the kernel layer of the AP processor.
In the embodiment of the present application, the HWC waits for a successful store signal from the display subsystem after sending the synthesized image to the OLED driver.
The display subsystem sends a signal to the HWC indicating that the pre-display image has been successfully stored in the CWB memory. After receiving this storage-success signal from the display subsystem, the HWC performs the matting operation on the pre-display image stored in the CWB memory in the kernel layer to obtain the target image.

In step A5, the HWC in the hardware abstraction layer of the AP processor performs matting on the pre-display image stored in the CWB memory in the kernel layer to obtain the target image.
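As an illustrative sketch only, the matting step can be pictured as cropping the pixel region above the photosensitive area out of the pre-display image. The function name, the RGBA8888-style layout and the rectangle parameters below are assumptions; the actual matting region and pixel format are device-specific.

```cpp
#include <cstdint>
#include <vector>

// Crop the rectangle covering the ambient light sensor's photosensitive
// area (plus a surrounding margin) out of the pre-display image held in
// the CWB memory. One uint32_t per pixel is assumed (e.g. RGBA8888).
std::vector<uint32_t> MatTargetImage(const uint32_t* cwbImage,
                                     int stride,            // pixels per row
                                     int left, int top,     // region origin
                                     int width, int height) // region size
{
    std::vector<uint32_t> target;
    target.reserve(static_cast<size_t>(width) * height);
    for (int y = top; y < top + height; ++y)
        for (int x = left; x < left + width; ++x)
            target.push_back(cwbImage[y * stride + x]);
    return target;
}
```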
Step A6: after obtaining the target image, the HWC in the hardware abstraction layer of the AP processor sends the target image to the noise algorithm library of the same layer. After receiving the target image, the noise algorithm library calculates the image noise at t01 according to the target image and the cached current brightness information. During the execution of steps A1 to A6, the ambient light sensor in the co-hardware layer of the SCP processor remains in the integration process of one acquisition period.
In conjunction with fig. 9, at t02 the ambient light sensor is still in the integration period, collecting the initial ambient light. At t02, the brightness of the display screen changes, triggering the execution of step B1 in fig. 10.

In fig. 10, step B1 (step A5' in the architecture shown in fig. 7): the HWC of the hardware abstraction layer of the AP processor obtains the brightness information at t02 from the kernel node in the kernel layer of the AP processor.

Step B2 (step A6): the HWC of the hardware abstraction layer of the AP processor sends the brightness information at t02 to the noise algorithm library, and the noise algorithm library calculates the backlight noise at t02 according to the brightness information at t02 and the cached currently displayed target image.

During the execution of steps B1 to B2, the ambient light sensor in the co-hardware layer of the SCP processor remains in the integration process of one acquisition period.

After step B2, the noise memory of the noise algorithm library stores the image noise at t01 and the backlight noise at t02.
In conjunction with fig. 9, at t03 the ambient light sensor is still in the integration period, collecting the initial ambient light. At t03, the image is refreshed once.

In fig. 10, since the image is refreshed, steps C1 to C6 are then performed; for steps C1 to C6, refer to the descriptions of A1 to A6, which are not repeated here.

During the execution of steps C1 to C6, the ambient light sensor in the co-hardware layer of the SCP processor is still in the integration process of one acquisition period.

After step C6, the noise memory of the noise algorithm library stores the image noise at t01, the backlight noise at t02 and the image noise at t03.
Referring to fig. 9, at t04 the ambient light sensor is still in the integration period, collecting the initial ambient light. At t04, the image is refreshed once.

In fig. 10, since the image is refreshed, steps D1 to D6 are then performed; for steps D1 to D6, refer to the descriptions of A1 to A6, which are not repeated here.

During the execution of steps D1 to D6, the ambient light sensor in the co-hardware layer of the SCP processor is still in the integration process of one acquisition period.

After step D6, the noise memory of the noise algorithm library stores the image noise at t01, the backlight noise at t02, the image noise at t03 and the image noise at t04.
In conjunction with fig. 9, at t1 the current integration of the ambient light sensor ends. When the integration ends (time t1), the ambient light sensor obtains the initial ambient light. In fig. 10, the SCP processor then starts to execute step E2, step E3 and step E4, transmitting the relevant data of the initial ambient light (the raw values on the RGBC four channels, the integration start time and the integration end time) to the HWC of the hardware abstraction layer of the AP processor.
In conjunction with fig. 9, during the non-integration period, the image may also be refreshed (e.g., the image refresh at t11) and the brightness may also change (e.g., the brightness change at t12). Therefore, in the non-integration period, steps F1 to F6 still exist in fig. 10 (steps F1 to F5 are omitted in fig. 10; refer to steps A1 to A5), so that the image noise at t11 is stored in the noise memory of the noise algorithm library. In the non-integration period, steps G1 to G2 still exist (step G1 is omitted in fig. 10; refer to step B1), so that the backlight noise at t12 is stored in the noise memory of the noise algorithm library.
Step A6': the HWC in the hardware abstraction layer of the AP processor sends the initial ambient light data to the noise algorithm library. The noise algorithm library calculates the target ambient light according to the data of the initial ambient light and the image noise and backlight noise that interfere with the initial ambient light.
As can be understood from fig. 10, the integration start time and integration end time of the ambient light sensor are controlled by the ambient light sensor's own clock; the process in which the AP processor calculates image noise is driven by image refreshes; the process in which the AP processor calculates backlight noise is driven by the timing of backlight adjustments. Thus, the execution of step A1 (or steps C1, D1, F1) is triggered by an image refresh, and the execution of step B1 (or step G1) is triggered by a brightness adjustment. The integration start time and integration end time of the ambient light sensor follow entirely the preset acquisition period and per-integration duration. The execution of step E2 is therefore triggered by the event that the integration of the ambient light sensor ends.
From the perspective of triggering events, these three processes are completely independent. However, the results obtained by the three processes (image noise, backlight noise and initial ambient light) are correlated by the denoising process after the integration period of the ambient light sensor ends. The initial ambient light fused in the denoising process is the initial ambient light acquired by the ambient light sensor in the current acquisition period, and the image noise and backlight noise removed in the denoising process are the image noise and backlight noise that can interfere with the initial ambient light acquired in the current acquisition period.
By analyzing the structure of the under-screen ambient light sensor, the embodiment of the application finds that the factors disturbing the ambient light collected by the ambient light sensor include the display content of the display area directly above the photosensitive area of the ambient light sensor and of the display area directly above a certain region around the photosensitive area. The display content consists of two parts: the RGB pixel information and the brightness information of the displayed image. Therefore, the noise calculation library in the embodiment of the present application fuses the RGB pixel information and the brightness information of the target image to obtain the fusion noise, then obtains the integral noise of the integration time period of the initial ambient light from the fusion noise, and obtains the target ambient light by removing from the initial ambient light the integral noise that interferes with it during the integration period. Because the interfering part is removed, accurate target ambient light can be obtained, and the method has strong universality.
In addition, since the AP processor of the electronic device can obtain the target image and the brightness information, the AP processor accordingly obtains the image noise and the backlight noise, while the SCP processor obtains the initial ambient light. Thus, the SCP processor can send the initial ambient light to the AP processor, and the AP processor processes the initial ambient light and the fusion noise to obtain the target ambient light. This avoids the problem that the AP processor frequently sends the target image (or image noise) and the brightness information (or backlight noise) to the SCP processor, making inter-core communication too frequent and consuming more power.

Furthermore, the DSS in the AP processor can store the pre-display image (the image to be displayed in this refresh on the display screen) in the CWB memory, and the HWC in the AP processor extracts the target image from the pre-display image stored in the CWB memory so as to calculate the fusion noise. The fusion noise obtained in this way is accurate, and the power consumption is low.

It should be noted that when the display screen displays an image, the brightness of the display screen needs to be adjusted according to the target ambient light; when the display screen does not display any image, it is not necessary to adjust the brightness of the display screen according to the target ambient light. Therefore, the AP processor also needs to monitor screen-on and screen-off events of the display screen. When the screen is on, the ambient light detection method provided by the embodiment of the application is executed. When the screen is off, the AP processor may not perform steps A4 to A6; similarly, the SCP processor may control the ambient light sensor to stop collecting the initial ambient light, and the SCP processor may not perform steps E2 to E5.

To provide a clearer understanding of the execution inside the AP processor, a timing diagram between the modules inside the AP processor will be described below, taking as an example the process of obtaining the image noise at t01 and the backlight noise at t02 in the embodiment shown in fig. 10.
In the embodiment shown in fig. 11, when refreshing an image, the respective modules in the AP processor perform the following steps:
In step 1100, after the display engine service obtains the display parameters of the interface to be displayed from the application in the application layer, the display engine service sends the display parameters of the interface to be displayed to the SurfaceFlinger.

In step 1101, after the SurfaceFlinger obtains the display parameters of the interface to be displayed of application A from the display engine service, it sends the display parameters (e.g., memory address, color, etc.) of each interface (the interface to be displayed of application A, the status bar interface, etc.) to the HWC through interfaces such as setLayerBuffer and setLayerColor.
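For ease of understanding only, the following C++ sketch illustrates what such per-layer calls might look like. It is modeled loosely on AOSP HWC2-style interfaces; the class layout and members are assumptions of this description, not the actual interface of the device.

```cpp
#include <cstdint>

// Illustrative per-layer interface between SurfaceFlinger and the HWC.
class HwcLayer {
public:
    // setLayerBuffer: passes the memory address of one interface's pixels.
    void setLayerBuffer(const void* buffer) { buffer_ = buffer; }
    // setLayerColor: passes a solid color for one interface.
    void setLayerColor(uint8_t r, uint8_t g, uint8_t b, uint8_t a) {
        color_ = {r, g, b, a};
    }
private:
    const void* buffer_ = nullptr;
    struct Color { uint8_t r, g, b, a; } color_{};
};
```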
In step 1102, after receiving the display parameters of each interface, the HWC obtains the synthesized image through the hardware at the bottom of the HWC according to the display parameters of the interfaces.

In step 1103, after the HWC obtains the image synthesized by the underlying hardware, it sends the synthesized image to the OLED driver.

In step 1104, after receiving the synthesized image sent by the HWC, the OLED driver sends the synthesized image to the display subsystem.

In step 1105, after receiving the synthesized image, the display subsystem performs secondary processing on the synthesized image to obtain the pre-display image.

In step 1106, the display subsystem stores the pre-display image in the CWB memory.

It should be noted that, since the OLED screen needs to refresh the image, the display subsystem also needs to send the pre-display image to the display screen for display.

In the embodiment of the application, the step in which the display subsystem sends the pre-display image to the display screen and the step in which the display subsystem stores the pre-display image in the CWB memory are two independent steps without a strict order.
In step 1107, after the display subsystem successfully stores the pre-display image in the CWB memory, it may send a storage-success signal to the HWC.

In step 1108, after receiving the storage-success signal, the HWC performs matting on the pre-display image stored in the CWB memory to obtain the target image, and the time at which the HWC starts to obtain the target image is used as the timestamp of the target image.

In step 1109, after acquiring the target image and the timestamp, the HWC sends the target image and the timestamp to the noise algorithm library.

In step 1110, the noise algorithm library calculates the image noise at the refresh time corresponding to the target image (the image noise at t01). The timestamp of the image noise is the timestamp of the target image from which the image noise is obtained. The noise algorithm library stores the image noise and the timestamp of the image noise.
During brightness adjustment, each submodule in the AP processor executes the following steps:
In step 1111, after the display engine service obtains the brightness to be adjusted from application A in the application layer, the display engine service sends the brightness to be adjusted to the kernel node.

In step 1112, after monitoring that the data in the kernel node has changed, the HWC acquires the brightness to be adjusted from the kernel node. The time at which the HWC starts to acquire the brightness to be adjusted from the kernel node is used as the timestamp of the brightness to be adjusted.
In practical applications, the HWC always listens to the kernel node for data changes.
In step 1113, the HWC sends the brightness to be adjusted and the timestamp of the brightness to be adjusted to the noise algorithm library.

In step 1114, the noise algorithm library calculates the backlight noise at the adjustment time of the brightness to be adjusted (the backlight noise at t02). The timestamp of the backlight noise is the timestamp of the brightness to be adjusted from which the backlight noise is obtained. The noise algorithm library stores the backlight noise and the timestamp of the backlight noise.
After the end of an integration period, the SCP processor sends the initial ambient light collected during the integration period to the HWC in the AP processor.
In step 1115, the HWC of the AP processor receives the initial ambient light sent by the SCP processor, together with the integration start time and integration end time of the initial ambient light.

In step 1116, after receiving the initial ambient light and its integration start time and integration end time from the SCP processor, the HWC sends the initial ambient light and its integration start time and integration end time to the noise algorithm library.
In step 1117, the noise algorithm library calculates the integral noise according to the image noise and its timestamp, the backlight noise and its timestamp, and the integration start time and integration end time of the initial ambient light. The noise algorithm library then calculates the target ambient light according to the integral noise and the initial ambient light.
The above embodiment mainly describes the sequential logic among the modules when the AP processor obtains the target ambient light.

In the above embodiments, the example is as follows: after the AP processor acquires the target image and the brightness information, the AP processor calculates the fusion noise; the SCP processor acquires the initial ambient light and then sends it to the AP processor; the AP processor processes the fusion noise to obtain the integral noise of the integration time period of the initial ambient light, and then obtains the target ambient light according to the initial ambient light and the integral noise.
In practical applications, the AP processor may also obtain the target image and the brightness information and then send the target image and the brightness information to the SCP processor. The SCP processor fuses the target image and the brightness information to obtain fusion noise and integral noise of an integral time period of the initial ambient light, and then obtains the target ambient light according to the fusion noise and the initial ambient light.
In practical application, after the AP processor acquires a target image and brightness information, the AP processor calculates fusion noise and sends the fusion noise obtained by calculation to the SCP processor. The SCP processor obtains integral noise of an integral time period according to the received fusion noise, and obtains target ambient light according to the integral noise of the integral time period and initial ambient light collected by the ambient light sensor.
Referring to fig. 12, in this embodiment of the present disclosure, the fusion noise is calculated at the AP processor, the integral noise is calculated at the SCP processor, and the target ambient light is obtained according to the integral noise and the initial ambient light.
As mentioned above, the process of obtaining the target ambient light can be briefly described as follows:
step 1, calculating image noise according to a target image.
And 2, calculating backlight noise according to the brightness.
And 3, calculating target ambient light (raw values on four channels) according to the image noise, the backlight noise and the initial ambient light.
In the technical architecture shown in fig. 7, step 3 (calculating the target ambient light according to the image noise, the backlight noise and the initial ambient light) is implemented in the noise algorithm library of the AP processor. The noise algorithm library of the AP processor can calculate the image noise and the backlight noise, while the initial ambient light comes from the ambient light sensor driver of the SCP processor. Therefore, the noise algorithm library of the AP processor needs to acquire the initial ambient light data reported by the SCP processor (steps E3 to E4). The AP processor finally returns the calculated raw values on the four channels of the target ambient light to the SCP processor, which obtains the Lux value of the target ambient light (step A8, step E5, step E6).

In the technical architecture shown in fig. 12, step 3 (calculating the target ambient light according to the image noise, the backlight noise and the initial ambient light) is implemented in the denoising module of the SCP processor. The image noise and backlight noise are acquired by the AP processor, and the initial ambient light is acquired by the ambient light sensor driver of the SCP processor. Therefore, the denoising module of the SCP processor needs to acquire the image noise and backlight noise transmitted by the AP processor (step A8, step E5, step E6), and also needs the initial ambient light transmitted by the ambient light sensor driver of the SCP processor (step E3).

In view of the above analysis, in the technical architecture shown in fig. 7, the calculations of step 1 to step 3 need to be implemented in the noise algorithm library of the AP processor. In the technical architecture shown in fig. 12, step 1 and step 2 need to be implemented in the noise algorithm library of the AP processor, and step 3 needs to be implemented in the denoising module of the SCP processor.
For a clearer understanding of the process of obtaining the target ambient light corresponding to the technical architecture shown in fig. 12, reference is made to the timing chart shown in fig. 13. In connection with the events at the various times in fig. 9, the image is refreshed starting from time t0. At the same time, the ambient light sensor enters an integration period and begins collecting the initial ambient light.

Accordingly, in fig. 13, step E1: the ambient light sensor in the co-hardware layer of the SCP processor enters an integration period and collects the initial ambient light starting from time t0 (t01).
Step A1 to step A6, refer to the description of step A1 to step A6 in the embodiment shown in fig. 7.
Step A7: the noise algorithm library in the hardware abstraction layer of the AP processor sends the image noise at t01 to the HWC of the same layer.

Step A8: after the image noise at t01 is calculated and obtained in the AP processor, the image noise at t01 is sent to the ambient light sensor application of the co-application layer of the SCP processor.

Step A9 (step E5 in the architecture shown in fig. 12): the ambient light sensor application of the co-application layer of the SCP processor sends the image noise at t01 to the noise memory of the co-driving layer of the SCP processor.
Step B1 to step B2 refer to the description of step B1 to step B2 in the embodiment shown in fig. 7.
Step B3: the noise algorithm library in the hardware abstraction layer of the AP processor sends the backlight noise at t02 to the HWC of the same layer.

Step B4: after the backlight noise at t02 is calculated and obtained in the AP processor, the backlight noise at t02 is sent to the ambient light sensor application of the co-application layer of the SCP processor.

Step B5 (step E5 in the architecture shown in fig. 12): the ambient light sensor application of the co-application layer of the SCP processor sends the backlight noise at t02 to the noise memory of the co-driving layer of the SCP processor.
Steps C1 to C9, and steps D1 to D9 refer to the description of steps A1 to A9, and are not repeated.
After the integration of the ambient light sensor ends, the SCP processor is triggered to perform step E2; for step E2, refer to the description in the embodiment shown in fig. 7.
Steps E3 to E6: the denoising module in the co-driving layer of the SCP processor takes the fusion noise out of the noise memory of the same layer, and obtains the raw values on the four channels of the initial ambient light from the ambient light sensor driver of the same layer. The target ambient light is then calculated according to the raw values on the four channels of the initial ambient light and the image noise and backlight noise that interfere with the initial ambient light. During the non-integration period, the image may also be refreshed (e.g., the image refresh at t11) and the brightness may also change (e.g., the brightness change at t12). Therefore, in the non-integration period, steps F1 to F9 still exist in fig. 13 (steps F1 to F5 are omitted in fig. 13; refer to steps A1 to A5), so that the image noise at t11 is stored in the noise memory of the SCP processor. In the non-integration period, steps G1 to G5 still exist (step G1 is omitted in fig. 13; refer to step B1), so that the backlight noise at t12 is stored in the noise memory of the co-driving layer of the SCP processor.
The process in which the noise algorithm library in the embodiment shown in fig. 7 calculates the target ambient light from the target image, the brightness and the initial ambient light will be described below.

Step one: when the noise calculation library obtains a target image, it calculates the image noise at the refresh time of the target image according to the target image and the brightness of the display screen at that refresh time; when the noise calculation library obtains a brightness, it calculates the backlight noise at the brightness adjustment time according to the brightness and the target image displayed at that adjustment time.

Although image noise and backlight noise have different names, both are calculated based on one frame of the target image and one brightness value.

First, a weighted sum operation is performed on the RGB values of the pixel points with the weighting coefficient of each pixel point to obtain the weighted RGB value of the target image. The weighting coefficient of each pixel point is determined according to the distance between the coordinates of the pixel point and the reference coordinates of the target image. The coordinates of the center point of the photosensitive area of the ambient light sensor may be used as the reference coordinates of the target image.

Step two: the noise calculation library obtains the fusion noise according to the weighted RGB value and the brightness of the target image. The fusion noise may be obtained by table lookup (the table records the fusion noise corresponding to each weighted RGB value of the target image and each brightness), or by a preset functional relationship (the independent variables are the weighted RGB value of the target image and the brightness, and the dependent variable is the fusion noise). The fusion noise obtained here is raw values on the four channels.
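For ease of understanding only, the following sketch illustrates steps one and two. The distance-based decay curve and the linear fusion model are assumptions standing in for the calibrated weighting and the lookup table or preset functional relationship described above.

```cpp
#include <array>
#include <cmath>
#include <cstdint>
#include <vector>

struct Pixel { uint8_t r, g, b; int x, y; };

// Step one (sketch): weighted sum of per-pixel RGB values, where each
// pixel's weight decays with its distance to the reference coordinates
// (the center of the photosensitive area). The decay curve is assumed.
std::array<double, 3> WeightedRgb(const std::vector<Pixel>& target,
                                  double refX, double refY) {
    std::array<double, 3> rgb{};
    double wsum = 0.0;
    for (const Pixel& p : target) {
        double d = std::hypot(p.x - refX, p.y - refY);
        double w = 1.0 / (1.0 + d);   // assumed decay with distance
        rgb[0] += w * p.r; rgb[1] += w * p.g; rgb[2] += w * p.b;
        wsum += w;
    }
    if (wsum > 0.0) { rgb[0] /= wsum; rgb[1] /= wsum; rgb[2] /= wsum; }
    return rgb;
}

// Step two (sketch): map (weighted RGB, brightness) to fusion noise on the
// RGBC channels. A linear model stands in for the lookup table or preset
// functional relationship; all coefficients are placeholders.
std::array<double, 4> FusionNoise(const std::array<double, 3>& rgb,
                                  double brightness) {
    static const double kChannelScale[4] = {1.0e-3, 1.2e-3, 0.9e-3, 1.1e-3};
    double mix = 0.3 * rgb[0] + 0.4 * rgb[1] + 0.3 * rgb[2];
    std::array<double, 4> noise{};
    for (int ch = 0; ch < 4; ++ch)
        noise[ch] = kChannelScale[ch] * brightness * mix;
    return noise;
}
```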
Step three: the noise calculation library calculates the integral noise within the integration time period of the initial ambient light according to the fusion noise at each time.

It should be noted that image noise is not generated by the image refresh process itself. Within the integration time period, in the period before an image refresh, the interference with the initial ambient light is the image noise corresponding to the image before the refresh; in the period after the image refresh, the interference with the initial ambient light is the image noise corresponding to the refreshed image.

Similarly, backlight noise is not generated by the process of adjusting the brightness. Within the integration time period, in the period before a brightness adjustment, the interference with the initial ambient light is the backlight noise corresponding to the brightness before the adjustment; in the period after the brightness adjustment, the interference with the initial ambient light is the backlight noise corresponding to the adjusted brightness.
As described above, the noise memory stores the image noise and the backlight noise at each time point calculated by the noise algorithm library. The noise stored in the noise memory is collectively referred to as fusion noise or first noise.
Step A1: the first processor takes the first noise out of the outlet position of the noise memory through the noise algorithm library, and updates, through the noise algorithm library, the outlet position of the noise memory or the first noise at the outlet position;

step B1: if the timestamp corresponding to the currently taken-out first noise is before or at the first time, the first processor continues to execute step A1 through the noise algorithm library until the timestamp of the currently taken-out first noise is after the first time;

step B2: if the timestamp of the currently taken-out first noise is after the first time, the first processor executes the following steps through the noise algorithm library:

Step C1: if the timestamp of the currently taken-out first noise is after the first time for the first time and before the second time, the integral noise between the first time and the time corresponding to the timestamp of the currently taken-out first noise is calculated according to the previously taken-out first noise, and execution continues from step A1;

step C2: if the timestamp of the currently taken-out first noise is after the first time for the first time and at or after the second time, the integral noise between the first time and the second time is calculated according to the previously taken-out first noise, and execution continues from step D1;

step C3: if the timestamp of the currently taken-out first noise is not after the first time for the first time and is before the second time, the integral noise between the time corresponding to the timestamp of the previously taken-out first noise and the time corresponding to the timestamp of the currently taken-out first noise is calculated according to the previously taken-out first noise, and execution continues from step A1;

step C4: if the timestamp of the currently taken-out first noise is not after the first time for the first time and is at or after the second time, the integral noise between the time corresponding to the timestamp of the previously taken-out first noise and the second time is calculated according to the previously taken-out first noise, and execution continues from step D1;

step D1: the second value is obtained according to the integral noise between the first time and the second time and the first value.
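A minimal C++ sketch of steps A1 to D1 follows, assuming a simple queue of timestamped fusion-noise entries; the names, the data structure and the handling of an exhausted queue are assumptions of this description, not the disclosed implementation.

```cpp
#include <array>
#include <cstdint>
#include <deque>

struct NoiseEntry {              // "first noise" together with its timestamp
    int64_t ts;                  // timestamp in ms
    std::array<double, 4> n;     // fusion noise, RGBC raw values
};

// Steps A1 to D1: walk the noise FIFO and accumulate the integral noise that
// falls inside [tStart, tEnd] (the "first time" and "second time"). Entries
// are assumed to be in timestamp order; the entry in effect at tStart is the
// last one fetched whose timestamp is at or before tStart.
std::array<double, 4> IntegralNoise(std::deque<NoiseEntry>& fifo,
                                    int64_t tStart, int64_t tEnd) {
    std::array<double, 4> sum{};            // integral noise over [tStart, tEnd]
    const double total = double(tEnd - tStart);
    NoiseEntry last{tStart, {}};            // previously fetched entry
    int64_t segStart = tStart;
    while (!fifo.empty()) {
        NoiseEntry cur = fifo.front();      // step A1: fetch from the outlet
        fifo.pop_front();
        if (cur.ts <= tStart) { last = cur; continue; }   // step B1
        int64_t segEnd = cur.ts < tEnd ? cur.ts : tEnd;   // steps C1 to C4
        for (int ch = 0; ch < 4; ++ch)
            sum[ch] += (segEnd - segStart) / total * last.n[ch];
        if (cur.ts >= tEnd) return sum;     // step D1 uses this result next
        segStart = cur.ts;
        last = cur;
    }
    // Queue exhausted (assumption): the last entry stays in effect until tEnd.
    for (int ch = 0; ch < 4; ++ch)
        sum[ch] += (tEnd - segStart) / total * last.n[ch];
    return sum;
}
```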
The noise memory may be a FIFO (First In First Out) memory. A FIFO memory is a first-in first-out dual-port buffer: one of its two ports is the input port of the memory and the other is the output port. In this memory structure, the data that enters the memory first is shifted out first; correspondingly, the order of the shifted-out data is consistent with the order of the input data. The outlet position of the FIFO memory is the storage address corresponding to the output port of the FIFO memory.

When the FIFO memory shifts out a datum, the process is as follows: the fusion noise stored in the outlet position (the first position) is removed from the outlet position, then the data in the second position from the outlet is moved to the outlet position, the data in the third position from the outlet is moved to the second position, and so on.

Of course, in practical applications, after the fusion noise stored in the first position (A1) is removed from the outlet position (the first position, A1), the outlet position of the memory may instead be updated to the second position (A2). After the fusion noise stored in the current outlet position (A2) is removed, the outlet position of the memory is then updated to the third position (A3), and so on.
The process of obtaining the second value based on the above calculation may refer to the embodiments shown in fig. 14 to fig. 16.
Referring to fig. 14, fig. 14 shows the process in which the noise calculation library in the AP processor provided in the embodiment of the present application calculates the integral noise according to the image noise and the backlight noise. The various times in the process correspond to the descriptions of the times in the embodiments shown in fig. 9 and fig. 10: the image is refreshed at t01, giving the image noise at t01; the brightness is adjusted at t02, giving the backlight noise at t02; the image is refreshed at t03, giving the image noise at t03; the image is refreshed at t04, giving the image noise at t04.

From t01 to t02, the displayed image is the image refreshed at t01, and the brightness of the display screen is the brightness at t01 (the brightness at t01 is the latest brightness value stored in the noise algorithm library before t01). The image noise at t01 is the noise of the image refreshed at t01 displayed at the brightness of t01. Thus, the initial ambient light includes image noise with a duration of "t02 - t01" and a timestamp of t01.

From t02 to t03, the brightness of the display screen is the brightness adjusted at t02, and the image displayed is the image refreshed at t01. The backlight noise at t02 is the noise of the image refreshed at t01 displayed at the brightness adjusted at t02. Thus, the initial ambient light includes backlight noise with a duration of "t03 - t02" and a timestamp of t02.

From t03 to t04, the displayed image is the image refreshed at t03, and the brightness of the display screen is the brightness adjusted at t02. The image noise at t03 is the noise of the image refreshed at t03 displayed at the brightness adjusted at t02. Thus, the initial ambient light includes image noise with a duration of "t04 - t03" and a timestamp of t03.

From t04 to t1, the displayed image is the image refreshed at t04, and the brightness of the display screen is the brightness adjusted at t02. The image noise at t04 is the noise of the image refreshed at t04 displayed at the brightness adjusted at t02. Thus, the initial ambient light includes image noise with a duration of "t1 - t04" and a timestamp of t04.
Based on the above understanding, when the AP processor calculates the integral noise:

the image noise at t01 interferes with the initial ambient light from t01 to t02;

the backlight noise at t02 interferes with the initial ambient light from t02 to t03;

the image noise at t03 interferes with the initial ambient light from t03 to t04;

the image noise at t04 interferes with the initial ambient light from t04 to t1.

Thus, the integral noise from t01 to t02, from t02 to t03, from t03 to t04 and from t04 to t1 can be calculated separately.
The integral noise from t01 to t02 is: (t02 - t01)/(t1 - t0) × N(t01).

The integral noise from t02 to t03 is: (t03 - t02)/(t1 - t0) × N(t02).

The integral noise from t03 to t04 is: (t04 - t03)/(t1 - t0) × N(t03).

The integral noise from t04 to t1 is: (t1 - t04)/(t1 - t0) × N(t04).

Here N(t01) denotes the fusion noise at t01, N(t02) the fusion noise at t02, N(t03) the fusion noise at t03, and N(t04) the fusion noise at t04.
The integral noise of the sub-periods within the integration period (t01 to t02, t02 to t03, t03 to t04, t04 to t1) together constitutes the integral noise of the whole integration period.
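Restating the per-sub-period expressions above in general form (the τ notation is introduced here for ease of understanding and does not appear in the original disclosure):

```latex
% Integral noise over one integration period [t_0, t_1]. The change times in
% effect are t_0 = \tau_1 < \tau_2 < \cdots < \tau_k < \tau_{k+1} = t_1, and
% N_{\tau_i} is the fusion noise in effect on [\tau_i, \tau_{i+1}] (for i = 1,
% this is the fusion noise of the latest change at or before t_0).
N_{\mathrm{int}} = \sum_{i=1}^{k} \frac{\tau_{i+1} - \tau_i}{t_1 - t_0}\, N_{\tau_i}
```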
The start time of the integration period in the above example happens to be an image refresh time, i.e., the image noise at the start time of the integration period can be obtained directly.

In practical applications, the start time of the integration period may be neither an image refresh time nor a backlight adjustment time. In this case, the fusion noise corresponding to the latest change time (image refresh time or backlight adjustment time) before the start of the current integration period needs to be acquired.
Referring to fig. 15, for an integration time period (t0 to t1) obtained by the noise calculation library in the AP processor, t01 is not the start time of the current integration time period but an image refresh time within the current integration time period. The latest change time (image refresh time or brightness adjustment time) before the start of the current integration period is t-1, which is an image refresh time.
Referring to fig. 16, if the latest change time before the start of the current integration period is an image refresh time, the image noise corresponding to that image refresh time interferes with the initial ambient light from t0 to t01.

Of course, if the latest change time is a brightness adjustment time, the backlight noise corresponding to that brightness adjustment time interferes with the initial ambient light from t0 to t01.
In the embodiment shown in fig. 16, the integration noise corresponding to each sub-period in the integration period is:
The integral noise from t0 to t01 is: (t01 - t0)/(t1 - t0) × N(t-1).

The integral noise from t01 to t02 is: (t02 - t01)/(t1 - t0) × N(t01).

The integral noise from t02 to t03 is: (t03 - t02)/(t1 - t0) × N(t02).

The integral noise from t03 to t04 is: (t04 - t03)/(t1 - t0) × N(t03).

The integral noise from t04 to t1 is: (t1 - t04)/(t1 - t0) × N(t04).

Here N(t-1) denotes the fusion noise at t-1, N(t01) the fusion noise at t01, N(t02) the fusion noise at t02, N(t03) the fusion noise at t03, and N(t04) the fusion noise at t04.
As can be understood from the above example, the integral noise obtained is also raw values on the four channels.
The timestamps in the above examples are all different. In practical applications, the HWC may perform both the process of acquiring the target image and the process of acquiring the brightness to be adjusted within one time measurement unit (e.g., within 1ms). In that case, the timestamps of the target image and of the brightness to be adjusted are the same.
If a target image and a brightness value have the same timestamp and the noise algorithm library receives the target image first, the noise algorithm library calculates the image noise according to the target image and the latest brightness value before the target image; when calculating the backlight noise corresponding to the brightness value, it calculates the backlight noise according to the target image and the brightness value with the same timestamp.

If a target image and a brightness value have the same timestamp and the noise algorithm library receives the brightness value first, the noise algorithm library calculates the backlight noise according to the brightness value and the latest target image before the brightness value; when calculating the image noise corresponding to the target image, it calculates the image noise according to the target image and the brightness value with the same timestamp.
When the noise algorithm library receives the target image first, it calculates the image noise first and stores it in the noise memory first. The fusion noise stored in the noise memory has a time order: before storing, it is judged whether the timestamp of the fusion noise currently to be stored is after the timestamp of the last stored fusion noise. If it is after, the fusion noise currently to be stored is stored; if it is at or before the timestamp of the last stored fusion noise, the noise currently to be stored is discarded. Therefore, the backlight noise obtained by the later calculation is discarded.
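The ordering check described above can be sketched as follows (illustrative only; NoiseEntry mirrors the structure assumed in the earlier sketch):

```cpp
#include <array>
#include <cstdint>
#include <deque>

struct NoiseEntry { int64_t ts; std::array<double, 4> n; };  // as before

// Store fusion noise in timestamp order: an entry whose timestamp is at or
// before the last stored entry's timestamp is discarded.
bool StoreFusionNoise(std::deque<NoiseEntry>& noiseMemory, const NoiseEntry& e) {
    if (!noiseMemory.empty() && e.ts <= noiseMemory.back().ts)
        return false;            // at or before the last timestamp: drop it
    noiseMemory.push_back(e);
    return true;
}
```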
In practice, the timestamp of the target image may be the time at which the HWC starts to fetch the target image from the CWB write-back memory, and the timestamp of the brightness value may be the time at which the HWC starts to fetch the brightness value from the kernel node. The HWC may switch to capturing the brightness value during the process of capturing the target image. In that case the HWC performs the capture of the target image first and then the brightness value, so the timestamp of the brightness value is later than the timestamp of the target image. In practical applications, the HWC may obtain the brightness value first and send it to the noise algorithm library, which calculates and stores the backlight noise; the HWC then obtains the target image and sends it to the noise algorithm library, which calculates and stores the image noise. This can result in the timestamp of the image noise currently to be stored being before the timestamp of the last stored backlight noise.
Step four: the noise algorithm library removes the integral noise of the whole integration time period from the initial ambient light to obtain the target ambient light.

In the embodiment of the present application, the initial ambient light sent by the SCP processor to the HWC of the AP processor is initial ambient light data in the form of RGBC four-channel raw values, and the HWC sends the initial ambient light data to the noise algorithm library in the same form. Step three yields the raw values of the integral noise on the four channels. Therefore, in this step, the raw values on the four channels of the target ambient light can be obtained by operating on the raw values of the four channels of the initial ambient light and the raw values of the four channels of the integral noise.
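Step four then reduces to a per-channel operation; a plain subtraction is the simplest assumption consistent with "removing" the integral noise, as sketched below.

```cpp
#include <array>

// Remove the integral noise of the whole integration period from the initial
// ambient light, channel by channel, to obtain the target ambient light.
// A per-channel subtraction is assumed here.
std::array<double, 4> TargetAmbientLight(
        const std::array<double, 4>& initialRaw,
        const std::array<double, 4>& integralNoise) {
    std::array<double, 4> target{};
    for (int ch = 0; ch < 4; ++ch)
        target[ch] = initialRaw[ch] - integralNoise[ch];
    return target;
}
```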
After calculating the raw values on the four channels of the target ambient light, the noise algorithm library can send the raw values on the four channels of the target ambient light to the SCP processor, and the SCP processor calculates the lux value of the target ambient light according to the raw values on the four channels of the target ambient light.

As an example, the lux value may be obtained by weighting: the raw value of each channel is multiplied by the coefficient of that channel (which may be provided by the manufacturer of the ambient light sensor) and the products are summed.
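A sketch of that weighting follows; the coefficient values are placeholders, since the real coefficients are vendor-calibrated.

```cpp
#include <array>

// Lux as a weighted sum of the four channel raw values. The per-channel
// coefficients come from the ambient light sensor manufacturer; the values
// below are placeholders.
double LuxFromRaw(const std::array<double, 4>& raw) {
    const double coef[4] = {0.21, 0.72, 0.07, 0.05};  // assumed R,G,B,C weights
    double lux = 0.0;
    for (int ch = 0; ch < 4; ++ch)
        lux += coef[ch] * raw[ch];
    return lux;
}
```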
As described above, each time (or every other time, every two times, etc.) the electronic device refreshes the image, the DSS in the AP processor stores the pre-display image (which can be understood as the image to be refreshed in the current refresh process, i.e., the image after the current refresh) in the CWB memory. The HWC in the AP processor performs matting on the image stored in the CWB memory to obtain the target image, and then sends the target image to the noise algorithm library. For convenience of description, the steps in which the DSS in the AP processor stores the pre-display image in the CWB memory and the HWC obtains the target image from the CWB memory are denoted as the write-back function of the CWB.

In the CWB write-back function enabled state of the AP processor, the DSS stores the pre-display image in the CWB memory each time (or every other time, every two times, etc.) the electronic device refreshes the image. After the DSS stores the pre-display image in the CWB memory, the DSS sends a storage-success message to the HWC module. After the HWC receives this message, the HWC retrieves the target image from the CWB memory. Accordingly, the HWC module can send the target image to the noise algorithm library.

That is, in the CWB write-back function enabled state of the AP processor, the AP processor performs steps A1 to A6 in fig. 7 each time (or every other time, every two times, etc.) the electronic device refreshes the image.
In the CWB write-back function stopped state of the AP processor, each time (or every other time, every two times, etc.) the electronic device refreshes the image, steps A1 to A3 in the technical architecture shown in fig. 7 are still executed according to the refresh display flow. However, the DSS no longer stores the pre-display image in the CWB memory, and accordingly the AP processor no longer performs the subsequent related steps.

That is, in the CWB write-back function stopped state of the AP processor, the AP processor performs steps A1 to A3 in fig. 7 each time (or every other time, every two times, etc.) the electronic device refreshes the image, but no longer performs steps A4 to A6 in the technical architecture shown in fig. 7.
As mentioned above, after receiving the target image, the noise algorithm library calculates the image noise according to the received target image. Therefore, when the electronic device refreshes the image while the CWB write-back function of the AP processor is enabled, the corresponding image noise is obtained; when the electronic device refreshes the image while the CWB write-back function of the AP processor is stopped, the corresponding image noise is not obtained.
Taking fig. 9 and fig. 14 as examples, at the end of the integration (time t1), the fusion noise used by the noise algorithm library when calculating the target ambient light from t0 to t1 is: the image noise at t01, the backlight noise at t02, the image noise at t03 and the image noise at t04. The unnecessary fusion noise includes at least the image noise at t11 and the backlight noise at t12. That is, the image noise and backlight noise corresponding to the period from the start time (inclusive) to the end time of the integration time period interfere with the initial ambient light acquired by the current integration process, while the image noise and backlight noise corresponding to the non-integration time period do not interfere with the initial ambient light acquired by the current integration process.
Therefore, to reduce power consumption, the AP processor may start the CWB write-back function through the HWC control during the integration period of the ambient light sensor and execute steps A4 to A6; during the non-integration period of the ambient light sensor, the CWB write-back function is stopped through the HWC control, and the AP processor no longer performs steps A4 to A6.
Referring to fig. 17, fig. 17 is a schematic diagram of a start-stop method of a CWB write-back function according to an embodiment of the present application.
As shown in fig. 17, during the integration periods of the ambient light sensor (from t0 to t1, from t2 to t3, from t4 to t5), the HWC controls the CWB write-back function to start; during the non-integration periods (from t1 to t2, from t3 to t4, from t5 to t6), the HWC controls the CWB write-back function to stop. In this way, the image noise in each integration process can be obtained, and the power consumption of the AP processor can be reduced.
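Conceptually, the HWC-side control in fig. 17 can be sketched as follows; the class, the timer hook and the DSS programming call are assumptions of this description, not the disclosed implementation.

```cpp
#include <cstdint>

// Illustrative HWC-side start-stop control of the CWB write-back function,
// keyed to the ambient light sensor's integration windows.
class CwbController {
public:
    // Called with the next integration window reported from the SCP side.
    void OnIntegrationWindow(int64_t startMs, int64_t endMs) {
        startMs_ = startMs;
        endMs_ = endMs;
    }
    // Called periodically, e.g. from a timer or vsync hook (assumed).
    void OnTick(int64_t nowMs) {
        if (!enabled_ && nowMs >= startMs_ && nowMs < endMs_) {
            enabled_ = true;
            SetWriteBack(true);    // steps A4 to A6 run on each refresh
        } else if (enabled_ && nowMs >= endMs_) {
            enabled_ = false;
            SetWriteBack(false);   // only steps A1 to A3 keep running
        }
    }
private:
    void SetWriteBack(bool on) { (void)on; /* program the DSS CWB path (assumed hook) */ }
    bool enabled_ = false;
    int64_t startMs_ = 0;
    int64_t endMs_ = 0;
};
```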
The embodiment of the application focuses on this start-stop method of the CWB write-back function. In the start-stop method of the CWB write-back function, the HWC in the AP processor can monitor whether the data in the kernel node changes during both the integration period and the non-integration period. When a change occurs, the HWC can obtain the brightness to be adjusted, and accordingly the noise algorithm library can calculate the backlight noise over the entire acquisition period.

In the subsequent embodiments of the present application, it is taken as an example that the HWC in the AP processor monitors changes of the data in the kernel node during both the integration period and the non-integration period; when the data stored in the kernel node changes, the HWC acquires the brightness to be adjusted from the kernel node and transmits it to the noise algorithm library to calculate the backlight noise.
In addition, since the integration process of the ambient light sensor is controlled on the SCP processor side, the SCP processor needs to send the time-related parameters of the process in which the ambient light sensor acquires the initial ambient light to the AP processor.

As described above, each time an integration ends, the SCP processor may send the initial ambient light and the times related to the current integration process (for example, the current integration end time and integration duration, or the current integration start time and end time, etc.) to the HWC in the AP processor. The SCP processor may also send the integration start time at which the ambient light sensor will next collect the initial ambient light (or a time somewhat before it) to the HWC in the AP processor as the time to start the CWB write-back function, and send the integration end time of the next collection (or a time somewhat after it) to the HWC in the AP processor as the time to stop the CWB write-back function. That is, the start and stop times of the CWB write-back function are sent by the SCP processor to the HWC in the AP processor.

In practical applications, when the acquisition period of the ambient light sensor is fixed, the SCP processor may send the integration start time of the next collection of the initial ambient light to the AP processor as the time to start the CWB write-back function, and the AP processor calculates the time to stop the CWB write-back function according to the received start time and the acquisition period. Alternatively, the SCP processor reports the start time of the first integration, the integration duration, the sampling period, and the like, and the AP processor determines the start time and stop time of the CWB write-back function based on these data.

It should be noted that the embodiment of the present application does not limit which time-related parameters of the process in which the ambient light sensor acquires the initial ambient light are sent by the SCP processor to the AP processor, as long as the AP processor can obtain the next start time of the CWB write-back function according to the received time-related parameters.
The start time of the CWB write-back function need not exactly coincide with the integration start time, and the stop time of the CWB write-back function need not exactly coincide with the integration end time; the time period corresponding to the enabled state of the CWB write-back function only needs to include the integration time period in the acquisition cycle. Taking one acquisition cycle as an example, the start time of the CWB write-back function is earlier than or equal to the start time of the integration period of the acquisition cycle, and the stop time of the CWB write-back function is later than or equal to the end time of the integration period of the acquisition cycle.
As another example, after each integration ends, the SCP processor may further send the initial ambient light, the integration duration corresponding to the initial ambient light (or the current integration start time), the current integration end time, and the sleep duration of the CWB write-back function to the HWC in the AP processor. For convenience of description, the information transmitted after the end of an integration may be collectively referred to as first information. The first information is not limited to the above items; it may include more or less information.

As described above, the noise algorithm library calculates the target ambient light based on the initial ambient light, the integration duration (or integration start time) corresponding to the initial ambient light, the current integration end time, and the corresponding fusion noise. For the detailed process, refer to the description of the above embodiments, which is not repeated.
The HWC in the AP processor needs to determine the time to initiate the CWB writeback function based on the sleep duration of the AP processor.
Of course, the first message may be split into a plurality of sub-messages, which are respectively sent to the HWC of the AP processor. The embodiment of the present application does not limit this.
Taking the case where the above information is sent to the AP processor together as an example, the SCP processor may also include, in the first information, the time at which the SCP processor sends it.
After receiving the first information, the HWC in the AP processor first controls the CWB write-back function to stop. The HWC then derives, from part of the received first information, either the start time of the CWB write-back function or the duration it still needs to wait before starting it.
Since the start time of the CWB write-back function only needs to be before the integration start time of the next cycle, it does not have to be controlled strictly to a specific point in time. The start time of the CWB write-back function can therefore be obtained in any of the following ways, or in other ways not shown in the embodiments of the present application.
The HWC in the AP processor obtains the inter-core communication delay from the time at which the SCP processor sent the first information and the time at which the AP processor received it. The HWC then derives, from the inter-core communication delay and the sleep duration, either the remaining duration to wait before starting the CWB write-back function (the sleep duration minus the inter-core communication delay) or the start time of the CWB write-back function (the time at which the HWC received the first information plus that remaining duration). When the start time arrives, the HWC starts the CWB write-back function.
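As an illustration of this calculation, the following sketch (in C++, with hypothetical names; not the actual HWC code) derives the remaining wait from the two timestamps and the sleep duration, assuming the clocks of the two processors are aligned as in the examples below:

```cpp
#include <chrono>

using Millis = std::chrono::milliseconds;

// Hypothetical helper: how long the HWC should still wait before starting
// the CWB write-back function, as described above.
Millis remainingWaitBeforeCwbStart(Millis scpSendTime,     // time stamped by the SCP processor in the first information
                                   Millis apReceiveTime,   // time at which the HWC received the first information
                                   Millis sleepDuration) { // sleep duration carried in the first information
    // Inter-core communication delay between the SCP and AP processors.
    Millis delay = apReceiveTime - scpSendTime;
    // Remaining wait = sleep duration minus the delay already elapsed.
    Millis wait = sleepDuration - delay;
    return wait.count() > 0 ? wait : Millis(0);
}
```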
As an example, in the case where the total duration of the non-integration period is 300 ms, the sleep duration may be 240 ms, 250 ms, 260 ms, 270 ms, 280 ms, or the like. In this way, even if there is an inter-core communication delay (for example, 1 ms), the CWB write-back function is guaranteed to start before the next integration starts.
Of course, in practical applications, the AP processor may also use the HWC to calculate, from the integration end time and the sleep duration sent by the SCP processor, the start time of the CWB write-back function (the integration end time plus the sleep duration) or the duration it should still wait before that start time (the integration end time plus the sleep duration, minus the time at which the HWC received the first information). In this example, the start time of the CWB write-back function is recorded as the first moment. The first moment is also the moment reached when the second duration has elapsed after the moment the matting flag is set to the first character. The sleep duration in the first information may be recorded as the first duration. The second duration is the sleep duration minus the delay duration, where the delay duration is the time at which the HWC module received the first information minus the integration end time; the integration end time may be recorded as the second time. As described above, the first information received by the AP processor may further include the integration start time (first time), the integration end time (second time), the initial ambient light (first value), the sleep duration (first duration), and so on.
Since the matting flag represents the start and stop of the CWB write-back function, the duration for which the CWB write-back function is stopped can also be understood as the duration for which the matting flag is set to the first character.
As described above, the embodiment of the present application does not fix the start time of the CWB write-back function strictly at a particular moment, so other calculation methods may also be adopted, as long as the start time of the CWB write-back function is guaranteed to be before the integration start time. The AP processor may therefore also ignore the communication delay and directly use the sleep duration in the received first information as the sleep duration of the CWB write-back function.
The above examples are described assuming the clocks of the AP processor and the SCP processor are aligned. When they are not aligned, the time difference between the two processors also needs to be taken into account on top of the times or durations obtained by the above calculations.
As mentioned above, the stop time of the CWB write-back function is at (or after) the time the AP processor receives the first information through the HWC. Therefore, after the CWB write-back function has been started, the HWC stops the currently started CWB write-back function at (or after) the time it next receives the first information sent by the SCP processor.
As can be understood from the above, the stop time of the CWB write-back function is after the integration ends, and the start time of the CWB write-back function may be determined according to the sleep duration.
Referring to fig. 18, on the AP processor side, each time the electronic device refreshes an image, the SurfaceFlinger sends the display parameters of the interface to the HWC (specifically, refer to step A1 in the embodiment shown in fig. 7), and the HWC obtains a composed image based on the display parameters. The HWC also needs to query the matting flag; when the matting flag indicates start, the HWC starts the CWB write-back function, and the AP processor executes steps A1 to A6 in the technical architecture shown in fig. 7. That is, with the CWB write-back function started, the noise algorithm library can calculate the image noise and backlight noise while the function remains started.
In this example, the matting flag may also be recorded as a write-back flag.
The matting flag in the above embodiment may take the form of an identifier. After the HWC receives the first information sent by the SCP processor, the HWC sets the identifier to a first character (for example, 0 or false); after the HWC has waited for the sleep duration, the HWC sets the identifier to a second character (for example, 1 or true). If the identifier is the first character, the HWC stops the CWB write-back function; if the identifier is the second character, the HWC starts the CWB write-back function.
In this example, the first character may be marked as a first mark and the second character may be marked as a second mark.
The HWC controls the start and stop of the CWB write-back function by querying this identifier.
As an example, each time before executing step A2 in the technical architecture shown in fig. 7, the HWC may query whether the identifier is currently the first character or the second character. If it is the second character, indicating that the CWB write-back function is in the started state, the HWC passes down the matting-required information when executing step A2; after the display subsystem receives the composed image passed down by the HWC together with the matting-required information, it stores the image to be displayed in the CWB memory. If it is the first character, indicating that the CWB write-back function is in the stopped state, the HWC does not pass down the matting-required information (or passes down information indicating that no matting is required) when executing step A2; when the display subsystem receives the composed image without the matting-required information (or with information indicating that no matting is required), it does not store the image to be displayed in the CWB memory, and the HWC cannot obtain the target image.
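The flag-driven control described in the last few paragraphs can be summarized with a minimal sketch; all types and names here are hypothetical illustrations of the described behavior, not the real SurfaceFlinger or HWC interfaces:

```cpp
#include <atomic>

// Hypothetical matting flag shared between the matting thread and the
// composition path (false = first character, true = second character).
std::atomic<bool> g_mattingFlag{false};

struct ComposedImage { /* composed frame handed to the display subsystem */ };

// Hypothetical hand-off to the display subsystem (step A2). When
// needsMatting is true, the display subsystem also writes the frame back
// to the CWB memory so the HWC can later fetch the target image.
void sendToDisplaySubsystem(const ComposedImage& img, bool needsMatting) {
    // Stubbed; the real path goes through the display subsystem hardware.
    (void)img;
    (void)needsMatting;
}

// Called on every image refresh, before step A2.
void onCompose(const ComposedImage& img) {
    // Query the identifier: second character means the CWB write-back
    // function is started, so pass down the matting-required information.
    bool needsMatting = g_mattingFlag.load();
    sendToDisplaySubsystem(img, needsMatting);
}
```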
As an example, when the matting flag is the second mark, if the electronic device refreshes an image (which may be recorded as the fifth image), the SurfaceFlinger passes the display parameter of the interface (which may be recorded as the fourth display parameter) to the HWC, and after receiving the fourth display parameter the HWC may call the underlying hardware to compose the image. When the HWC passes the composed image (which may be the fifth image, or an image processed to obtain the fifth image) to the display subsystem, the HWC may also pass down the matting-required information (which may be recorded as the third information). The display subsystem receives the fifth image and the third information, and may store the fifth image, a partial image of the fifth image (which may be recorded as the sixth image), or the target image on the fifth image (which may be recorded as the third target image) in the CWB memory. The HWC obtains the target image from the CWB memory and sends it to the noise algorithm library, which may derive image noise (which may be recorded as the second image noise) from the target image.
As another example, when the matting flag is the first mark, if the electronic device refreshes an image (which may be recorded as the first image), the SurfaceFlinger passes the display parameter of the interface (which may be recorded as the fifth display parameter) to the HWC, and after receiving the fifth display parameter the HWC may call the underlying hardware to compose the image. The HWC does not pass down the matting-required information when passing the composed image (which may be the first image, or an image processed to obtain the first image) to the display subsystem. The display subsystem receives the first image but no longer stores the first image, a partial image of the first image (which may be recorded as the second image), or the target image on the first image (which may be recorded as the first target image) in the CWB memory. Accordingly, the HWC cannot obtain the target image from the CWB memory and does not send a target image to the noise algorithm library, and the noise algorithm library no longer derives image noise from a target image.
As shown in fig. 18, on the SCP processor side, after the SCP processor starts, the driver of the ambient light sensor is initialized, and ambient light integration then starts according to the preset acquisition period.
After the ambient light integration ends, the screen-on/screen-off state can also be monitored. After the HWC on the AP processor side detects a screen-on or screen-off event, the AP processor sends the related information to the SCP processor to trigger the change of the screen state on the SCP processor side.
In the screen-on state, the SCP processor needs to send the collected initial ambient light to the HWC of the AP processor; the HWC sends the initial ambient light to the noise algorithm library, and the noise algorithm library calculates the raw value of the target ambient light from the received initial ambient light. The AP processor sends the raw value of the target ambient light to the ambient light memory of the SCP processor. Of course, in practical applications, the AP processor may instead calculate the lux value of the target ambient light from the raw value and send the lux value of the target ambient light to the SCP processor.
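The raw-to-lux conversion itself is not specified in this application; purely for illustration, a linear calibration of the kind commonly used for ambient light sensors might look as follows (the scale factor is an assumed value):

```cpp
// Hypothetical raw-to-lux conversion. The mapping is not specified in this
// application; a simple linear calibration is assumed here for illustration.
constexpr float kLuxPerRaw = 0.25f;  // assumed sensor-specific calibration factor

float rawToLux(float targetAmbientRaw) {
    return targetAmbientRaw * kLuxPerRaw;
}
```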
Referring to fig. 18, in the screen-on state, the SCP processor needs to send the integration end time and the sleep duration to the AP processor.
The AP processor receives the integration end time and the sleep duration reported by the SCP processor. The HWC sets the matting flag to the first character, and while the matting flag is the first character, the HWC keeps the CWB write-back function stopped.
After receiving the first information, a matting thread in the HWC of the AP processor (the thread that executes the acquisition of the target image) sets the matting flag to the first character. It then calculates the duration for which it should sleep and calls a sleep function with that duration (for example, 270 ms), so the matting thread sleeps for 270 ms. When the 270 ms sleep ends, the matting thread sets the matting flag to the second character, and the CWB write-back function is started.
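Taken together, one cycle of the matting thread might look like the following sketch, assuming the sleep duration passed in has already been corrected for the inter-core delay; the names are hypothetical:

```cpp
#include <atomic>
#include <chrono>
#include <thread>

// Hypothetical matting flag (false = first character, true = second character).
std::atomic<bool> g_mattingFlag{true};

// Hypothetical handler run by the matting thread each time the first
// information arrives from the SCP processor.
void onFirstInformation(std::chrono::milliseconds sleepDuration) {
    // Integration just ended: set the flag to the first character,
    // which stops the CWB write-back function.
    g_mattingFlag.store(false);

    // Sleep through most of the non-integration period (e.g. 270 ms).
    std::this_thread::sleep_for(sleepDuration);

    // Set the flag to the second character: the CWB write-back function
    // is started again before the next integration begins.
    g_mattingFlag.store(true);
}
```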
The SCP processor also needs to calculate the lux value of the target ambient light from the raw value of the target ambient light. In addition, it starts ambient light integration again at the beginning of the next integration.
In the screen-off state, the initial ambient light collected by the ambient light sensor is the real ambient light, so the SCP processor may stop reporting the lux value of the initial ambient light collected in the integration period to the AP processor. Since the CWB write-back function does not need to be started to pick up the associated noise in the screen-off state, the SCP processor also no longer calculates the next start time of the CWB write-back function.
Of course, in some scenarios, for example face unlocking in the screen-off state, the electronic device needs to know whether the current environment is dark, because the face needs fill light in dark ambient light. The electronic device therefore needs the current lux value of the real ambient light in this scenario. Even in the screen-off state, the ambient light sensor still collects ambient light, and when the SCP processor receives a face unlocking request issued by the AP processor, it reports the lux value of the collected ambient light to the AP processor, so that the AP processor can decide whether to apply fill light according to the received lux value.
The embodiment of the present application focuses on how the HWC in the AP processor controls the start and stop of the CWB write-back function. For other details not shown, refer to the description in any of the above embodiments.
As mentioned above, the start and stop times of the CWB write-back function in the AP processor are determined by the data reported by the SCP processor. Considering that inter-core communication between the AP processor and the SCP processor may have data transmission delays, it may be arranged that after the AP processor determines that the display screen is on, the HWC in the AP processor keeps the CWB write-back function normally open, and only after the HWC receives the first information reported by the SCP processor does it begin cycling the CWB write-back function on and off according to the start-stop scheme described in any of the above embodiments.
Taking the embodiment shown in fig. 9 as an example, if the start-stop method of the CWB write-back function shown in fig. 17 is adopted:
the HWC may obtain the target image at time t01, and the noise algorithm library also calculates the image noise at time t01;
the HWC may obtain the brightness value to be adjusted at time t02, and the noise algorithm library also calculates the backlight noise at time t02;
the HWC may obtain the target image at time t03, and the noise algorithm library also calculates the image noise at time t03;
the HWC may obtain the target image at time t04, and the noise algorithm library also calculates the image noise at time t04;
the HWC no longer obtains the target image at time t11, and the noise algorithm library does not calculate the image noise at time t11;
the HWC may obtain the brightness value to be adjusted at time t12, but the noise algorithm library does not calculate the backlight noise at time t12.
If the electronic device refreshes images at a frequency of 60 Hz in a scenario where it plays a video, and the CWB write-back function is normally open, the HWC would acquire the target image 300 ms / (1000 ms / 60) = 18 times during the non-integration period (taking 300 ms as an example) of one acquisition cycle (taking 350 ms as an example), and the noise algorithm library would calculate and store image noise 18 times.
With the CWB write-back function start-stop scheme shown in fig. 17, within one acquisition cycle (350 ms), these 18 target-image acquisitions by the HWC and 18 image noise calculations by the noise algorithm library can be eliminated. Clearly, the start-stop scheme shown in fig. 17 reduces power consumption.
However, in the embodiments shown in fig. 15 and fig. 16, suppose time t-1 falls in the non-integration period of the previous acquisition cycle. Since the CWB write-back function is stopped during the non-integration period (steps A4 to A6 are no longer executed), the display subsystem no longer stores the image to be refreshed in the CWB memory; correspondingly, the HWC does not obtain the target image at time t-1, and the noise algorithm library obtains neither the target image at time t-1 nor the image noise at time t-1. Accordingly, there is no image noise for time t-1 in the noise memory. When the noise algorithm library calculates the integral noise for each sub-period, it is then missing the fusion noise that interferes with the initial ambient light from time t0 to time t01. In that case, the noise algorithm library uses the fusion noise stored in the noise memory from before time t-1 as the fusion noise interfering with the initial ambient light from time t0 to time t01, which makes the finally calculated target ambient light inaccurate.
To solve this problem, the CWB write-back function may be controlled to start before the beginning of each integration period, and after it is started the image is forcibly refreshed once. This ensures that the display subsystem stores the image to be refreshed in the CWB memory and that the HWC can obtain the corresponding target image from the CWB memory. Correspondingly, the noise algorithm library also calculates the image noise corresponding to the moment of the forced refresh. The embodiment of the present application records the forcibly refreshed image as the third image.
As described above, before the image is forcibly refreshed, the CWB write-back function has already been started, that is, the matting flag is already the second mark, so the HWC module passes down the matting-required information (which may be recorded as the second information) when sending the image to be forcibly refreshed to the display subsystem. Accordingly, what the display subsystem stores in the CWB memory may be a partial image of the forcibly refreshed image (recorded as the fourth image) or the target image (recorded as the second target image). As mentioned above, the target image can yield corresponding image noise (which may be recorded as the first image noise).
An interface for forcibly refreshing the image exists in the HWC. When the HWC determines that a forced refresh is needed, it calls this interface, and the electronic device performs one forced image refresh. When the HWC calls the interface, a first signal is sent through it to the SurfaceFlinger. After receiving the first signal, the SurfaceFlinger fetches the most recently cached display parameter from its cache of display parameters; this display parameter is recorded as the first display parameter. The SurfaceFlinger sends this display parameter to the HWC module, the HWC calls the underlying hardware to obtain the composed image (this image is the third image) based on the display parameter, and, on finding that the matting flag is the second mark, the HWC carries the matting-required information when sending the composed image to the display subsystem.
In practical applications, the most recently cached display parameter among those cached by the SurfaceFlinger is the display parameter corresponding to the previous image refresh. If the image refreshed before the forced refresh is the first image, the most recently cached display parameter may be the fifth display parameter used to generate the first image, so the third image may be the same as the first image. The image the electronic device forcibly refreshes may therefore be the image currently displayed on its display screen (after the first image was last refreshed, the display screen keeps displaying the first image). The forced refresh follows the same process as a normal refresh: in both cases the image is displayed through the SurfaceFlinger, the HWC, the OLED driver, and the display subsystem. For the specific process, refer to the description of the above embodiments, which is not repeated here.
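The forced-refresh path can be pictured with the following sketch; none of these names are the real SurfaceFlinger or HWC interfaces, they merely mirror the signal flow described above:

```cpp
// Hypothetical sketch of the forced-refresh path. All names are illustrative.
struct DisplayParams { /* cached interface display parameters */ };

class SurfaceFlingerStub {
public:
    // On the first signal, return the most recently cached display
    // parameter (the first display parameter) to the HWC.
    DisplayParams onFirstSignal() const { return latestCached_; }
private:
    DisplayParams latestCached_;
};

class HwcStub {
public:
    void forceRefresh(SurfaceFlingerStub& sf) {
        // Send the first signal and get the latest cached display parameter.
        DisplayParams params = sf.onFirstSignal();
        // Compose the third image via the underlying hardware (stubbed).
        ComposedImage img = composeWithHardware(params);
        // The matting flag is already the second mark here, so the frame
        // is sent down together with the matting-required information.
        sendToDisplaySubsystem(img, /*needsMatting=*/true);
    }
private:
    struct ComposedImage {};
    ComposedImage composeWithHardware(const DisplayParams&) { return {}; }
    void sendToDisplaySubsystem(const ComposedImage&, bool) { /* stubbed */ }
};
```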
In the embodiment of the present application, the purpose of forcibly refreshing the image is to redisplay the image currently shown on the display screen, which is the image most recently refreshed on it. In practical applications, a frame of the image may be cached before the image to be displayed is sent to the display subsystem; this frame can be understood as the image currently displayed on the display screen, i.e. the most recently refreshed image. The HWC retrieves this image from the cache and passes it down to the display subsystem together with the matting-required information. The display subsystem may store the image (or a region of it, or the corresponding target image) in the CWB memory, and the HWC then performs the step of retrieving the target image from the CWB memory.
As mentioned above, if the HWC needs to perform matting on a refreshed image to obtain the target image, the composed image the HWC passes down carries the matting-required information. If the HWC does not need to perform matting on the image currently being refreshed, the HWC may omit the matting-required information (or carry information indicating that no matting is required). The display subsystem uses whether the received image carries the matting-required information as the basis for storing it in the CWB memory: when the received image carries the matting-required information, steps A4 to A6 in the technical architecture shown in fig. 7 are executed; when it does not (or carries information indicating that no matting is required), steps A4 to A6 are no longer executed.
Because the image is forcibly refreshed after the CWB write-back function is started, the data passed down when the AP processor executes steps A2 to A3 in the technical architecture shown in fig. 7 carries the matting-required information.
Referring to fig. 19, the embodiment of the present application provides a start-stop scheme in which the CWB write-back function is started at a first preset time before integration starts, and the image is then forcibly refreshed once. In this embodiment and the following ones, for convenience of drawing, the time at which the CWB write-back function is started and the time at which the image is forcibly refreshed are drawn as the same moment; likewise, the stop time of the CWB write-back function and the integration end time are drawn as the same moment, although in practice the stop time of the CWB write-back function may be later than the integration end time.
As shown in fig. 19, at the moments (t1n, t3n, t5n) corresponding to a first preset time (t2 − t1n, t4 − t3n, t6 − t5n) before the integration start of each acquisition cycle, the CWB write-back function is started and the image is then forcibly refreshed once. Equivalently, these moments (t1n, t3n, t5n) correspond to a second preset time (t1n − t1, t3n − t3, t5n − t5) after the start of the non-integration period of each acquisition cycle. The sum of the first preset time and the second preset time is the duration of a non-integration period.
Taking the first acquisition cycle as an example, at time t1n in the non-integration period of the first acquisition cycle (T1), the HWC in the AP processor controls the CWB write-back function to start and then forces one image refresh. The HWC can obtain the target image corresponding to the image forcibly refreshed at time t1n, the noise algorithm library can calculate the image noise at time t1n, and the noise algorithm library stores the image noise at time t1n in the noise memory. Other acquisition cycles proceed in the same way as this example and are not described again.
To verify that the start-stop scheme of the CWB write-back function shown in fig. 19 does not lose the fusion noise that interferes with the initial ambient light collected during the integration period, see the embodiment shown in fig. 20. In that embodiment, at time t1n in the non-integration period before the integration start of the second acquisition cycle (T2), the HWC in the AP processor controls the CWB write-back function to start and then forces one image refresh. The HWC can obtain the target image corresponding to the image forcibly refreshed at time t1n, and the noise algorithm library can cache the target image at time t1n, calculate the image noise at time t1n, and store that image noise in the noise memory.
From time t1n to the start of the integration period (t2), there is neither a brightness adjustment nor an image refresh.
Suppose there is only one brightness adjustment during the second acquisition cycle (T2): a brightness adjustment at time t21. The noise algorithm library can then obtain the backlight noise at time t21 from the target image corresponding to the image refreshed at time t1n and the brightness adjusted at time t21, and it sends the backlight noise at time t21 to the noise memory.
After the integration period of the second acquisition cycle ends (time t3), the noise memory holds the image noise at time t1n and the backlight noise at time t21.
Referring to fig. 21, the integral noise interfering with the initial ambient light of the second acquisition cycle consists of (written out as a formula below):
the image noise at time t1n, for the duration from time t2 to time t21;
the backlight noise at time t21, for the duration from time t21 to time t3.
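Assuming the duration-weighted combination implied by the two items above (the exact weighting rule follows the earlier embodiments), the integral noise of this cycle can be written as:

```latex
N_{\mathrm{integral}} = (t_{21}-t_{2})\cdot N_{\mathrm{img}}(t_{1n}) + (t_{3}-t_{21})\cdot N_{\mathrm{bl}}(t_{21})
```

where N_img(t1n) is the image noise at time t1n and N_bl(t21) is the backlight noise at time t21; whether a normalization over the integration duration applies also follows the earlier embodiments.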
As can be understood from the embodiment shown in fig. 20, if a start-stop scheme is used that forces the image to be refreshed before integration begins:
when there is no image refresh between the moment of the forced refresh and the next integration start, the noise algorithm library can still obtain the fusion noise (the fusion noise at time t1n) that affects the first sub-period of the integration period (time t2 to time t21);
and, when there is a brightness adjustment between the moment of the forced refresh and the next image refresh (the brightness adjustment at time t21), the target image corresponding to the brightness adjustment moment (the target image at time t1n) can be obtained, so the backlight noise corresponding to the brightness adjustment moment (time t21) is obtained correctly.
When the electronic device plays a video through the display screen, the displayed image may be refreshed at a frequency of 60 Hz, i.e. every 16.7 ms. The acquisition period of the ambient light sensor may be set to 350 ms, the integration period to 50 ms, and the non-integration period to 300 ms. If the CWB write-back function is started shortly before the integration period begins (for example, t2 − t1n = 20 ms) and the image is forcibly refreshed once after it is started, this eliminates, within one acquisition cycle, roughly (300 − 20) / 16.7 = 16.8 acquisitions of the target image by the HWC and the corresponding image noise calculations by the noise algorithm library.
In the above embodiment, t2 − t1n = 20 ms; in practice, t2 − t1n may equal other duration values. The embodiment of the present application sets the duration corresponding to t2 − t1n so as to ensure that the noise algorithm library can obtain the target image once, and the image noise once, before integration starts. The above embodiment can therefore reduce the power consumption of the processor while still obtaining accurate target ambient light.
In practical applications, when the display screen of an electronic device is on, it is not necessarily in the refresh state all the time; it may also stay in an idle state for a long time.
When the display screen is on, it is in one of two states: the idle state or the refresh state. In practical applications, the time at which the display screen last refreshed an image can be obtained, and whether the display screen is currently in the refresh state or the idle state is judged from the difference between the current time and that last refresh time. A threshold may be set in advance: the display screen is currently in the refresh state when the difference between the current time and the last refresh time is smaller than the threshold, and currently in the idle state when the difference is greater than or equal to the threshold.
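As a sketch of this judgment (the threshold value is an assumption chosen for illustration):

```cpp
#include <chrono>

using Clock = std::chrono::steady_clock;

enum class ScreenState { Refresh, Idle };

// Hypothetical idle/refresh classification as described above; the
// threshold is an assumed value, not one given in this application.
constexpr std::chrono::milliseconds kIdleThreshold{100};

ScreenState classify(Clock::time_point now, Clock::time_point lastRefresh) {
    return (now - lastRefresh) < kIdleThreshold ? ScreenState::Refresh
                                                : ScreenState::Idle;
}
```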
The embodiment of the present application does not intend to distinguish strictly between the refresh state and the idle state. The distinction only serves to explain that when the image displayed on the display screen does not change for a long time (the idle state), the displayed image remains the image from the last refresh.
As another example, when a user views an interface of the electronic device, performs no operation for a long time, and the interface contains no animation, the display screen is in the idle state until it is turned off. When the display screen plays a video, it may refresh the image at a frequency of 60 Hz and is in the refresh state. In the embodiment of the present application, while the display screen is in the refresh state, the displayed image may or may not change. In the refresh state, the displayed content can remain unchanged because the image before a refresh, obtained by the AP processor executing steps A1 to A3 in the technical architecture shown in fig. 7, is exactly the same as the image after the refresh, obtained in the same way. In the idle state, the displayed image does not change because the AP processor no longer executes steps A1 to A3 in the technical architecture shown in fig. 7, and the display subsystem keeps sending the image obtained the last time the AP processor executed steps A1 to A3 to the display screen at the preset refresh frequency.
Referring to fig. 22, a schematic diagram of a refresh state and an idle state provided in an embodiment of the present application is shown. In fig. 22, the display screen is always on.
In the TS0 period, the modules in the AP processor cooperate to compose image 1, to be refreshed on the display screen in the TS1 period.
In the TS1 period, the display subsystem sends image 1 to the display screen, and the display screen displays image 1, composed by the modules in the AP processor during the TS0 period; meanwhile, the modules in the AP processor cooperate to compose image 2, to be refreshed on the display screen in the TS2 period.
In the TS2 period, the display subsystem sends image 2 to the display screen, and the display screen displays image 2, composed during the TS1 period; meanwhile, the modules in the AP processor cooperate to compose image 3, to be refreshed in the TS3 period.
In the TS3 period, the display subsystem sends image 3 to the display screen, and the display screen displays image 3, composed during the TS2 period; meanwhile, the modules in the AP processor cooperate to compose image 4, to be refreshed in the TS4 period.
From the start of the TS4 period, the display screen enters the idle state.
In the TS4 period, the display subsystem sends image 4 to the display screen, and the display screen displays image 4, composed by the modules in the AP processor during the TS3 period; the AP processor no longer composes an image to be refreshed.
In the TS5 period, the display subsystem sends image 4 to the display screen, the display screen continues to display image 4, and the AP processor no longer composes an image to be refreshed.
In the TS6 period, the display subsystem sends image 4 to the display screen, the display screen continues to display image 4, and the AP processor no longer composes an image to be refreshed.
In the above process, the TS0 to TS3 periods are the refresh state of the display screen, and the TS4 to TS6 periods are its idle state. From the TS4 period onward, the electronic device performs no image refresh action and the display screen enters the idle state. After it does, the display subsystem still sends image 4, the last image composed by the AP processor, to the display screen for display at the preset frequency (the refresh frequency of the display screen); the displayed image (image 4) is the last image refreshed before the display screen switched to the idle state. Although the display subsystem keeps sending image 4 to the display screen at the preset frequency, the AP processor no longer executes steps A1 to A2 in the technical architecture described in fig. 7.
Of course, in practical applications, the TS0 to TS4 periods may instead be recorded as the refresh state of the display screen, and the TS5 to TS6 periods as its idle state.
With the CWB write-back function started, if the display screen is in the refresh state, the HWC can extract the target image corresponding to the currently refreshed image and, likewise, the corresponding image noise can be obtained. If the display screen is in the idle state, then even with the CWB write-back function started, the AP processor does not perform the image composition of steps A1 to A3 in the embodiment shown in fig. 7. Accordingly, the AP processor no longer executes steps A4 to A6 of that embodiment, so the noise algorithm library receives no target image while the display screen is idle and obtains no image noise during that period.
Suppose the display screen stays idle for a long time, for example 1 minute, during which it does not need to refresh the image. With a start-stop scheme that forces one image refresh before each integration start of the ambient light sensor, a forced refresh would be required every 350 ms within that minute, i.e. about 60000 ms / 350 ms = 171.4 extra image refreshes per minute. When the display screen is idle for a long time, this undoubtedly increases power consumption again.
To understand more clearly why the start-stop scheme that forces an image refresh before each integration start can increase processor power consumption when the display screen is idle for a long time, an example is given in fig. 23.
Referring to fig. 23, the image is refreshed once at time t01 in the integration period of the 1st acquisition cycle; correspondingly, the noise algorithm library stores the image at time t01 and the image noise at time t01.
Referring to fig. 23, after time t01 the display screen does not refresh the image again until time t(2M)1 in the integration period of the (M+1)-th acquisition cycle; this example ignores brightness adjustment.
Referring to fig. 23, after the CWB write-back function is started at time t1n before the integration period of the 2nd acquisition cycle, the image is forcibly refreshed once (for convenience of description, the forced refresh time and the CWB start time are within the same unit of time measurement, for example within 1 ms). The AP processor executes steps A4 to A6 once, and the noise algorithm library obtains the target image at time t1n and the image noise at time t1n.
Referring to fig. 24, the integral noise for the integration period of the 2nd acquisition cycle is: the image noise at time t1n, for the whole integration duration.
Referring to fig. 23, after the CWB write-back function is started at time t3n before the integration period of the 3rd acquisition cycle, the image is forcibly refreshed once, the AP processor executes steps A4 to A6 once, and the noise algorithm library obtains the target image at time t3n and the image noise at time t3n.
Referring to fig. 24, the integral noise for the integration period of the 3rd acquisition cycle is: the image noise at time t3n, for the whole integration duration.
……
Referring to fig. 23, after the CWB write-back function is started at time t(2M-1)n before the integration period of the (M+1)-th acquisition cycle, the image is forcibly refreshed once, the AP processor executes steps A4 to A6 once, and the noise algorithm library obtains the target image at time t(2M-1)n and the image noise at time t(2M-1)n.
Referring to fig. 23, the image is refreshed at time t(2M)1 in the integration period of the (M+1)-th acquisition cycle, the AP processor executes steps A1 to A6 once, and the noise algorithm library obtains the target image at time t(2M)1 and the image noise at time t(2M)1.
Referring to fig. 24, the integral noise for the integration period of the (M+1)-th acquisition cycle is: the image noise at time t(2M-1)n, for the duration from t2M to time t(2M)1, plus the image noise at time t(2M)1, for the duration from time t(2M)1 to t2M+1.
Under the start-stop scheme of the embodiment shown in fig. 19, in the embodiments shown in figs. 23 and 24, from time t0 to time t2M (M acquisition cycles), the image is forcibly refreshed M times in total.
Suppose instead that the image is not forcibly refreshed after the CWB write-back function is started (at t1n, t3n, ..., t(2M-1)n). Then, referring to fig. 25, the integral noise for the 2nd acquisition cycle is the image noise at time t01, for the integration duration; the integral noise for the 3rd acquisition cycle is the image noise at time t01, for the integration duration; and the integral noise for the (M+1)-th acquisition cycle is the image noise at time t01, for the duration from t2M to time t(2M)1, plus the image noise at time t(2M)1, for the duration from time t(2M)1 to t2M+1.
As mentioned above, the forced refresh does not change the image displayed on the display screen: the image refreshed at time t01 and the images refreshed at the forced-refresh moments are the same, and accordingly the target image at time t01 and the target images at the forced-refresh moments are the same. If brightness adjustment is ignored, the image noise at time t01 and the image noise at the forced-refresh moments are also the same. If there is a brightness adjustment, the target image used at the brightness adjustment moment is likewise unchanged. Therefore, in some scenarios it is not necessary to force a refresh of the image.
As can be understood from the above analysis, when the display screen is in the idle state for a long time, the image noise that may interfere with the integration period is not lost even if the image is not forcibly refreshed.
Of course, in the above embodiment, if time t01 were in the non-integration period of the first acquisition cycle, the noise algorithm library would not obtain the target image and image noise at time t01, and the image would need to be forcibly refreshed at time t1n.
In combination with the above various embodiments, the embodiments of the present application provide the technical solution shown in fig. 26. The embodiment shown in fig. 26 comprises the following steps:
Step 2601, the HWC in the AP processor starts the CWB write-back function at a first preset time before integration starts, and checks the time at which the display screen last refreshed an image.
In the embodiment of the present application, whether or not the display screen needs a forced refresh, the CWB write-back function needs to be started at the first preset time before integration starts; other factors are then combined to decide whether the image must be forcibly refreshed.
For convenience of description, referring to fig. 27, the moment corresponding to the first preset time before the integration start of one acquisition cycle (T2) is selected as the reference moment, namely t3n. At this moment (t3n), the embodiment of the present application starts the CWB write-back function and checks the time at which the display screen last refreshed an image.
For convenience of description, the time at which the display screen last refreshed an image can be recorded as tk.
As an example, when the last image refresh is not within the current non-integration period, the image is not forcibly refreshed; the electronic device simply waits for the upper-layer application to deliver the display parameters of the interface through the display engine service, the SurfaceFlinger, the HWC, and so on. For the HWC, this means waiting for a display parameter sent by the SurfaceFlinger (which may be recorded as the second display parameter).
The HWC module receives the display parameters sent by the SurfaceFlinger module of the electronic device and stores the time of reception together with the display parameter (the sixth display parameter). To obtain the time at which the electronic device last refreshed an image, the HWC module can take the time at which it last received a display parameter from the SurfaceFlinger module. The sixth display parameter may be set to the last display parameter the HWC module acquired before that moment; accordingly, the time of the last image refresh is the time at which the HWC module received the sixth display parameter.
Step 2602, if the time of the display screen's last image refresh is within the current non-integration period, wait for a second preset duration (which may be recorded as the second duration).
In the embodiment of the present application, a difference between the time of the last image refresh and the current time that is greater than a difference threshold can be understood as the display screen having entered the idle state, and a difference less than or equal to the difference threshold can be understood as the display screen still being in the refresh state. The difference threshold may be determined from empirical values.
The focus of the embodiment of the present application is to obtain the time of the last image refresh (or, where the HWC performed matting, the time at which the HWC started matting from the last refreshed image) and to decide, from that time, whether the image needs to be forcibly refreshed.
The focus of the embodiment of the present application is not to confirm the current state of the display screen; the state is discussed only to make it easier to understand why power consumption increases in the idle state in the above embodiments.
Referring to the embodiment shown in fig. 27, if the time tk of the last image refresh is within the current non-integration period (between t3 and t3n), then the image displayed at time t3n is the image refreshed at time tk. Since tk lies between t3 and t3n, a period during which the CWB write-back function is stopped, the HWC has not acquired the target image at time tk, and correspondingly the noise algorithm library has not obtained the image noise at time tk. If the image is not forcibly refreshed at this point, the following situations may occur:
(1) After time t3n, the first change of the displayed content (image refresh or brightness adjustment) is a brightness adjustment at time tb.
If tb is between time t3n and time t4, and there is no image refresh or brightness adjustment between tb and t4, the backlight noise at time tb interferes with the initial ambient light of the integration period from time t4 to time t5. When the backlight noise at time tb is calculated, the latest target image cached by the HWC is not the target image at time tk but one from before time tk, so the calculated backlight noise at time tb is wrong, and the target ambient light calculated by the noise algorithm library for the integration period from time t4 to time t5 is inaccurate.
If tb is at time t4, the backlight noise at time tb likewise interferes with the initial ambient light of the integration period from time t4 to time t5. When the backlight noise at time tb is calculated, the latest target image cached by the HWC is not the target image at time tk, so the calculated backlight noise at time tb is wrong, and the target ambient light calculated by the noise algorithm library for the integration period from time t4 to time t5 is inaccurate.
If tb is between time t4 and time t5, the initial ambient light between time t4 and time tb is interfered with by the image noise at time tk. Since the HWC did not obtain the image noise at time tk, the integration process uses the previous fusion noise from before time tk (possibly backlight noise, possibly image noise) as the fusion noise interfering with the initial ambient light between time t4 and time tb, so the target ambient light calculated by the noise algorithm library for the integration period from time t4 to time t5 is inaccurate. In addition, the backlight noise at time tb also interferes with the initial ambient light of that integration period; when it is calculated, the latest target image cached by the HWC is not the target image at time tk, so the calculated backlight noise is wrong, and again the target ambient light calculated for the integration period from time t4 to time t5 is inaccurate.
(2) After time t3n, the first change of the displayed content (image refresh or brightness adjustment) is an image refresh at time tb. If tb is between time t4 and time t5, the initial ambient light between time t4 and time tb is interfered with by the image noise at time tk, but the noise algorithm library did not calculate the image noise at time tk. When calculating the target ambient light from time t4 to time t5, the noise algorithm library uses the fusion noise stored in the noise memory from before time tk as the fusion noise interfering with the first sub-period of the integration period, so the calculated target ambient light for the integration period from time t4 to time t5 is inaccurate.
From the above analysis it can be understood that if the time of the display screen's last image refresh is within the current non-integration period, the image needs to be forcibly refreshed to obtain the target image corresponding to the image currently displayed on the display screen and its corresponding image noise; this target image and image noise can of course be understood as the target image and image noise at time t3n.
As another embodiment of the present application, if the time when the image is refreshed on the display screen last time is not within the current non-integration time period, the image is not forced to be refreshed.
In the embodiment of the present application, if the time tk of the display screen's last image refresh is not between time t3 and time t3n, it may lie in the integration period of the current acquisition cycle, or in the previous acquisition cycle or an earlier one.
If tk is within the integration period of this acquisition cycle, the CWB write-back function was started during that integration period, so the HWC can obtain the target image at time tk, and the noise algorithm library can also obtain the target image and image noise at time tk; the image therefore does not need to be forcibly refreshed.
If tk is in the previous acquisition cycle or an earlier one, the procedure of the embodiment shown in fig. 26 was already executed in that acquisition cycle, so whether to forcibly refresh the image does not need to be considered. This application subsequently verifies, by means of figs. 28 to 30, that no forced refresh is needed in this situation (tk in the previous acquisition cycle or an earlier one); refer to the description of figs. 28 to 30.
In the embodiment of the present application, the methods for judging whether the time tk of the display screen's last image refresh is within the non-integration period of the current cycle may be as shown in fig. 27.
Method 1: compare T22 (t3n − tk) with T21 (t3n − t3). If T22 (t3n − tk) is less than T21 (t3n − t3), the time of the display screen's last image refresh is within the current non-integration period; otherwise it is not. In this embodiment, T22 may be recorded as the first difference and T21 as the second difference.
Method 2: compare tk with t3. If tk is greater than t3 and less than t3n, the time of the display screen's last image refresh is within the current non-integration period; otherwise it is not. In the embodiment of the present application, if an image refresh occurs at time t3, the CWB write-back function does not stop until after time t3; thus, when there is an image refresh at time t3, the HWC can retrieve the target image at time t3 and the noise algorithm library can also acquire the image noise of the target image at time t3. Therefore, the case where T22 (t3n − tk) equals T21 (t3n − t3) may be treated as the moment of switching to the idle state falling within the current integration period; likewise, tk equal to t3 may be treated as that moment falling within the current integration period.
Method 3: check whether the difference between the time of the last image refresh and the time of the last matting is less than a certain threshold (because the image refresh time and the time at which the HWC starts to fetch the target image may differ). If the difference is less than the threshold, the last refreshed image has been matted by the HWC, which indicates that the last refresh is not within the current non-integration period. If the difference is greater than or equal to the threshold, the last refreshed image has not been matted by the HWC, which indicates that it is within the current non-integration period. The threshold is set according to the actual situation; in this embodiment it may be recorded as the first threshold.
In the embodiment of the present application, the moment at which the HWC acquires the display parameters of the interface from the SurfaceFlinger, the moment at which the HWC obtains the composed image through the underlying hardware, or the moment at which the display subsystem sends the image for display may each be taken as the time of the current image refresh. Whichever is chosen, there may be a slight difference between the time of the current refresh and the time at which the HWC starts matting from the refreshed image, for example 0.5 ms, 0.8 ms, or 1 ms; of course, the two may also be equal. Adjacent refreshes are normally separated by one refresh period: at a refresh frequency of 60 Hz the refresh period is 1000 ms / 60 = 16.7 ms, and at 120 Hz it is 1000 ms / 120 = 8.3 ms. The threshold in this example can therefore be a value that is small relative to the refresh period, for example 2 ms.
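Method 2 (and, by simple subtraction, method 1) reduces to a one-line predicate; the sketch below uses hypothetical names:

```cpp
#include <chrono>

using Millis = std::chrono::milliseconds;

// Hypothetical check for method 2 above: is the last image refresh (tk)
// inside the current non-integration period (t3, t3n)?
bool lastRefreshInNonIntegration(Millis tk, Millis t3, Millis t3n) {
    // tk == t3 counts as refreshed while the CWB write-back function was
    // still started, so it is treated as outside the non-integration period.
    return tk > t3 && tk < t3n;
}
// Method 1 is the same predicate expressed with the differences
// T22 = t3n - tk and T21 = t3n - t3:  T22 < T21  <=>  tk > t3.
```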
Step 2603, if the electronic device refreshes the image within the second preset duration, the electronic device does not forcibly refresh the image.
When the electronic device refreshes the image, it indicates that the HWC can receive the display parameter sent by the Surface flag (the display parameter is denoted as a fourth display parameter). The currently refreshed image may be denoted as a fifth image.
Step 2603', if the electronic device does not refresh the image while waiting for the second preset time period, the image is forcibly refreshed.
In this embodiment of the application, if the electronic device is always refreshing the image, a second preset time period (for example, 17 ms) has a refresh action, and the HWC has already acquired the latest target image, and then lags the second preset time to decide whether to perform the forced refresh of the image, so that an additional action of forcibly refreshing the image can be avoided, and power consumption can be further reduced.
The HWC module waits for a second preset time period, and if the display parameter (which may be referred to as a third display parameter) sent by the Surface flicker is not received, the image needs to be forcibly refreshed.
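As an illustration only, the deferred forced-refresh decision of steps 2603/2603' could look like the following C++ sketch; the ForcedRefreshGate class, its member names, and the 17 ms default are hypothetical, not the application's actual implementation:

    #include <chrono>
    #include <condition_variable>
    #include <mutex>

    // Sketch: after the CWB write-back function is re-enabled, wait up
    // to a second preset duration (~one refresh period) for a natural
    // refresh before deciding to force one.
    class ForcedRefreshGate {
    public:
        // Called by the HWC when SurfaceFlinger delivers display parameters.
        void onDisplayParameters() {
            std::lock_guard<std::mutex> lk(m_);
            refreshed_ = true;
            cv_.notify_all();
        }

        // Returns true if a forced refresh is needed, i.e. no natural
        // refresh arrived within the wait window (e.g. 17 ms at 60 Hz).
        bool shouldForceRefresh(std::chrono::milliseconds wait =
                                    std::chrono::milliseconds(17)) {
            std::unique_lock<std::mutex> lk(m_);
            refreshed_ = false;
            cv_.wait_for(lk, wait, [this] { return refreshed_; });
            return !refreshed_;
        }

    private:
        std::mutex m_;
        std::condition_variable cv_;
        bool refreshed_ = false;
    };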
Three examples are given below (each assuming the HWC performs matting every time the display refreshes an image) to verify whether, when the display screen stays in an idle state for a long time (that is, in the above embodiments, when t_k falls in the previous acquisition cycle or an even earlier one), all of the image noise and backlight noise interfering with the initial ambient light collected during each integration period can still be obtained.
Referring to fig. 28, the moment the image was last refreshed on the display screen lies in the integration period of the previous acquisition cycle, and the image is never refreshed again. The previous acquisition cycle is a first acquisition cycle, and the current acquisition cycle is a second acquisition cycle. In this example, the CWB write-back function is enabled at times t_1n, t_2n, and t_3n.
In the first acquisition cycle, the ambient light sensor starts to collect the initial ambient light at time t_0, at which point the CWB write-back function is already enabled. The image on the display screen is refreshed for the last time at time t_k, and the noise algorithm library obtains the image noise at time t_k.
At time t_1n, the moment of the last image refresh is not within the current non-integration period. The image displayed at t_1n is the image displayed at t_k; ignoring backlight adjustment, the image noise at t_1n is the image noise at t_k. Since the noise algorithm library has already acquired the image noise at t_k, the image is not forcibly refreshed at t_1n.
In the second acquisition cycle, there is no image refresh and brightness adjustment is ignored. After the integration of the second acquisition cycle ends, the fusion noise interfering with that integration period is entirely the image noise at time t_k.
At time t_3n, the moment of the last image refresh is likewise not within the current non-integration period. The image displayed at t_3n is the image displayed at t_k; ignoring backlight adjustment, the image noise at t_3n is the image noise at t_k. Since the HWC has already obtained the image noise at t_k, the image is not forcibly refreshed at t_3n.
In the third acquisition cycle, there is no image refresh and brightness adjustment is ignored. Processing continues according to the processing procedure in the second acquisition cycle.
As can be understood from this example, if the image is refreshed last time in the integration period of the last acquisition cycle and the image is not refreshed again, the image noise interfering with each integration period can be obtained without forcibly refreshing the image.
Referring to fig. 29, the moment the image was last refreshed on the display screen lies in the non-integration period of the previous acquisition cycle, before time t_1n, and the image is never refreshed again. The previous acquisition cycle is a first acquisition cycle, and the current acquisition cycle is a second acquisition cycle. In this example, the CWB write-back function is enabled at times t_1n, t_2n, and t_3n.
In the first acquisition cycle, the ambient light sensor starts to collect the initial ambient light at time t_0, at which point the CWB write-back function is already enabled. The display screen refreshes the image for the last time at time t_k, when the CWB write-back function is in a stopped state, so the HWC does not obtain the image noise at t_k.
At time t_1n, the moment of the last image refresh is within the non-integration period, so the image needs to be forcibly refreshed (that moment becomes the new last-refresh moment t_k') in order to obtain the image noise at t_1n.
Of course, in practical applications, even after a decision is made to forcibly refresh a new image, the method may first wait for a certain duration: if the display screen refreshes the image at its refresh frequency during that wait, no forced refresh is needed; if no refresh is observed after the wait, the image can then be forcibly refreshed. This example assumes a forced refresh.
In the second acquisition cycle there is no image refresh and brightness adjustment is ignored; after the integration of the second acquisition cycle ends, the fusion noise interfering with that integration period is entirely the image noise at time t_1n.
At time t_3n, the moment of the last image refresh is not within the current non-integration period. The image displayed at t_3n is the image displayed at t_1n; ignoring backlight adjustment, the image noise at t_3n is the image noise at t_1n. Since the noise algorithm library has already acquired the noise at t_1n, the image is not forcibly refreshed at t_3n.
In the third acquisition cycle, there is no image refresh and brightness adjustment is ignored. Processing continues according to the processing procedure in the second acquisition cycle.
As can be understood from this example, if the moment the image was last refreshed on the display screen precedes the enabling of the CWB write-back function in the non-integration period of the previous acquisition cycle, and the image is never refreshed again, then after the single forced refresh at t_1n the image noise interfering with the initial ambient light of each subsequent integration period can be obtained without further forced refreshes.
Referring to fig. 30, the moment the image was last refreshed on the display screen lies in the non-integration period of the previous acquisition cycle, after time t_1n and before time t_2, and the image is never refreshed again. The previous acquisition cycle is a first acquisition cycle, and the current acquisition cycle is a second acquisition cycle. In this example, the CWB write-back function is enabled at times t_1n, t_2n, and t_3n.
In the first acquisition cycle, the CWB write-back function is enabled at time t_1n.
At time t_k the display screen refreshes the image, so the noise algorithm library can acquire the image noise of the image displayed at t_k.
In the second acquisition cycle, there is no image refresh and brightness adjustment is ignored. After the integration of the second acquisition cycle ends, the fusion noise interfering with that integration period is the image noise at time t_k.
At time t_3n, the moment of the last image refresh is not within the current non-integration period. The image displayed at t_3n is the image displayed at t_k; ignoring backlight adjustment, the image noise at t_3n is the image noise at t_k. Since the noise algorithm library has already acquired the noise at t_k, the image is not forcibly refreshed at t_3n.
In the third acquisition cycle, there is no image refresh and brightness adjustment is ignored. Processing continues according to the processing procedure in the second acquisition cycle.
As can be understood from this example, if the moment of the last image refresh follows the enabling of the CWB write-back function in the non-integration period of the previous acquisition cycle and the image is never refreshed again, the image noise interfering with the initial ambient light of each integration period can be obtained without forcibly refreshing the image.
As the examples of fig. 28 to fig. 30 show: if the moment of the last image refresh on the display screen is not within the current non-integration period, the image does not need to be forcibly refreshed.
In the embodiment of the present application, with the flow shown in fig. 28, on the basis of reducing processor power consumption it is possible to avoid failing to obtain the image noise that interferes with an integration period, to avoid failing to obtain the target image used when computing the backlight noise that interferes with an integration period, and to avoid the negative gain that would otherwise occur when the display screen stays in the idle state for a long time.
As described above, the HWC in the AP processor may monitor changes of the data in the kernel node during both the integration period and the non-integration period; when the data stored in the kernel node changes, the HWC obtains the brightness to be adjusted from the kernel node and transmits it to the noise algorithm library to calculate the backlight noise.
In practice, the method can be performed as follows.
While the CWB write-back function is stopped, when the HWC detects a change of the data in the kernel node it can still obtain the brightness to be adjusted; however, it no longer transmits that brightness value to the noise algorithm library, and accordingly the noise algorithm library does not calculate the corresponding backlight noise.
When the CWB write-back function is about to be enabled (for example, 1 ms before it is enabled) or is enabled, if the HWC detected no change of the data stored in the kernel node while the function was stopped, the brightness value of the display screen has not changed, and the HWC simply performs the start-stop method of the CWB write-back function provided in any of the embodiments above.
When the CWB write-back function is about to be enabled or is enabled, if the HWC did detect a change of the data stored in the kernel node while the function was stopped, the brightness value of the display screen has changed, and the HWC needs to send the latest monitored brightness value to the noise algorithm library before performing the start-stop method of the CWB write-back function provided in any of the embodiments above. If there were multiple brightness adjustments while the CWB write-back function was stopped, the noise algorithm library only needs the value after the last adjustment; that is, the HWC sends only the brightness to be adjusted corresponding to the most recent brightness change of the display screen. This prevents the noise algorithm library from frequently calculating backlight noise for intermediate brightness values, thereby reducing power consumption; a sketch of this coalescing follows.
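A minimal C++ sketch of this coalescing behavior, assuming hypothetical names (BrightnessCoalescer, sendToNoiseLibrary) and a simplified hand-off to the noise algorithm library:

    #include <cstdint>
    #include <optional>

    // While the CWB write-back function is stopped, brightness changes
    // observed on the kernel node are only recorded; when the function
    // is about to be re-enabled, just the most recent value is forwarded
    // to the noise algorithm library.
    class BrightnessCoalescer {
    public:
        // Called whenever the kernel node reports a new brightness value.
        void onBrightnessChanged(int32_t brightness, bool cwbEnabled) {
            if (cwbEnabled) {
                sendToNoiseLibrary(brightness);   // normal path
            } else {
                pending_ = brightness;            // coalesce: keep latest only
            }
        }

        // Called just before (or when) the CWB write-back function restarts.
        void onCwbEnabling() {
            if (pending_) {
                sendToNoiseLibrary(*pending_);    // one call for N changes
                pending_.reset();
            }
        }

    private:
        void sendToNoiseLibrary(int32_t brightness) {
            // Placeholder for the real hand-off to the noise algorithm library.
            (void)brightness;
        }
        std::optional<int32_t> pending_;
    };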
As one example: after the HWC module sets a matting flag to the first character, the HWC module monitors whether data in a kernel node of the electronic device changes, the kernel node storing a brightness value;
in response to a change of data in the kernel node of the electronic device, the HWC module obtains a first brightness value from the kernel node;
after the HWC module obtains the first brightness value from the kernel node, the HWC module obtains a second brightness value from the kernel node in response to a further change of data in the kernel node;
in response to reaching a first time, the HWC module sends the second brightness value to the noise algorithm library.
When the noise algorithm library calculates the image noise corresponding to the forcibly refreshed image, it calculates a first image noise based on the target image corresponding to the forcibly refreshed image and the second brightness value.
After the HWC module sets the matting flag to a second character, the HWC module monitors whether data in the kernel node of the electronic device changes, the kernel node storing a brightness value;
in response to a change of data in the kernel node of the electronic device, the HWC module obtains a third brightness value from the kernel node;
the HWC module sends the third brightness value to the noise algorithm library;
after the HWC module sends the third brightness value to the noise algorithm library, the HWC module obtains a fourth brightness value from the kernel node in response to a further change of data in the kernel node;
the HWC module sends the fourth brightness value to the noise algorithm library.
In the above embodiments, the HWC may or may not force the image to be refreshed.
If the HWC forcibly refreshes the image, the image noise corresponding to the forcibly refreshed image is calculated from the brightness value most recently passed to the noise algorithm library and the target image corresponding to the forcibly refreshed image. The backlight noise corresponding to the brightness adjustment moment then does not interfere with the initial ambient light collected in the next integration period; the fusion noise interfering with that initial ambient light may be the image noise corresponding to the forcibly refreshed image, in which case the image noise value is correct and the target ambient light of the next integration period is not corrupted.
If the HWC does not forcibly refresh the image, the target image of the image currently displayed on the display screen is already stored in the noise algorithm library (as the latest frame of target image).
If the brightness adjustment moment is earlier than the refresh moment of the currently displayed image, the backlight noise corresponding to the brightness adjustment moment does not interfere with the initial ambient light collected in the next integration period. The fusion noise interfering with that initial ambient light may be the image noise corresponding to the currently displayed image; its value is correct, and it does not corrupt the target ambient light of the next integration period.
If the brightness adjustment moment is later than the refresh moment of the currently displayed image, the image noise corresponding to that refresh moment does not interfere with the initial ambient light collected in the next integration period. The fusion noise interfering with that initial ambient light may be the backlight noise at the latest brightness adjustment moment, generated from the latest target image already acquired and the most recently adjusted brightness value; this ordering is sketched below.
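The ordering described above might be condensed into the following illustrative C++ sketch; the enum and function names are assumptions of this sketch, and the real decision also depends on whether a forced refresh occurred:

    #include <cstdint>

    // Which fusion noise interferes with the next integration period,
    // based on the ordering of the last brightness adjustment and the
    // last image refresh (timestamps in milliseconds, names hypothetical).
    enum class NoiseSource { ImageNoise, BacklightNoise };

    NoiseSource interferingNoise(int64_t lastBrightnessAdjustTime,
                                 int64_t lastImageRefreshTime) {
        // Brightness adjusted before the current image was refreshed:
        // the image noise of the currently displayed image dominates.
        if (lastBrightnessAdjustTime < lastImageRefreshTime) {
            return NoiseSource::ImageNoise;
        }
        // Brightness adjusted after the refresh: the backlight noise
        // computed from the latest target image and the newest
        // brightness value dominates.
        return NoiseSource::BacklightNoise;
    }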
Thus, whenever the CWB write-back function is re-enabled, and regardless of whether the image was forcibly refreshed, if the HWC detected a brightness change while the CWB write-back function was stopped, it sends the latest monitored brightness value to the noise algorithm library when the function is about to be enabled; the noise algorithm library then calculates the backlight noise, and the HWC continues to perform the start-stop method of the CWB write-back function provided in any of the embodiments above.
As another embodiment of the present application, while the CWB write-back function is enabled, the HWC may fetch the target image from the CWB write-back memory only once every other frame.
As an example, when the display screen refreshes at a frequency of 90 Hz, i.e. once every 1000 ms/90 = 11.11 ms, fetching the target image every other frame works as follows:
when the electronic device refreshes the image for the i-th time (taking the i-th refreshed image as one that is matted), after the HWC obtains the synthesized image it checks the matting flag, finds it to be the second character, determines that this refreshed image is a matting frame, and continues with the subsequent steps of the above embodiments;
when the image is refreshed for the (i+1)-th time, after the HWC obtains the synthesized image it checks the matting flag and again finds it to be the second character (while the CWB write-back function is enabled, the matting flag is the second character). The HWC computes the time difference between the moment it last determined a matting frame (the i-th refresh) and the current moment; since this difference is smaller than the matting-frame difference threshold (11.11 ms, or another value such as 11.5 ms, 11.8 ms, or 12 ms), the image refreshed immediately before the (i+1)-th one was already a matting frame, so the (i+1)-th refreshed image is not matted;
when the image is refreshed for the (i+2)-th time, after the HWC obtains the synthesized image it checks the matting flag, finds it to be the second character, and computes the time difference between the moment it last determined a matting frame (still the i-th refresh) and the current moment; since this difference is greater than or equal to the matting-frame difference threshold, the (i+2)-th refreshed image is a matting frame.
In the above example, the time difference is the difference between the moment the HWC last determined a matting frame and the current moment. In practice it may instead be the difference between the moment the HWC last transmitted the synthesized image (carrying the information to be matted) to the OLED driver and the current moment, or between the moment the HWC last started matting and the current moment. These ways of obtaining the time difference are only examples, and other definitions may be used in practical applications; with a different definition of the time difference, the matting-frame difference threshold changes accordingly.
As an example, the interval between two refreshes is theoretically 11.1 ms, and the current moment is the moment the HWC finds the matting flag to be the second character. If the time difference is measured from the moment the last matting frame was determined, it is theoretically 11.1 ms (the previous frame was a matting frame) or 22.2 ms (the previous frame was not); the matting-frame difference threshold may therefore be any value between 11.1 and 22.2 ms. If the time difference is measured from the moment the HWC last transmitted the synthesized image (carrying the information to be matted) to the OLED driver, it is theoretically (11.1 - t) ms or (22.2 - t) ms, where t is the gap between the moment the HWC determines the matting frame and the moment it transmits the synthesized image to the OLED driver; the threshold may then be any value between 11.1 - t and 22.2 - t.
If the display is refreshed at a 120 Hz rate, the 11.1 ms in the above example becomes 8.3 ms (1000 ms/120). Thus, in inter-frame matting, the matting-frame difference threshold is also related to the current refresh frequency of the display screen, as the sketch below assumes.
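For illustration only, inter-frame matting could be sketched as follows in C++; the class name, the mid-way threshold of 1.5 refresh periods, and the chosen time base (the moment the last matting frame was determined) are assumptions of this sketch:

    #include <cstdint>

    // A refresh becomes a matting frame only if enough time has passed
    // since the last matting frame. The threshold scales with the
    // refresh period: 1.5 periods lies between one and two periods,
    // e.g. 16.7 ms at 90 Hz (between 11.1 and 22.2 ms).
    class InterFrameMatting {
    public:
        explicit InterFrameMatting(double refreshHz)
            : thresholdMs_(1.5 * 1000.0 / refreshHz) {}

        // Called for each refreshed frame while the matting flag is the
        // second character; returns true if this frame should be matted.
        bool shouldMat(int64_t nowMs) {
            if (nowMs - lastMattingMs_ >= thresholdMs_) {
                lastMattingMs_ = nowMs;  // this refresh is a matting frame
                return true;
            }
            return false;  // previous frame was already a matting frame
        }

    private:
        double thresholdMs_;
        int64_t lastMattingMs_ = INT64_MIN / 2;  // far in the past initially
    };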
Of course, the above-mentioned time points are only used as examples and do not set any limit to the present application.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
In the embodiment of the present application, the electronic device may be divided into functional units according to the above method examples; for example, each functional unit may correspond to one function, or two or more functions may be integrated into one unit. The integrated unit can be implemented in the form of hardware or in the form of a software functional unit. It should be noted that the division of units in the embodiment of the present application is schematic and is only a logical function division; there may be other division manners in actual implementation. The following description takes the division of functional units corresponding to respective functions as an example.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the steps in the above-mentioned method embodiments may be implemented.
Embodiments of the present application further provide a computer program product, which when run on an electronic device, enables the electronic device to implement the steps in the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to a first device, including a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, or a magnetic or optical disk. In certain jurisdictions, in accordance with legislation and patent practice, the computer-readable medium may not be an electrical carrier signal or a telecommunications signal.
The embodiments of the present application further provide a chip system, where the chip system includes a processor, the processor is coupled to a memory, and the processor executes a computer program stored in the memory to implement the steps of any of the method embodiments of the present application. The chip system may be a single chip or a chip module composed of a plurality of chips.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present application, and they should be construed as being included in the present application.

Claims (36)

1. A method for monitoring noise, applied to an electronic device, the electronic device comprising: a HWC module, a display subsystem, and a library of noise algorithms, the method comprising:
in response to receiving first information, the HWC module sets a write-back flag to a first flag, wherein while the write-back flag is the first flag the display subsystem stops storing images to a write-back memory of the electronic device;
in response to receiving a first image, the HWC module sends the first image to the display subsystem;
the display subsystem stops storing a second image to the write-back memory of the electronic device, wherein the second image is an image corresponding to an area on the first image that contains a first target image, the first target image is an image within a first area, and the first area is an area on a display screen of the electronic device located above an ambient light sensor of the electronic device;
in response to reaching a first time, the HWC module sets the write-back flag to a second flag, wherein while the write-back flag is the second flag the display subsystem starts storing images to the write-back memory of the electronic device;
The HWC module obtains a third image;
the HWC module sends the third image to the display subsystem;
in response to receiving the third image, the display subsystem stores a fourth image to a write-back memory of the electronic device, where the fourth image is an image corresponding to an area on the third image that includes a second target image, and the second target image is an image in the first area;
the HWC module acquires the second target image from the write-back memory;
the HWC module sends the second target image to a noise algorithm library;
and the noise algorithm base calculates and obtains first image noise based on the second target image.
2. The method of claim 1, wherein the first information comprises a first duration, the first duration being a duration for which the display subsystem stops storing images to the write-back memory, and the first time is: the moment at which the first duration has elapsed after the write-back flag was set to the first flag;
or the first information comprises the first duration, a first value and a second time, the first duration being a duration for which the display subsystem stops storing images to the write-back memory, and the second time being the end time at which the ambient light sensor of the electronic device collected the first value; the first time is the moment at which a second duration has elapsed after the write-back flag was set to the first flag, the second duration being the first duration minus a delay duration, and the delay duration being the duration from the second time to the moment the HWC module received the first information.
3. The method of claim 1 or 2, wherein the HWC module acquiring the third image comprises:
the HWC module sends a first signal to a SurfaceFlinger of the electronic device;
in response to receiving the first signal, the SurfaceFlinger acquires cached first display parameters and sends the first display parameters to the HWC module, wherein the first display parameters are the most recently cached display parameters among the display parameters cached by the SurfaceFlinger;
the HWC module derives the third image based on the first display parameters.
4. The method of any of claims 1 to 3, wherein after the HWC module sets the write-back flag to the second flag and before the HWC module acquires the third image, the method further comprises:
the HWC module acquires the moment when the electronic device last refreshed the image;
and if the moment when the electronic device last refreshed the image satisfies a first preset condition, the HWC module acquires the third image.
5. The method of claim 4, wherein after the HWC module obtains the moment when the electronic device last refreshed the image, the method further comprises:
if the moment when the electronic device last refreshed the image does not satisfy the first preset condition, the HWC module waits for a SurfaceFlinger module of the electronic device to send a second display parameter.
6. The method of claim 4, wherein the HWC module acquiring the first image if the moment when the electronic device last refreshed the image satisfies the first preset condition comprises:
if the moment when the electronic device last refreshed the image satisfies the first preset condition, the HWC module waits for a second duration;
and if the HWC module does not receive a third display parameter sent by the SurfaceFlinger within the second duration, the HWC module acquires the first image.
7. The method of claim 6, wherein the method further comprises:
if the HWC module receives a fourth display parameter sent by the SurfaceFlinger within the second duration, the HWC module acquires a fifth image based on the fourth display parameter;
the HWC module queries the write-back flag as the second flag;
the HWC module sends the fifth image and third information to the display subsystem based on the second flag;
In response to receiving the fifth image and the third information, the display subsystem stores a sixth image comprising a third target image on the fifth image in a write-back memory of the electronic device, the third target image being an image within the first region;
the HWC module acquires the third target image from the write-back memory;
the HWC module sends the third target image to a noise algorithm library;
and the noise algorithm library calculates and obtains second image noise based on the third target image.
8. The method of any of claims 4 to 7, wherein the first information comprises a first value and a second time, the second time being the end time at which an ambient light sensor of the electronic device collected the first value;
the moment when the electronic device last refreshed the image satisfying the first preset condition comprises:
the moment when the electronic device last refreshed the image being later than the second time;
the moment when the electronic device last refreshed the image not satisfying the first preset condition comprises:
the moment when the electronic device last refreshed the image being earlier than or equal to the second time.
9. The method of any of claims 4 to 7, wherein the first information further comprises a first value and a second time, the second time being the end time at which an ambient light sensor of the electronic device collected the first value, and the moment when the electronic device last refreshed the image satisfying the first preset condition comprises:
a first difference between the moment when the electronic device last refreshed the image and the current moment being smaller than a second difference between the second time and the current moment;
the moment when the electronic device last refreshed the image not satisfying the first preset condition comprises:
a first difference between the moment when the electronic device last refreshed the image and the current moment being greater than or equal to a second difference between the second time and the current moment.
10. The method of any of claims 4 to 7, wherein the moment when the electronic device last refreshed the image satisfying the first preset condition comprises:
a difference between the moment when the electronic device last refreshed the image and the moment when the HWC module last obtained the target image being smaller than a first threshold, the target image being the image displayed in the area of the display screen above the ambient light sensor of the electronic device;
the moment when the electronic device last refreshed the image not satisfying the first preset condition comprises:
a difference between the moment when the electronic device last refreshed the image and the moment when the HWC module last obtained the target image being greater than or equal to the first threshold.
11. The method of any one of claims 1 to 10, further comprising:
after the HWC module sets the write-back flag to the first flag, the HWC module monitors whether data in a kernel node of the electronic device changes, the kernel node storing a brightness value;
in response to monitoring a change of data in the kernel node of the electronic device, the HWC module obtains a first brightness value from the kernel node;
after the HWC module obtains the first brightness value from the kernel node, the HWC module obtains a second brightness value from the kernel node in response to monitoring a further change of data in the kernel node;
in response to reaching a first time, the HWC module sends the second brightness value to the noise algorithm library.
12. The method of claim 11, wherein the method further comprises:
after the HWC module sets the write-back flag to the second flag, the HWC module monitors whether data in the kernel node of the electronic device changes, the kernel node storing a brightness value;
in response to monitoring a change of data in the kernel node of the electronic device, the HWC module obtains a third brightness value from the kernel node;
the HWC module sends the third brightness value to the noise algorithm library;
after the HWC module sends the third brightness value to the noise algorithm library, in response to monitoring a further change of data in the kernel node of the electronic device, the HWC module obtains a fourth brightness value from the kernel node;
the HWC module sends the fourth brightness value to the noise algorithm library.
13. The method of claim 11 or 12, wherein the computing of the first image noise based on the second target image by the noise algorithm library comprises:
the noise algorithm library calculating the first image noise based on the second target image and the second brightness value.
14. The method of any of claims 1 to 13, wherein the HWC module receiving the first image comprises:
the HWC module receives a fifth display parameter sent by a SurfaceFlinger module of the electronic device;
the HWC module derives the first image based on the fifth display parameter.
15. The method of any of claims 1-14, wherein the first area is an area on a display screen of the electronic device that is above an ambient light sensor of the electronic device.
16. The method of claim 4, wherein before the HWC module obtains the moment when the electronic device last refreshed the image, the method comprises:
the HWC module receives a sixth display parameter sent by a SurfaceFlinger module of the electronic device;
the HWC module stores the moment at which the HWC module received the sixth display parameter;
and the HWC module acquiring the moment when the electronic device last refreshed the image comprises:
the HWC module obtains the stored moment at which the sixth display parameter was received, the moment at which the sixth display parameter was received being the most recent receipt moment of a display parameter stored by the HWC module before the moment when the electronic device last refreshed the image is acquired.
17. The method of claim 3, wherein the first display parameter comprises one or more of: the position, size, and color of the interface used to synthesize the third image on the display screen of the electronic device, and its storage address.
18. A method for monitoring noise, applied to an electronic device, the electronic device including a first processor, the method comprising:
The first processor receives first information, and the first information is used for instructing the first processor to stop acquiring a target image from a refreshed image;
after the first processor receives the first information, in response to receiving a first image, the first processor stops acquiring a first target image from the first image, wherein the first target image is an image in a first area;
after the first time is reached, the first processor acquires a third image;
the first processor acquires a second target image from the third image, wherein the second target image is an image in the first area.
19. The method of claim 18, wherein the method further comprises:
in response to receiving the first information, the first processor sets a write-back flag to a first flag through an HWC module of the electronic device;
the first processor, in response to receiving a first image, ceasing to acquire the first target image from the first image comprises:
in response to receiving the first image, the first processor queries, through the HWC module, the write-back flag as the first flag;
the first processor sends, through the HWC module, the first image to a display subsystem of the electronic device based on the first flag;
the first processor stops, through the display subsystem, storing to a write-back memory of the electronic device a second image corresponding to an area on the first image that contains a first target image, wherein the first target image is an image within the first area;
the method further comprises the following steps:
in response to reaching a first time, the first processor setting, by the HWC module, the write-back flag to a second flag;
the first processor acquiring the third image and acquiring the second target image from the third image, the second target image being an image within the first area, comprises:
the first processor obtaining a third image through the HWC module;
the first processor querying, by the HWC module, the write-back flag as the second flag;
the first processor sending, by the HWC module, the third image and second information to the display subsystem based on the second flag, the second information instructing the display subsystem to store a fourth image on the third image that includes a second target image in a write-back memory of the electronic device;
in response to receiving the third image and the second information, the first processor stores, by the display subsystem, a fourth image including a second target image on the third image to a write-back memory of the electronic device, where the second target image is an image in the first area;
The first processor retrieves the second target image from the write-back memory through the HWC module;
the method further comprises the following steps:
the first processor sending, by the HWC module, the second target image to a noise algorithm library;
the first processor obtains first image noise through calculation of the noise algorithm base based on the second target image.
20. The method of claim 19, wherein the first information comprises a first duration, the first duration being a duration for which the display subsystem stops storing images to the write-back memory, and the first time is: the moment at which the first duration has elapsed after the write-back flag was set to the first flag;
or the first information comprises the first duration, a first value and a second time, the first duration being a duration for which the display subsystem stops storing images to the write-back memory, and the second time being the end time at which an ambient light sensor of the electronic device collected the first value; the first time is the moment at which a second duration has elapsed after the write-back flag was set to the first flag, the second duration being the first duration minus a delay duration, and the delay duration being the duration from the second time to the moment the HWC module received the first information.
21. The method of claim 19 or 20, wherein the first processor acquiring the third image through the HWC module comprises:
the first processor sends a first signal to a SurfaceFlinger of the electronic device through the HWC module;
in response to receiving the first signal, the SurfaceFlinger acquires cached first display parameters and sends the first display parameters to the HWC module, wherein the first display parameters are the most recently cached display parameters among the display parameters cached by the SurfaceFlinger;
the HWC module derives the third image based on the first display parameters.
22. The method of any of claims 19 to 21, wherein after the first processor sets the write-back flag to the second flag through the HWC module and before the first processor acquires the third image through the HWC module, the method further comprises:
the first processor acquires, through the HWC module, the moment when the electronic device last refreshed the image;
and if the moment when the electronic device last refreshed the image satisfies a first preset condition, the first processor acquires the third image through the HWC module.
23. The method of claim 22, wherein after the first processor obtains, through the HWC module, the moment when the electronic device last refreshed the image, the method further comprises:
if the moment when the electronic device last refreshed the image does not satisfy the first preset condition, the first processor waits, through the HWC module, for a SurfaceFlinger module of the electronic device to send a second display parameter.
24. The method of claim 22, wherein the first processor acquiring the first image through the HWC module if the moment when the electronic device last refreshed the image satisfies the first preset condition comprises:
if the moment when the electronic device last refreshed the image satisfies the first preset condition, the first processor waits for a second duration through the HWC module;
and if the HWC module does not receive a third display parameter sent by the SurfaceFlinger within the second duration, the first processor acquires the first image through the HWC module.
25. The method of claim 24, wherein the method further comprises:
if the HWC module receives a fourth display parameter sent by the SurfaceFlinger within the second duration, the first processor acquires a fifth image based on the fourth display parameter through the HWC module;
The first processor querying, via the HWC module, the write back flag as the second flag;
the first processor sending, by the HWC module, the fifth image and third information to the display subsystem based on the second flag;
in response to receiving the fifth image and the third information, the first processor stores, by the display subsystem, a sixth image including a third target image on the fifth image in a write-back memory of the electronic device, the third target image being an image within the first region;
the first processor retrieves the third target image from the write-back memory through the HWC module;
the first processor sending, by the HWC module, the third target image to a noise algorithm library of the electronic device;
and the first processor calculates and obtains second image noise based on the third target image through the noise algorithm library.
26. The method of any of claims 22 to 25, wherein the first information comprises a first value and a second time, the second time being the end time at which an ambient light sensor of the electronic device collected the first value;
the moment when the electronic device last refreshed the image satisfying the first preset condition comprises:
the moment when the electronic device last refreshed the image being later than the second time;
the moment when the electronic device last refreshed the image not satisfying the first preset condition comprises:
the moment when the electronic device last refreshed the image being earlier than or equal to the second time;
or, the moment when the electronic device last refreshed the image satisfying the first preset condition comprises:
a first difference between the moment when the electronic device last refreshed the image and the current moment being smaller than a second difference between the second time and the current moment;
the moment when the electronic device last refreshed the image not satisfying the first preset condition comprises:
a first difference between the moment when the electronic device last refreshed the image and the current moment being greater than or equal to a second difference between the second time and the current moment;
or, the moment when the electronic device last refreshed the image satisfying the first preset condition comprises:
a difference between the moment when the electronic device last refreshed the image and the moment when the HWC module last obtained the target image being smaller than a first threshold, the target image being the image displayed in the area of the display screen above the ambient light sensor of the electronic device;
the moment when the electronic device last refreshed the image not satisfying the first preset condition comprises:
a difference between the moment when the electronic device last refreshed the image and the moment when the HWC module last obtained the target image being greater than or equal to the first threshold.
27. The method of any one of claims 19 to 26, further comprising:
after the first processor sets the write-back flag to the first flag through the HWC module, the first processor monitors, through the HWC module, whether data in a kernel node of the electronic device changes, the kernel node storing a brightness value;
in response to monitoring a change of data in the kernel node of the electronic device, the first processor obtains a first brightness value from the kernel node through the HWC module;
after the first processor obtains the first brightness value from the kernel node through the HWC module, the first processor obtains a second brightness value from the kernel node through the HWC module in response to monitoring a further change of data in the kernel node;
in response to reaching a first time, the first processor sends the second brightness value to the noise algorithm library through the HWC module.
28. The method of claim 27, wherein the method further comprises:
after the first processor sets the write-back flag to the second flag through the HWC module, the first processor monitors, through the HWC module, whether data in the kernel node of the electronic device changes, the kernel node storing a brightness value;
in response to monitoring a change of data in the kernel node of the electronic device, the first processor obtains a third brightness value from the kernel node through the HWC module;
the first processor sends the third brightness value to the noise algorithm library through the HWC module;
after the first processor sends the third brightness value to the noise algorithm library through the HWC module, in response to monitoring a further change of data in the kernel node of the electronic device, the first processor obtains a fourth brightness value from the kernel node through the HWC module;
the first processor sends the fourth brightness value to the noise algorithm library through the HWC module.
29. The method of claim 27, wherein the first processor obtaining a first image noise based on the second target image calculation via the noise algorithm library comprises:
the first processor calculates the first image noise based on the second target image and the second brightness value through the noise algorithm library.
30. The method of any of claims 19 to 29, wherein the first processor receiving, by the HWC module, the first image comprises:
the first processor receives, through the HWC module, a fifth display parameter sent by a SurfaceFlinger module of the electronic device;
the first processor obtains, by the HWC module, the first image based on the fifth display parameter.
31. The method of any of claims 18 to 30, wherein the first area is an area on a display screen of the electronic device that is located above an ambient light sensor of the electronic device.
32. The method of claim 22, wherein before the first processor acquires, through the HWC module, the moment when the electronic device last refreshed the image, the method comprises:
the first processor receives, through the HWC module, a sixth display parameter sent by a SurfaceFlinger module of the electronic device;
the first processor stores, through the HWC module, the moment at which the HWC module received the sixth display parameter;
and the first processor acquiring, through the HWC module, the moment when the electronic device last refreshed the image comprises:
the first processor acquires, through the HWC module, the stored moment at which the sixth display parameter was received, the moment at which the sixth display parameter was received being the most recent receipt moment of a display parameter stored by the HWC module before the moment when the electronic device last refreshed the image is acquired.
33. The method of claim 21, wherein the first display parameter comprises: the position, size, and color of the interface used to synthesize the third image on the display screen of the electronic device, and its storage address.
34. An electronic device, characterized in that the electronic device comprises a first processor for executing a computer program stored in a memory, to cause the electronic device to implement the method of any of claims 1 to 17 or the method of any of claims 18 to 33.
35. A chip system comprising a first processor coupled to a memory, the first processor executing a computer program stored in the memory to implement the method of any of claims 18 to 33.
36. A computer-readable storage medium, in which a computer program is stored which, when run on a processor, implements the method of any one of claims 1 to 17 or the method of any one of claims 18 to 33.
CN202211137769.9A 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system Pending CN115564668A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211137769.9A CN115564668A (en) 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211137769.9A CN115564668A (en) 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system
CN202110606261.8A CN113808030B (en) 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202110606261.8A Division CN113808030B (en) 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system

Publications (1)

Publication Number Publication Date
CN115564668A true CN115564668A (en) 2023-01-03

Family

ID=78942437

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110606261.8A Active CN113808030B (en) 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system
CN202211137769.9A Pending CN115564668A (en) 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202110606261.8A Active CN113808030B (en) 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system

Country Status (1)

Country Link
CN (2) CN113808030B (en)


Also Published As

Publication number Publication date
CN113808030B (en) 2022-09-30
CN113808030A (en) 2021-12-17


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination