CN113808030A - Noise monitoring method, electronic equipment and chip system


Info

Publication number: CN113808030A
Application number: CN202110606261.8A
Authority: CN (China)
Prior art keywords: image, time, HWC, module, processor
Legal status: Granted (assumed status; not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN113808030B (granted publication)
Inventors: 张文礼, 汤中峰, 黄邦邦, 王思文, 张佳祥
Current assignee: Honor Device Co Ltd
Original assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Priority to CN202211137769.9A (published as CN115564668A)
Priority to CN202110606261.8A (granted as CN113808030B)
Current legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/70: Denoising; Smoothing
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01J: MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00: Photometry, e.g. photographic exposure meter
    • G01J1/42: Photometry, e.g. photographic exposure meter using electric radiation detectors
    • G01J1/4204: Photometry, e.g. photographic exposure meter using electric radiation detectors with determination of ambient light
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/0002: Inspection of images, e.g. flaw detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10004: Still image; Photographic image


Abstract

The embodiment of the application provides a noise monitoring method, an electronic device, and a chip system, relates to the technical field of ambient light sensors, and can solve the problem of excessive power consumption of the electronic device. The method comprises the following steps: an ambient light sensor of the electronic device collects ambient light periodically; before each collection of ambient light begins, a memory write-back function is started so that image noise during the collection can be obtained; after each collection of ambient light ends, the memory write-back function is stopped so that the electronic device does not compute image noise outside the collection period; power consumption is reduced by cyclically starting and stopping the memory write-back function. Because the noise interfering with the ambient light may be related to the image displayed on the display screen at the start of a collection, the image can be forcibly refreshed after the memory write-back function is started to obtain the image currently displayed on the display screen, and thereby the noise interfering with the ambient light.

Description

Noise monitoring method, electronic equipment and chip system
Technical Field
The embodiments of the application relate to the technical field of ambient light sensors, and in particular to a control method of an electronic device, an electronic device, and a chip system.
Background
With the development of electronic devices, the screen-to-body ratio of electronic devices has become higher and higher. In pursuit of a high screen-to-body ratio, the ambient light sensor of an electronic device may be disposed below its OLED (Organic Light-Emitting Diode) screen. Because the OLED screen itself emits light, the ambient light collected by an ambient light sensor disposed below the OLED screen includes the light emitted by the screen itself, making the collected ambient light inaccurate.
Currently, in order to measure ambient light accurately, the ambient light collected by the ambient light sensor and the noise generated by the display screen of the electronic device may both be obtained. The true ambient light is then derived from the ambient light collected by the sensor and the noise generated by the display screen. In this method, the noise generated by the display screen is related to the image displayed by the display screen, so the displayed image needs to be acquired.
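As a minimal illustration of this idea (the patent gives no formula; the additive model and the names below are assumptions made for illustration only), the true ambient light can be recovered by subtracting the display's contribution from the sensor reading:

```cpp
// Minimal sketch of the additive-noise model described above.
// The additive assumption and all names are illustrative, not from the patent.
#include <algorithm>
#include <iostream>

// True ambient light = sensor reading minus the noise contributed by the
// OLED panel above the sensor (clamped at zero).
double trueAmbientLux(double sensorReadingLux, double displayNoiseLux) {
    return std::max(0.0, sensorReadingLux - displayNoiseLux);
}

int main() {
    // e.g. the sensor reports 120 lux while the panel contributes 35 lux
    std::cout << trueAmbientLux(120.0, 35.0) << " lux\n";  // prints 85
}
```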
Disclosure of Invention
The embodiment of the application provides a control method of an electronic device, the electronic device, and a chip system, which solve the problem of excessive power consumption when the electronic device acquires noise.
To achieve this, the following technical solutions are adopted:
in a first aspect, an embodiment of the present application provides a noise monitoring method, applied to an electronic device, where the electronic device includes an HWC module, a display subsystem, and a noise algorithm library, and the method includes:
in response to receiving the first information, the HWC module sets a write-back flag to a first flag;
in response to receiving the first image, the HWC module queries the write-back flag as the first flag;
the HWC module sends the first image to the display subsystem based on the first flag;
the display subsystem stops storing, to a write-back memory of the electronic device, a second image that includes a first target image on the first image, where the first target image is an image in a first area;
in response to reaching a first time, the HWC module sets the write back flag to a second flag;
the HWC module acquires a third image;
the HWC module queries the write-back flag as the second flag;
the HWC module sends the third image and second information to the display subsystem based on the second flag, the second information instructing the display subsystem to store, in a write-back memory of the electronic device, a fourth image that includes a second target image on the third image;
In response to receiving the third image and the second information, the display subsystem stores, to a write-back memory of the electronic device, a fourth image that includes a second target image on the third image, the second target image being an image within the first region;
the HWC module acquires the second target image from the write-back memory;
the HWC module sends the second target image to a noise algorithm library;
and the noise algorithm library calculates first image noise based on the second target image.
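The patent publishes no source code, so the following C++ sketch is purely illustrative (every type and function name is invented); it condenses the steps above into the two-state gate they describe: frames arriving while the write-back flag is the first flag go to the display only, and once the flag becomes the second flag a frame is also written back, cropped to the first area, and handed to the noise algorithm library:

```cpp
// Illustrative sketch of the write-back gating described in the first aspect.
// All names are invented; the patent specifies behavior, not an API.
#include <cstdint>
#include <cstdio>
#include <vector>

struct Image { int width = 0, height = 0; std::vector<uint8_t> pixels; };
struct Rect  { int x = 0, y = 0, w = 0, h = 0; };  // the "first area" above the ALS

enum class WriteBackFlag { First, Second };  // stop vs. start memory write-back

class HwcModule {
public:
    // First information received: acquisition ended, stop write-back.
    void onFirstInformation() { flag_ = WriteBackFlag::First; }
    // First time reached: next acquisition is imminent, start write-back.
    void onFirstTimeReached() { flag_ = WriteBackFlag::Second; }

    // Called for every composed frame.
    void onFrame(const Image& frame) {
        if (flag_ == WriteBackFlag::First) {
            sendToDisplay(frame, /*writeBack=*/false);  // display only, no noise work
            return;
        }
        sendToDisplay(frame, /*writeBack=*/true);       // DSS also stores the frame
        Image target = cropToRegion(frame, alsRegion_); // take the target image
        computeImageNoise(target);                      // hand to the noise library
    }

private:
    void sendToDisplay(const Image&, bool writeBack) {
        std::printf("frame sent, write-back %s\n", writeBack ? "on" : "off");
    }
    static Image cropToRegion(const Image&, const Rect& r) {
        Image out;                                  // stub crop: right-sized blank
        out.width = r.w; out.height = r.h;
        out.pixels.assign(static_cast<size_t>(r.w) * r.h * 4, 0);
        return out;
    }
    void computeImageNoise(const Image&) { /* noise algorithm library stub */ }

    WriteBackFlag flag_ = WriteBackFlag::First;
    Rect alsRegion_{0, 0, 64, 64};  // position/size above the ALS: placeholders
};

int main() {
    HwcModule hwc;
    hwc.onFirstInformation();
    hwc.onFrame(Image{});       // write-back stopped: no noise computed
    hwc.onFirstTimeReached();
    hwc.onFrame(Image{});       // written back, cropped, noise computed
}
```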
In the embodiment of the application, after the ambient light sensor finishes each collection of ambient light, the SCP processor may send the first information to the AP processor; on the AP processor side, the HWC module sets the write-back flag to the first flag, and the memory write-back function is stopped. Before the ambient light sensor next collects ambient light, for example at the first time, the HWC module may set the write-back flag to the second flag and may also force a refresh of an image, such as the third image. When the write-back flag is the second flag, the memory write-back function is started: if an image is refreshed, the target image of the refreshed image (that is, the target image on the third image) is obtained, the HWC sends that target image to the noise algorithm library, and the noise algorithm library calculates the image noise. In this way, the HWC is controlled to obtain the target image of the currently refreshed image only while the ambient light sensor is collecting ambient light; when an image is refreshed outside that period, the HWC no longer obtains its target image. In practice, whether the HWC obtains the target image from the currently refreshed image is controlled by the write-back flag. According to the embodiment of the application, power consumption is reduced by cyclically starting and stopping the memory write-back function.
In a possible implementation manner of the first aspect, the first information includes a first duration, where the first duration is the duration for which the display subsystem stops storing images to the write-back memory; the first time is the time at which the first duration has elapsed after the write-back flag was set to the first flag;
or the first information includes a first duration, a first value, and a second time, where the first duration is the duration for which the display subsystem stops storing images to the write-back memory, and the second time is the end time at which the ambient light sensor of the electronic device collected the first value; the first time is the time at which a second duration has elapsed after the write-back flag was set to the first flag, where the second duration is the first duration minus a delay, and the delay is the time at which the HWC module received the first information minus the second time.
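A worked numeric example of the two timing variants may help (all timestamps are invented; milliseconds). In the second variant the delay between the end of acquisition and the arrival of the first information is deducted, so the write-back function restarts a fixed interval after acquisition actually ended rather than after the flag was set:

```cpp
// Worked example of the two ways the first time can be derived.
// All timestamps are invented for illustration (milliseconds).
#include <cstdio>

int main() {
    // Variant 1: the first information carries only the first duration.
    long tFlagSet   = 1000;   // write-back flag set to the first flag
    long firstDur   = 300;    // how long write-back stays stopped
    long firstTime1 = tFlagSet + firstDur;            // 1300

    // Variant 2: the first information also carries the second time (end of
    // the last acquisition). The delay between that end and the moment the
    // HWC received the information is deducted, so the stop window still
    // ends firstDur after the acquisition actually ended (990 + 300 = 1290).
    long tSecond    = 990;    // acquisition end time (second time)
    long tReceive   = 1000;   // HWC received the first information
    long delay      = tReceive - tSecond;             // 10
    long secondDur  = firstDur - delay;               // 290
    long firstTime2 = tFlagSet + secondDur;           // 1290

    std::printf("variant 1: %ld, variant 2: %ld\n", firstTime1, firstTime2);
}
```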
In one possible implementation of the first aspect, the HWC module obtaining the third image includes:
the HWC module sends a first signal to the SurfaceFlinger of the electronic device;
in response to receiving the first signal, the SurfaceFlinger acquires cached first display parameters and sends them to the HWC module, where the first display parameters are the most recently cached display parameters among those cached by the SurfaceFlinger;
the HWC module derives the third image based on the first display parameter.
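The handshake can be pictured as follows. This is a hypothetical C++ sketch, not SurfaceFlinger's real interface: the HWC sends the first signal, SurfaceFlinger answers with its most recently cached display parameters, and the HWC recomposes the third image from them:

```cpp
// Hypothetical sketch of the forced refresh; real SurfaceFlinger APIs differ
// and every identifier here is invented.
#include <cstdio>
#include <deque>
#include <optional>

struct DisplayParams {      // cf. the display-parameter fields described later
    int x = 0, y = 0, w = 0, h = 0;
    unsigned long bufferAddr = 0;
};
struct Image { int w = 0, h = 0; };

class SurfaceFlingerStub {
public:
    void cache(const DisplayParams& p) { cache_.push_back(p); }
    // Handler for the first signal: hand back the newest cached parameters.
    std::optional<DisplayParams> onFirstSignal() const {
        if (cache_.empty()) return std::nullopt;
        return cache_.back();
    }
private:
    std::deque<DisplayParams> cache_;
};

// The HWC recomposes the currently displayed frame from the parameters,
// yielding the "third image" even though no application triggered a refresh.
Image composeThirdImage(const DisplayParams& p) { return Image{p.w, p.h}; }

int main() {
    SurfaceFlingerStub sf;
    sf.cache({0, 0, 1080, 2400, 0x1000});
    if (auto p = sf.onFirstSignal()) {              // HWC sends the first signal
        Image third = composeThirdImage(*p);
        std::printf("third image %dx%d\n", third.w, third.h);
    }
}
```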
In one possible implementation manner of the first aspect, after the HWC module sets the write-back flag to be the second flag and before the HWC module acquires the third image, the method further includes:
the HWC module acquires the time when an image was last refreshed on the electronic device;
and if the time when the electronic device last refreshed an image meets a first preset condition, the HWC module acquires the third image.
In one possible implementation manner of the first aspect, after the HWC module obtains the time when the electronic device last refreshed an image, the method further includes:
if the time when the electronic device last refreshed an image does not meet the first preset condition, the HWC module waits for the SurfaceFlinger module of the electronic device to send a second display parameter.
In a possible implementation manner of the first aspect, if the time when the electronic device last refreshed an image meets the first preset condition, the HWC module acquiring the third image includes:
if the time when the electronic device last refreshed an image meets the first preset condition, the HWC module waits for a second duration;
and if the HWC module does not receive a third display parameter sent by the SurfaceFlinger within the second duration, the HWC module acquires the third image.
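In effect the HWC performs a timed wait on the display-parameter channel and falls back to a forced refresh on timeout. A minimal sketch of that wait, with invented identifiers:

```cpp
// Sketch of the timed wait for a natural refresh; all identifiers invented.
#include <chrono>
#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <optional>
#include <utility>

struct DisplayParams { int seq = 0; };

class RefreshWaiter {
public:
    // SurfaceFlinger side: an app refresh produced new display parameters.
    void deliver(DisplayParams p) {
        std::lock_guard<std::mutex> lk(m_);
        pending_ = p;
        cv_.notify_one();
    }
    // HWC side: wait up to the second duration for a natural refresh.
    std::optional<DisplayParams> waitSecondDuration(std::chrono::milliseconds d) {
        std::unique_lock<std::mutex> lk(m_);
        if (cv_.wait_for(lk, d, [this] { return pending_.has_value(); }))
            return std::exchange(pending_, std::nullopt);  // refresh arrived
        return std::nullopt;  // timed out: caller forces a refresh instead
    }
private:
    std::mutex m_;
    std::condition_variable cv_;
    std::optional<DisplayParams> pending_;
};

int main() {
    RefreshWaiter w;
    auto r = w.waitSecondDuration(std::chrono::milliseconds(20));
    std::puts(r ? "natural refresh: use its image"
                : "no refresh: force-refresh the third image");
}
```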
In a possible implementation manner of the first aspect, the method further includes:
if the HWC module receives a fourth display parameter sent by the SurfaceFlinger within the second duration, the HWC module acquires a fifth image based on the fourth display parameter;
the HWC module queries the write-back flag as the second flag;
the HWC module sends the fifth image and third information to the display subsystem based on the second flag;
in response to receiving the fifth image and the third information, the display subsystem stores a sixth image comprising a third target image on the fifth image in a write-back memory of the electronic device, the third target image being an image within the first region;
the HWC module acquires the third target image from the write-back memory;
the HWC module sends the third target image to a noise algorithm library;
and the noise algorithm library calculates second image noise based on the third target image.
In a possible implementation manner of the first aspect, the first information includes a first value and a second time, where the second time is an end time when an ambient light sensor of the electronic device acquires the first value;
that the time when the electronic device last refreshed an image meets the first preset condition includes:
the time when the electronic device last refreshed an image is later than the second time;
that the time when the electronic device last refreshed an image does not meet the first preset condition includes:
the time when the electronic device last refreshed an image is earlier than or equal to the second time.
In a possible implementation manner of the first aspect, the first information further includes a first value and a second time, where the second time is the end time at which the ambient light sensor of the electronic device collected the first value, and that the time when the electronic device last refreshed an image meets the first preset condition includes:
a first difference between the time of the last image refresh and the current time is smaller than a second difference between the second time and the current time;
that the time when the electronic device last refreshed an image does not meet the first preset condition includes:
the first difference between the time of the last image refresh and the current time is greater than or equal to the second difference between the second time and the current time.
In a possible implementation manner of the first aspect, that the time when the electronic device last refreshed an image meets the first preset condition includes:
the difference between the time when the electronic device last refreshed an image and the time when the HWC module last obtained the target image is smaller than a first threshold, where the target image is the image displayed in the area of the display screen above the ambient light sensor of the electronic device;
that the time when the electronic device last refreshed an image does not meet the first preset condition includes:
the difference between the time when the electronic device last refreshed an image and the time when the HWC module last obtained the target image is greater than or equal to the first threshold.
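Read together, the three variants are simple timestamp comparisons. The sketch below (C++; names invented, and one plausible reading of the third variant, since the claim wording leaves the exact comparison implicit) writes each variant as a predicate:

```cpp
// The three variants of the "first preset condition" as predicates.
// Timestamps in milliseconds; variant 3 shows one plausible reading.
#include <cstdio>
#include <cstdlib>

// Variant 1: the last refresh happened after the last acquisition ended.
bool variant1(long lastRefresh, long secondTime) {
    return lastRefresh > secondTime;
}
// Variant 2: the last refresh is closer to the current time than the
// acquisition end time is.
bool variant2(long lastRefresh, long secondTime, long now) {
    return (now - lastRefresh) < (now - secondTime);
}
// Variant 3: the last refresh and the HWC's last target-image capture lie
// within the first threshold of each other.
bool variant3(long lastRefresh, long lastCapture, long firstThreshold) {
    return std::labs(lastRefresh - lastCapture) < firstThreshold;
}

int main() {
    std::printf("%d %d %d\n", variant1(1200, 990),
                variant2(1200, 990, 1300), variant3(1200, 1190, 50));
}
```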
In a possible implementation manner of the first aspect, the method further includes:
after the HWC module sets the write-back flag to the first flag, the HWC module monitors whether data in a kernel node of the electronic device changes, where the kernel node stores a brightness value;
in response to detecting a change in the data in the kernel node of the electronic device, the HWC module obtains a first brightness value from the kernel node;
after the HWC module obtains the first brightness value from the kernel node, in response to detecting a change in the data in the kernel node, the HWC module obtains a second brightness value from the kernel node;
in response to reaching the first time, the HWC module sends the second brightness value to the noise algorithm library.
In a possible implementation manner of the first aspect, the method further includes:
after the HWC module sets the write-back flag to the second flag, the HWC module monitors whether data in a kernel node of the electronic device changes, where the kernel node stores a brightness value;
in response to detecting a change in the data in the kernel node of the electronic device, the HWC module obtains a third brightness value from the kernel node;
the HWC module sends the third brightness value to the noise algorithm library;
after the HWC module sends the third brightness value to the noise algorithm library, in response to detecting a change in the data in the kernel node, the HWC module obtains a fourth brightness value from the kernel node;
the HWC module sends the fourth brightness value to the noise algorithm library.
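On Linux-based devices a backlight value is commonly exposed through a sysfs node; the sketch below watches such a node and forwards changed values, mirroring the monitor-then-fetch steps above. The node path is an assumption (the patent only says the kernel node stores a brightness value), and a production implementation would more likely use poll(2) or uevent notification than the portable busy loop shown:

```cpp
// Hedged sketch: watch a sysfs brightness node for changes and forward new
// values. The node path is an assumption, not taken from the patent.
#include <chrono>
#include <fstream>
#include <iostream>
#include <string>
#include <thread>

int readBrightness(const std::string& node) {
    std::ifstream f(node);
    int value = -1;
    f >> value;              // the node holds a single integer brightness value
    return value;
}

int main() {
    const std::string node = "/sys/class/backlight/panel0-backlight/brightness";
    int last = readBrightness(node);
    for (int i = 0; i < 100; ++i) {            // bounded loop for the sketch
        std::this_thread::sleep_for(std::chrono::milliseconds(50));
        int now = readBrightness(node);
        if (now != last) {                     // data in the kernel node changed
            std::cout << "send brightness " << now << " to noise library\n";
            last = now;
        }
    }
}
```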
In one possible implementation manner of the first aspect, the calculating, by the noise algorithm library, the first image noise based on the second target image includes:
the noise algorithm library calculates the first image noise based on the second target image and the second brightness value.
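The patent does not disclose the noise formula, so the stand-in below (mean channel intensity of the second target image scaled by the backlight level) is only meant to show what flows into the computation:

```cpp
// Simplified stand-in for the noise computation: mean channel intensity of
// the target image scaled by the backlight level. The real formula is not
// disclosed; this only illustrates the inputs.
#include <cstdint>
#include <cstdio>
#include <numeric>
#include <vector>

double imageNoise(const std::vector<uint8_t>& rgbPixels, int brightness /*0-255*/) {
    if (rgbPixels.empty()) return 0.0;
    double mean = std::accumulate(rgbPixels.begin(), rgbPixels.end(), 0.0)
                  / static_cast<double>(rgbPixels.size());
    return mean * (brightness / 255.0);   // brighter backlight -> more leakage
}

int main() {
    std::vector<uint8_t> target(64 * 64 * 3, 128);   // mid-gray target image
    std::printf("noise = %.2f\n", imageNoise(target, 200));
}
```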
In one possible implementation of the first aspect, the HWC module receiving the first image includes:
the HWC module receives a fifth display parameter sent by the SurfaceFlinger module of the electronic device;
the HWC module derives the first image based on the fifth display parameter.
In a possible implementation manner of the first aspect, the first area is an area on a display screen of the electronic device, which is located above an ambient light sensor of the electronic device.
In one possible implementation manner of the first aspect, the method further includes:
the HWC module receives a sixth display parameter sent by the SurfaceFlinger module of the electronic device;
the HWC module stores the time at which it received the sixth display parameter;
and the HWC module acquiring the time when the electronic device last refreshed an image includes:
the HWC module obtains the stored time at which the sixth display parameter was received, where that time is the most recent display-parameter reception time stored by the HWC module before it acquires the time when the electronic device last refreshed an image.
In a possible implementation manner of the first aspect, the first display parameter includes: the position, size, and color of the interface used to compose the third image on the display screen of the electronic device, and its storage address.
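Read as a data structure, the first display parameter could look like the following; every field name is invented for illustration:

```cpp
// Illustrative layout of the first display parameter; field names invented.
#include <cstdint>
#include <cstdio>

struct FirstDisplayParam {
    int32_t  x, y;            // position of the interface on the display
    int32_t  width, height;   // size of the interface
    uint32_t colorFormat;     // color, e.g. an RGBA pixel-format identifier
    uint64_t bufferAddress;   // storage address of the composed third image
};

int main() { std::printf("%zu bytes\n", sizeof(FirstDisplayParam)); }
```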
In a second aspect, an embodiment of the present application provides a noise monitoring method, which is applied to an electronic device, where the electronic device includes: a first processor, the method comprising:
the first processor receives first information;
after the first processor receives the first information, in response to receiving a first image, the first processor stops acquiring a first target image from the first image, wherein the first target image is an image in a first area;
after the first time is reached, the first processor acquires a third image;
the first processor acquires a second target image from the third image, wherein the second target image is an image in the first area.
In the embodiment of the application, the ambient light sensor of the electronic device collects ambient light periodically. Before each collection of ambient light begins, the memory write-back function is started to obtain the target image of the refreshed image, and thereby the image noise during the collection. After each collection ends, the memory write-back function is stopped, so that the electronic device does not compute image noise outside the collection period. Power consumption is reduced by cyclically starting and stopping the memory write-back function. Because the noise interfering with the ambient light may be related to the image displayed on the display screen at the start of a collection, after the memory write-back function is started, an image (that is, the third image) may be forcibly refreshed to obtain the image currently displayed on the display screen, and hence the noise interfering with the ambient light.
In one possible implementation manner of the second aspect, the method further includes:
in response to receiving the first information, the first processor sets, through an HWC module of the electronic device, a write-back flag to a first flag;
and that, in response to receiving the first image, the first processor stops acquiring the first target image from the first image includes:
in response to receiving a first image, the first processor querying, by the HWC module, that the write-back flag is a first flag;
the first processor sending, by the HWC module, the first image to a display subsystem of the electronic device based on the first flag;
the first processor, through the display subsystem, stops storing to a write-back memory of the electronic device a second image that includes a first target image on the first image, where the first target image is an image in the first area;
the method further comprises the following steps:
in response to reaching a first time, the first processor setting, by the HWC module, the write-back flag to a second flag;
and that the first processor acquires a third image and acquires a second target image from the third image, the second target image being an image in the first area, includes:
The first processor obtaining a third image through the HWC module;
the first processor querying, by the HWC module, the write-back flag as the second flag;
the first processor sending, through the HWC module, the third image and second information to the display subsystem based on the second flag, the second information instructing the display subsystem to store, in a write-back memory of the electronic device, a fourth image that includes a second target image on the third image;
in response to receiving the third image and the second information, the first processor stores, by the display subsystem, a fourth image including a second target image on the third image to a write-back memory of the electronic device, where the second target image is an image in the first area;
the first processor retrieves the second target image from the write-back memory through the HWC module;
the method further comprises the following steps:
the first processor sending, by the HWC module, the second target image to a noise algorithm library;
the first processor calculates, through the noise algorithm library, first image noise based on the second target image.
In a possible implementation manner of the second aspect, the first information includes a first duration, where the first duration is the duration for which the display subsystem stops storing images to the write-back memory; the first time is the time at which the first duration has elapsed after the write-back flag was set to the first flag;
or the first information includes a first duration, a first value, and a second time, where the first duration is the duration for which the display subsystem stops storing images to the write-back memory, and the second time is the end time at which the ambient light sensor of the electronic device collected the first value; the first time is the time at which a second duration has elapsed after the write-back flag was set to the first flag, where the second duration is the first duration minus a delay, and the delay is the time at which the HWC module received the first information minus the second time.
In one possible implementation of the second aspect, the first processor acquiring the third image through the HWC module includes:
the first processor sends, through the HWC module, a first signal to the SurfaceFlinger of the electronic device;
in response to receiving the first signal, the SurfaceFlinger acquires cached first display parameters and sends them to the HWC module, where the first display parameters are the most recently cached display parameters among those cached by the SurfaceFlinger;
the HWC module derives the third image based on the first display parameter.
In one possible implementation manner of the second aspect, after the first processor sets the write-back flag to be the second flag through the HWC module, and before the first processor acquires a third image through the HWC module, the method further includes:
the first processor acquires, through the HWC module, the time when an image was last refreshed on the electronic device;
and if the time when the electronic device last refreshed an image meets a first preset condition, the first processor acquires the third image through the HWC module.
In one possible implementation manner of the second aspect, after the first processor obtains, through the HWC module, the time when the electronic device last refreshed an image, the method further includes:
if the time when the electronic device last refreshed an image does not meet the first preset condition, the first processor, through the HWC module, waits for the SurfaceFlinger module of the electronic device to send a second display parameter.
In a possible implementation manner of the second aspect, if the time when the electronic device last refreshed an image meets the first preset condition, the first processor acquiring the third image through the HWC module includes:
if the time when the electronic device last refreshed an image meets the first preset condition, the first processor waits for a second duration through the HWC module;
and if the HWC module does not receive a third display parameter sent by the SurfaceFlinger within the second duration, the first processor acquires the third image through the HWC module.
In one possible implementation manner of the second aspect, the method further includes:
if the HWC module receives a fourth display parameter sent by the SurfaceFlinger within the second duration, the first processor acquires, through the HWC module, a fifth image based on the fourth display parameter;
the first processor querying, by the HWC module, the write-back flag as the second flag;
the first processor sending, by the HWC module, the fifth image and third information to the display subsystem based on the second flag;
in response to receiving the fifth image and the third information, the first processor stores, by the display subsystem, a sixth image including a third target image on the fifth image in a write-back memory of the electronic device, the third target image being an image within the first region;
The first processor retrieves the third target image from the write-back memory through the HWC module;
the first processor sending, by the HWC module, the third target image to a noise algorithm library of the electronic device;
and the first processor calculates and obtains second image noise based on the third target image through the noise algorithm library.
In a possible implementation manner of the second aspect, the first information includes a first value and a second time, where the second time is an end time when an ambient light sensor of the electronic device acquires the first value;
that the time when the electronic device last refreshed an image meets the first preset condition includes:
the time when the electronic device last refreshed an image is later than the second time;
that the time when the electronic device last refreshed an image does not meet the first preset condition includes:
the time when the electronic device last refreshed an image is earlier than or equal to the second time;
or, that the time when the electronic device last refreshed an image meets the first preset condition includes:
a first difference between the time of the last image refresh and the current time is smaller than a second difference between the second time and the current time;
that the time when the electronic device last refreshed an image does not meet the first preset condition includes:
the first difference between the time of the last image refresh and the current time is greater than or equal to the second difference between the second time and the current time;
or, that the time when the electronic device last refreshed an image meets the first preset condition includes:
the difference between the time when the electronic device last refreshed an image and the time when the HWC module last obtained the target image is smaller than a first threshold, where the target image is the image displayed in the area of the display screen above the ambient light sensor of the electronic device;
that the time when the electronic device last refreshed an image does not meet the first preset condition includes:
the difference between the time when the electronic device last refreshed an image and the time when the HWC module last obtained the target image is greater than or equal to the first threshold.
In one possible implementation manner of the second aspect, the method further includes:
after the first processor sets the write-back flag to the first flag through the HWC module, the first processor monitors, through the HWC module, whether data in a kernel node of the electronic device changes, where the kernel node stores a brightness value;
in response to detecting a change in the data in the kernel node of the electronic device, the first processor obtains, through the HWC module, a first brightness value from the kernel node;
after the first processor obtains the first brightness value from the kernel node through the HWC module, in response to detecting a change in the data in the kernel node, the first processor obtains a second brightness value from the kernel node through the HWC module;
in response to reaching the first time, the first processor sends the second brightness value to the noise algorithm library through the HWC module.
In one possible implementation manner of the second aspect, the method further includes:
after the first processor sets the write-back flag to the second flag through the HWC module, the first processor monitors, through the HWC module, whether data in a kernel node of the electronic device changes, where the kernel node stores a brightness value;
in response to detecting a change in the data in the kernel node of the electronic device, the first processor obtains, through the HWC module, a third brightness value from the kernel node;
the first processor sends the third brightness value to the noise algorithm library through the HWC module;
after the first processor sends the third brightness value to the noise algorithm library through the HWC module, in response to detecting a change in the data in the kernel node, the first processor obtains a fourth brightness value from the kernel node through the HWC module;
the first processor sends the fourth brightness value to the noise algorithm library through the HWC module.
In one possible implementation manner of the second aspect, the first processor calculating, through the noise algorithm library, first image noise based on the second target image includes:
the first processor calculates, through the noise algorithm library, first image noise based on the second target image and the second brightness value.
In one possible implementation of the second aspect, the first processor receiving, by the HWC module, the first image includes:
the first processor receives, through the HWC module, a fifth display parameter sent by the SurfaceFlinger module of the electronic device;
the first processor obtains, by the HWC module, the first image based on the fifth display parameter.
In a possible implementation manner of the second aspect, the first area is an area on a display screen of the electronic device, which is located above an ambient light sensor of the electronic device.
In one possible implementation manner of the second aspect, the method further includes:
the first processor receives, through the HWC module, a sixth display parameter sent by the SurfaceFlinger module of the electronic device;
the first processor stores, through the HWC module, the time at which the HWC module received the sixth display parameter;
and the first processor acquiring, through the HWC module, the time when the electronic device last refreshed an image includes:
the first processor obtains, through the HWC module, the stored time at which the sixth display parameter was received, where that time is the most recent display-parameter reception time stored by the HWC module before it acquires the time when the electronic device last refreshed an image.
In one possible implementation manner of the second aspect, the first display parameter includes: the position, size, and color of the interface used to compose the third image on the display screen of the electronic device, and its storage address.
In a third aspect, an electronic device is provided, comprising a processor configured to execute a computer program stored in a memory, to implement the method of any of the first aspect or the method of any of the second aspect of the present application.
In a fourth aspect, a chip system is provided, including a processor coupled to a memory, where the processor executes a computer program stored in the memory to implement the method of any implementation of the second aspect of the present application.
In a fifth aspect, there is provided a computer readable storage medium storing a computer program which, when executed by one or more processors, performs the method of any one of the first or second aspects of the present application.
In a sixth aspect, embodiments of the present application provide a computer program product, which, when run on an apparatus, causes the apparatus to perform the method of any one of the first aspect or the second aspect of the present application.
It is understood that the beneficial effects of the second to sixth aspects can be seen from the description of the first aspect, and are not described herein again.
Drawings
Fig. 1 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
Fig. 2 is a diagram illustrating a positional relationship between an ambient light sensor and a display screen in an electronic device according to an embodiment of the present disclosure;
fig. 3 is a diagram illustrating another positional relationship between an ambient light sensor and a display screen in an electronic device according to an embodiment of the present disclosure;
fig. 4 is a diagram illustrating another positional relationship between an ambient light sensor and a display screen in an electronic device according to an embodiment of the present disclosure;
FIG. 5 is a diagram illustrating a positional relationship of a target area on a display screen according to an embodiment of the present disclosure;
FIG. 6 is a diagram illustrating a positional relationship between an ambient light sensor and a target area on a display screen according to an embodiment of the present disclosure;
FIG. 7 is a diagram of a technical architecture on which the method for detecting ambient light provided by embodiments of the present application relies;
fig. 8 is a schematic diagram of an acquisition cycle of the ambient light sensor for acquiring ambient light according to an embodiment of the present application;
FIG. 9 is a schematic diagram of time nodes for image refresh and backlight adjustment during an acquisition cycle in the embodiment of FIG. 8;
FIG. 10 is a timing flow diagram of a method for detecting ambient light based on the technical architecture shown in FIG. 7;
fig. 11 is a timing flow chart of the modules in the AP processor in the embodiment shown in fig. 10, according to an embodiment of the present application;
FIG. 12 is a diagram of another technical architecture upon which the method for detecting ambient light provided by embodiments of the present application relies;
FIG. 13 is another timing flow diagram of a method for detecting ambient light based on the technical architecture shown in FIG. 12;
FIG. 14 is a schematic diagram of calculating integral noise based on image noise and backlight noise at each time node provided by the embodiment shown in FIG. 9;
fig. 15 is a schematic diagram of time nodes at which image refreshing and backlight adjustment are performed along the time axis within an acquisition period, according to an embodiment of the present application;
FIG. 16 is a schematic diagram illustrating the calculation of integral noise based on the image noise and backlight noise at each time node provided by the embodiment shown in FIG. 15;
fig. 17 is a schematic diagram of a start-stop scheme of a CWB write-back function according to an embodiment of the present disclosure;
FIG. 18 is a flowchart illustrating a process of the SCP processor transmitting the first information to the AP processor according to an embodiment of the present application;
fig. 19 is a schematic diagram of a start-stop scheme of the CWB write-back function with forced image refresh according to an embodiment of the present disclosure;
FIG. 20 is a schematic diagram illustrating events at various times in a start-stop scheme for a CWB write-back function provided by the implementation shown in FIG. 19;
FIG. 21 is a schematic diagram of the start-stop scheme for obtaining the integral noise using the CWB write back function provided by the embodiments shown in FIGS. 19 and 20;
FIG. 22 is a diagram illustrating a refresh state and an idle state of a display screen according to an embodiment of the present disclosure;
fig. 23 is a schematic diagram of a start-stop scheme of the CWB write-back function in which an image is forcibly refreshed when the display screen is in an idle state for a long time, according to an embodiment of the present application;
fig. 24 is a schematic diagram of a start-stop scheme of the CWB write-back function using the forced image refresh shown in fig. 23, according to an embodiment of the present application;
fig. 25 is a schematic diagram of an implementation of the start-stop scheme of the CWB write-back function shown in fig. 17, according to an embodiment of the present disclosure;
fig. 26 is a schematic flowchart of another start/stop scheme of the CWB writeback function according to the embodiment of the present application;
FIG. 27 is a diagram illustrating a method for determining whether to force refreshing of an image according to the embodiment shown in FIG. 26;
FIG. 28 is a schematic diagram of obtaining integral noise using the start-stop scheme shown in FIG. 26 according to an embodiment of the present disclosure;
FIG. 29 is a schematic diagram of another embodiment of the present application for obtaining integral noise using the start-stop scheme shown in FIG. 26;
fig. 30 is a schematic diagram of another embodiment of the present application for obtaining integral noise by using the start-stop scheme shown in fig. 26.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that in the embodiments of the present application, "one or more" means one, two, or more than two; "and/or" describes the association relationship of associated objects and indicates that three relationships may exist; for example, A and/or B may represent: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," "fourth," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The noise monitoring method provided by the embodiment of the application can be suitable for electronic equipment with an OLED screen. The electronic device may be a tablet computer, a wearable device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), or other electronic devices. The embodiment of the present application does not limit the specific type of the electronic device.
Fig. 1 shows a schematic structural diagram of an electronic device. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a touch sensor 180K, an ambient light sensor 180L, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. For example, the processor 110 is configured to execute the noise monitoring method in the embodiment of the present application.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from this memory. Repeated accesses are avoided, which reduces the waiting time of the processor 110 and thus improves the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area can store an operating system and an application program required by at least one function. The storage data area may store data created during use of the electronic device 100.
In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc.
In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave.
In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio signals into analog audio signals for output and also to convert analog audio inputs into digital audio signals. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals. The electronic device 100 can be used to listen to music or to a hands-free call through the speaker 170A.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal to the microphone 170C by speaking close to it. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to implement a noise reduction function in addition to collecting voice information. In other embodiments, the electronic device 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on. For example, the microphone 170C may be used to collect voice information related to the embodiments of the present application.
The headphone interface 170D is used to connect wired headphones. The headphone interface 170D may be the USB interface 130, or a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A.
In some embodiments, touch operations applied to the same touch position but with different intensities may correspond to different operation instructions. For example: when a touch operation with an intensity smaller than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation with an intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
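For illustration only (not part of the claimed method), a minimal Python sketch of this threshold dispatch; the threshold value and all function names are hypothetical:

    FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized pressure value

    def on_message_icon_press(intensity: float) -> str:
        # Below the first pressure threshold: view the short message.
        if intensity < FIRST_PRESSURE_THRESHOLD:
            return "view_short_message"
        # At or above the threshold: create a new short message.
        return "create_new_short_message"

    print(on_message_icon_press(0.2))  # view_short_message
    print(on_message_icon_press(0.8))  # create_new_short_message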
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance the lens module needs to compensate according to the shake angle, and lets the lens counteract the shake of the electronic device 100 through reverse movement, thereby achieving anti-shake. The gyro sensor 180B may also be used for navigation and somatosensory gaming scenarios.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Features such as automatic unlocking upon flip opening can then be set according to the detected open or closed state of the holster or of the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to recognize the attitude of the electronic device, and is applied in landscape/portrait switching, pedometers and other applications.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scene, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration cues as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects, and touch operations applied to different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (such as time reminders, receiving information, alarm clocks, games, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may employ an organic light-emitting diode (OLED). In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and passes it to the ISP, which converts it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
The embodiment of the present application does not particularly limit the specific structure of the execution body of the noise monitoring method, as long as it can run a program recording the noise monitoring method of the embodiment of the present application so as to perform processing according to that method. For example, the execution body of the noise monitoring method provided by the embodiment of the present application may be a functional module in the electronic device capable of calling and executing a program, or a communication apparatus, such as a chip, applied to the electronic device.
Fig. 2 is a front position relationship diagram of a display screen and an ambient light sensor in an electronic device according to an embodiment of the present application.
As shown in fig. 2, the projection of the ambient light sensor on the display screen of the electronic device is located in the upper half of the display screen. When a user holds the electronic device, the ambient light sensor located in the upper half can detect the light intensity and color temperature of the environment on the front side of the electronic device (the side the display screen faces), which are used to adjust the brightness and color temperature of the display screen for a better visual effect. For example, the display screen is prevented from being so bright in a dark environment as to cause glare, and from being so dark in a bright environment as to impair visibility.
Fig. 3 is a side view of the display screen and the ambient light sensor in the electronic device. The display screen of the electronic device comprises, from top to bottom: a glass cover plate (light-transmitting), a display module, and a protective film; "up" and "down" here refer to the orientation when the display screen of the electronic device is placed facing upward. Because the ambient light sensor needs to collect the ambient light above the display screen of the electronic device, a portion of the display module in the display screen is cut out and the ambient light sensor is placed in that portion; this is equivalent to placing the ambient light sensor below the glass cover plate of the display screen, on the same layer as the display module. Note that the detection direction of the ambient light sensor coincides with the orientation of the display screen in the electronic device (upward in fig. 3). Obviously, this arrangement of the ambient light sensor sacrifices a portion of the display area; when a high screen-to-body ratio is pursued, this arrangement is not applicable.
Fig. 4 shows another arrangement of the ambient light sensor provided in the embodiments of the present application, in which the ambient light sensor is moved from below the glass cover plate to below the display module. For example, the ambient light sensor is located below the active area (AA) of the OLED display module, the AA area being the area of the display module in which image content can be displayed. This arrangement of the ambient light sensor does not sacrifice display area. However, the OLED screen is a self-luminous display screen: when it displays an image, a user can see the image from above the display screen, and likewise the ambient light sensor located below the OLED screen also collects light corresponding to the image displayed on the OLED screen. Therefore, the ambient light collected by the ambient light sensor includes both light emitted by the display screen and the actual external ambient light. To accurately obtain the external real ambient light, the light emitted by the display screen must be obtained in addition to the ambient light collected by the ambient light sensor.
As can be understood from fig. 4, since the ambient light sensor is located below the AA area, the AA area in the display module is not sacrificed by the arrangement of the ambient light sensor. Therefore, the projection of the ambient light sensor on the display screen can be located in any area of the front of the display screen, and is not limited to the following arrangement: the projection of the ambient light sensor on the display screen is located at the top of the front of the display screen.
Regardless of which region below the AA area the ambient light sensor is located in, the projected area of the ambient light sensor on the display screen is much smaller than the area of the display screen itself. Therefore, it is not the entire display screen that emits light interfering with the ambient light collected by the ambient light sensor; rather, the light emitted from the display area directly above the ambient light sensor, and from the display area within a certain range around it, interferes with the collected ambient light.
As an example, the light sensing area of the ambient light sensor has a light sensing angle; the ambient light sensor can receive light within the light sensing angle but not light outside it. In fig. 5, light emitted from point A directly above the ambient light sensor (within the sensing angle) and light emitted from point B within a certain range around the ambient light sensor (within the sensing angle) both interfere with the ambient light collected by the ambient light sensor, while light emitted from point C, farther from the ambient light sensor (outside the sensing angle), does not. For convenience of description, the display area of the display screen that interferes with the ambient light collected by the ambient light sensor may be referred to as the target area (the target area may also be referred to as a first area). The location of the target area in the display screen is determined by the specific location of the ambient light sensor under the AA area. As an example, the target area may be a square area of a certain side length (e.g., 80 microns, 90 microns, 100 microns) centered at the center point of the ambient light sensor. Of course, the target area may also be an area of another shape, obtained by measurement, that interferes with the light collected by the ambient light sensor.
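Purely as an illustrative sketch (not from the patent), the square target area described above could be derived from the sensor's center point as follows; the pixel units, panel dimensions and function name are assumptions:

    def target_area(cx: int, cy: int, side: int, panel_w: int, panel_h: int):
        # Square of side `side` centered on (cx, cy), clamped to the panel.
        half = side // 2
        left, top = max(0, cx - half), max(0, cy - half)
        right, bottom = min(panel_w, cx + half), min(panel_h, cy + half)
        return left, top, right, bottom

    # Example: a 100-unit square centered at (540, 40) on a 1080x2400 panel.
    print(target_area(540, 40, 100, 1080, 2400))  # (490, 0, 590, 90)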
As another example, fig. 6 is a schematic front view of an OLED screen of an electronic device provided in an embodiment of the present application, and as shown in fig. 6, the electronic device includes a housing, an OLED screen of the electronic device displays an interface, a corresponding area of the display interface in the display screen is an AA area, and an ambient light sensor is located behind the AA area. The center point of the target area coincides with the center point of the ambient light sensor.
It should be noted that the ambient light sensor is a discrete device; its manufacturer may vary, and its external shape may also vary. The center point of the ambient light sensor in the embodiment of the present application is the center point of the photosensitive area in which the ambient light sensor collects ambient light. In addition, the target area shown in FIG. 6 is larger than the projected area of the ambient light sensor on the OLED screen. In practical applications, the target area may also be smaller than or equal to the projected area of the ambient light sensor on the OLED screen; however, the target area is typically larger than the photosensitive area of the ambient light sensor. As mentioned above, the real external ambient light equals the ambient light collected by the ambient light sensor minus the light emitted by the display screen, and the light emitted by the display screen has been determined to be the light emitted by the target area. The emitted light of the target area is generated by the display content of the target area, and the interference of the display content with the ambient light collected by the ambient light sensor comes from two parts: the RGB pixel information of the displayed image and the luminance of the displayed image. From the above analysis, the interference with the ambient light collected by the ambient light sensor is: the RGB pixel information of the image displayed in the target area and the luminance information of the target area. As an example, if the pixel value of a pixel is (r, g, b) and the luminance is L, the normalized luminance of the pixel is: L×(r/255)^2.2, L×(g/255)^2.2, L×(b/255)^2.2.
For convenience of description, an image corresponding to the target area may be referred to as a target image, and interference of RGB pixel information of the target image and luminance information on ambient light collected by the ambient light sensor may be referred to as fusion noise. The ambient light collected by the ambient light sensor can be recorded as initial ambient light, and the external real ambient light can be recorded as target ambient light.
From the above description it can be derived: the target ambient light is equal to the initial ambient light minus the fusion noise at each instant in the time period in which the initial ambient light was collected. In the embodiment of the present application, a process of calculating the fusion noise together according to the RGB pixel information and the luminance information is referred to as a noise fusion process.
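For illustration only, a minimal sketch of this noise fusion and subtraction, assuming the gamma-2.2 per-pixel model above; summing the three channels equally and averaging over pixels are simplifying assumptions, not the patent's fusion algorithm:

    def fused_noise(pixels, brightness):
        # pixels: list of (r, g, b) tuples from the target image;
        # brightness: display luminance L per the formula above.
        total = 0.0
        for r, g, b in pixels:
            total += brightness * ((r / 255) ** 2.2 +
                                   (g / 255) ** 2.2 +
                                   (b / 255) ** 2.2)
        return total / max(len(pixels), 1)

    def target_ambient_light(initial_light, noise):
        # Target ambient light = initial ambient light - fusion noise.
        return initial_light - noise

    print(target_ambient_light(120.0, fused_noise([(255, 255, 255)], 10.0)))  # 90.0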
When the display screen is in a display state, the RGB pixel information of the image displayed in the target area may change, and the brightness information of the displayed image may also change. Either change may change the fusion noise, so the fusion noise must thereafter be recalculated from the changed information (RGB pixel information or brightness information). If the image of the target area is unchanged for a long time, the fusion noise is calculated only when the brightness of the display screen changes. Therefore, in order to reduce the frequency of calculating the fusion noise, the target area may be chosen as a region in which the displayed image changes infrequently, for example the status bar area at the top of the front of the electronic device. The projection of the ambient light sensor on the display screen is located toward the right in the status bar area of the display screen. Of course, the ambient light sensor may also be located toward the left or in the middle of the status bar area; the embodiment of the present application does not limit its specific position.
A technical architecture corresponding to the method for obtaining the target ambient light through the initial ambient light and the content displayed on the display screen according to the embodiment of the present application will be described below with reference to fig. 7.
As shown in fig. 7, the processor in the electronic device is a multi-core processor, which at least includes: an AP (application processor) processor and an SCP (sensor co-processor) processor. The AP processor is an application processor in the electronic device, and an operating system, a user interface and an application program are all run on the AP processor. The SCP processor is a co-processor that may assist the AP processor in performing events related to images, sensors (e.g., ambient light sensors), and the like.
Only the AP processor and SCP processor are shown in fig. 7. In practical applications, the multi-core processor may also include other processors. For example, when the electronic device is a mobile phone, the multi-core processor may further include a Baseband (BP) processor that runs mobile phone radio frequency communication control software and is responsible for sending and receiving data.
The AP processor in fig. 7 only shows the content related to the embodiment of the present application, and the implementation of the embodiment of the present application needs to rely on: an Application Layer (Application), a Java Framework Layer (Framework Java), a native Framework Layer (Framework native), a Hardware Abstraction Layer (HAL), a kernel Layer (kernel), and a Hardware Layer (Hardware).
The SCP processor in fig. 7 may be understood as a sensor control center (sensor hub) which can control the sensors and process data related to the sensors. The implementation of the embodiment of the present application needs to rely on: a co-application layer (Hub APK), a co-framework layer (Hub FWK), a co-driver layer (Hub DRV), and a co-hardware layer (Hub hardware).
Various applications exist in the application layer of the AP processor; application A and application B are shown in fig. 7. Taking application A as an example, after the user starts application A, the display screen will display the interface of application A. Specifically, application A sends the display parameters of the interface to be displayed (for example, the memory address, the color, and the like of the interface to be displayed) to the display engine service.
The display engine service in the AP processor sends the received display parameters of the interface to be displayed to the SurfaceFlinger of the native framework layer (Framework native) of the AP processor.
The SurfaceFlinger in the native framework layer (Framework native) of the AP processor is responsible for controlling interface (surface) fusion. As an example, it calculates the overlap region of at least two overlapping interfaces. An interface here may be one presented by the status bar, the system bar, the application itself (the interface to be displayed by application A), the wallpaper, the background, etc. Therefore, the SurfaceFlinger can obtain not only the display parameters of the interface to be displayed by application A, but also the display parameters of other interfaces.
The hardware abstraction layer of the AP processor has the HWC (Hardware Composer HAL), a module for interface synthesis and display in the system that provides hardware support for the SurfaceFlinger service. In step A1, the SurfaceFlinger sends the display parameters (e.g., memory address, color, etc.) of each interface to the HWC through interfaces (e.g., setLayerBuffer, setLayerColor, etc.) for interface fusion. In practical applications, the display parameters may include: the position, size, color, memory address, etc. of the interface of the composite image on the display screen of the electronic device.
Generally, in image synthesis (for example, when an electronic device displays an image, the status bar, the system bar, the application itself and the wallpaper background need to be synthesized), the HWC obtains the synthesized image from the display parameters of each interface through the hardware underlying the HWC (e.g., a hardware compositor). The HWC in the hardware abstraction layer of the AP processor sends the image synthesized by the underlying hardware to the OLED driver, see step A2.
In practice, the HWC module may obtain the synthesized image based on the display parameters sent by the SurfaceFlinger in any manner.
The OLED driver of the kernel layer of the AP processor passes the synthesized image to the display subsystem (DSS) of the hardware layer of the AP processor, see step A3. The display subsystem (DSS) in the hardware layer of the AP processor may perform secondary processing (e.g., HDR10 processing for enhancing image quality) on the synthesized image and then send the secondarily processed image for display. In practical applications, the secondary processing may be omitted. Taking the case of no secondary processing as an example, the display subsystem of the AP processor hardware layer sends the synthesized image to the OLED screen for display.
Taking the start of application A as an example, the synthesized image displayed by the OLED screen is the interface synthesized from the interface to be displayed by application A and the interface corresponding to the status bar.
In this manner, the OLED screen completes one image refresh and display.
In the embodiment of the present application, before the secondarily processed image (or the synthesized image) is sent for display, the display subsystem (DSS) may be controlled to store the whole frame image (or a sub-image of the whole frame that is larger than the target area, or only the image corresponding to the target area within the whole frame) in a memory of the kernel layer of the AP processor. Since this process writes back image frame data concurrently, the memory may be recorded as the Concurrent Write-Back (CWB) memory, see step A4.
In the embodiment of the present application, taking the case in which the display subsystem stores the whole frame image in the CWB memory of the AP processor as an example, after the display subsystem successfully stores the whole frame image in the CWB memory, it may send a signal indicating that the storage succeeded to the HWC. The whole frame image corresponding to the image stored in the CWB memory by the display subsystem may be recorded as the image to be refreshed (the image to be refreshed may also be understood as the image after the current refresh).
The AP processor may also be configured to allow the HWC to access the CWB memory. The HWC may obtain the target image from the CWB memory after receiving the signal, sent by the display subsystem, indicating that the storage succeeded, see step A5.
It should be noted that, regardless of whether the whole frame image or only an image of a partial region of the whole frame is stored in the CWB memory, the HWC can obtain the target image from the CWB memory. The process of the HWC obtaining the target image from the CWB memory may be denoted as the HWC matting from the CWB memory.
The range of the target image can be understood as the size defined by its length and width; the range of the image to be refreshed is the range of the whole frame image, whose size can likewise be defined by its length and width.
As an example, the size of the image to be refreshed is X1 (pixels) × Y1 (pixels), the size of the target image is X2 (pixels) × Y2 (pixels), and the size of the image stored in the CWB memory is X3 (pixels) × Y3 (pixels). X3 satisfies X1 ≥ X3 ≥ X2, and Y3 satisfies Y1 ≥ Y3 ≥ Y2.
Of course, when X3 = X1 and Y3 = Y1, the image stored in the CWB memory is the whole frame image. When X3 = X2 and Y3 = Y2, the image stored in the CWB memory is the target image.
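As an illustrative sketch of the matting relationship above (not the actual HWC implementation; the image representation as row lists and the offset parameters are assumptions):

    def crop_target(stored_image, x2, y2, x_off=0, y_off=0):
        # stored_image: X3 x Y3 image as a list of Y3 rows of X3 pixels;
        # crops the X2 x Y2 target image at the given offset.
        y3, x3 = len(stored_image), len(stored_image[0])
        assert x3 >= x2 and y3 >= y2, "stored image must cover the target"
        return [row[x_off:x_off + x2] for row in stored_image[y_off:y_off + y2]]

    whole_frame = [[(0, 0, 0)] * 8 for _ in range(6)]  # X3=8, Y3=6
    target = crop_target(whole_frame, 4, 2)            # X2=4, Y2=2
    print(len(target), len(target[0]))                 # 2 4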
Continuing with application A as an example, when application A has a brightness adjustment requirement due to interface switching, application A sends the brightness to be adjusted to the display engine service.
And the display engine service in the AP processor sends the brightness to be adjusted to the kernel node in the kernel layer of the AP processor so as to enable related hardware to adjust the brightness of the OLED screen according to the brightness to be adjusted stored in the kernel node.
In this manner, the OLED screen completes one brightness adjustment.
In the embodiment of the present application, the HWC may be further configured to obtain the brightness to be adjusted from the kernel node; the brightness to be adjusted may also be recorded as the brightness after the current adjustment, see step A5'.
In a specific implementation, the HWC may monitor, based on a uevent mechanism, whether the data stored in the kernel node changes, and after detecting a change, obtain the currently stored data, i.e., the brightness value to be adjusted (this value is used to adjust the brightness of the display screen and may therefore also be recorded as the brightness value of the display screen), from the kernel node. After obtaining the target image or the brightness to be adjusted, the HWC may send it to the noise algorithm library of the hardware abstraction layer of the AP processor, see step A6. Each time the target image is obtained, the noise algorithm library calculates the fusion noise at the refresh time of that target image; each time a brightness is obtained, it calculates the fusion noise at the brightness adjustment time. The noise algorithm library stores the calculated fusion noise in its noise memory.
In practical applications, after the HWC obtains the target image, it may store the target image and send its storage address to the noise algorithm library; by recording addresses, the noise algorithm library buffers the most recent frame of the target image. After the HWC obtains the brightness to be adjusted, it may send the brightness to the noise algorithm library, which buffers the most recent brightness. For convenience of description, the subsequent embodiments of the present application describe the HWC as sending the target image itself to the noise algorithm library; in practice, the HWC may store the target image after obtaining it and send the storage address of the target image to the noise algorithm library.
As an example, after receiving the storage address of the first frame of the target image, the noise algorithm library buffers that address; each time a new storage address of a target image is received, it replaces the cached address as the latest one. Correspondingly, the noise algorithm library buffers the first brightness after receiving it, and each new brightness received replaces the cached one as the latest brightness. In the embodiment of the application, the noise algorithm library caches the acquired target image and brightness value in a data store. The target image and the brightness value stored in the data store may be recorded as screen data; that is, the screen data stored in the data store includes the target image and the brightness value.
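For illustration only, a minimal sketch of this latest-entry-only caching behavior; the class and field names are hypothetical, not the patent's data store:

    class ScreenDataStore:
        # Only the most recent entries are kept, as described above.
        def __init__(self):
            self.latest_image_addr = None  # storage address of target image
            self.latest_brightness = None  # most recently cached brightness

        def on_target_image_addr(self, addr):
            self.latest_image_addr = addr   # replaces the cached address

        def on_brightness(self, value):
            self.latest_brightness = value  # replaces the cached brightness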
In addition, in order to describe the transfer relationship between parameters such as the target image and the brightness to be adjusted, the embodiment of the present application takes as an example the HWC sending such parameters to the noise algorithm library. In practice, the relationship between the HWC and the noise algorithm library is that the HWC calls the noise algorithm library: when the HWC calls it, the HWC passes parameters such as the target image (the storage address of the target image) and the brightness to be adjusted as arguments of a calculation model in the noise algorithm library. Other parameters are not further exemplified.
Because brightness adjustment and image refreshing are two completely independent processes, the image may be refreshed at a moment when the brightness remains unchanged; in that case, the fusion noise at that moment is calculated using the target image corresponding to the refreshed image and the current brightness (the brightness value stored in the noise algorithm library that was most recently stored before the time represented by the timestamp of the target image). For convenience of description, the fusion noise at an image refresh time calculated because of an image refresh may be regarded as the image noise at that image refresh time. Similarly, if the image is not refreshed at a moment when the brightness is adjusted, the fusion noise at that moment is calculated using the adjusted brightness and the current target image (the target image stored in the noise algorithm library that was most recently stored before the time indicated by the timestamp of the brightness value). For convenience of description, the fusion noise at a brightness adjustment time calculated because of a brightness adjustment may be regarded as the backlight noise at that brightness adjustment time.
The target image and the brightness sent by the HWC to the noise algorithm library both carry timestamps; correspondingly, the image noise and the backlight noise calculated by the noise algorithm library also both carry timestamps. The timestamp of the image noise is the same as that of the target image, and the timestamp of the backlight noise is the same as that of the brightness to be adjusted. Strictly speaking, the timestamp of the image noise should be the image refresh time; in practical applications, another time node close to the image refresh time may be used instead, for example the start time (or the end time, or any time in between) of the HWC's matting of the target image from the CWB memory. Likewise, the timestamp of the backlight noise should strictly be the backlight adjustment time; in practice, another time node close to it may be used, for example the start time (or the end time, or any time in between) of the HWC's reading of the brightness to be adjusted from the kernel node. The timestamps of the image noise and the backlight noise facilitate subsequently denoising, over a time span, the initial ambient light collected by the ambient light sensor to obtain the target ambient light. The noise algorithm library stores the image noise and the backlight noise in the noise memory, storing the timestamp of the image noise together with the image noise and the timestamp of the backlight noise together with the backlight noise.
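For illustration only, a minimal sketch of this pairing rule, reusing the fused_noise and ScreenDataStore sketches above; the timestamped-sample format and all names are assumptions:

    def image_noise_sample(store, refreshed_pixels, refresh_ts):
        # Image refresh: new target image + latest cached brightness.
        n = fused_noise(refreshed_pixels, store.latest_brightness)
        return {"ts": refresh_ts, "kind": "image", "noise": n}

    def backlight_noise_sample(store, new_brightness, cached_pixels, adjust_ts):
        # Brightness change: latest cached target image + new brightness.
        n = fused_noise(cached_pixels, new_brightness)
        return {"ts": adjust_ts, "kind": "backlight", "noise": n}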
An Ambient Light Sensor (ALS) in the co-hardware layer of the SCP processor collects initial ambient light at a certain collection period after start-up (typically, the ambient light sensor is started after the electronic device is powered on). The ambient light sensor of the SCP processor transmits the initial ambient light information to the ambient light sensor driver (ALS DRV) of the co-driver (Hub DRV) layer of the SCP processor, see step E2.
The initial ambient light information transmitted by the SCP processor to the AP processor includes a first value, a first time and a second time, where the first value can be understood as a raw value of the initial ambient light, the first time is an integration start time at which the ambient light sensor acquires the first value, and the second time is an integration end time at which the ambient light sensor acquires the first value.
In the co-driver (Hub DRV) layer of the SCP processor, the ambient light sensor driver (ALS DRV) preprocesses the initial ambient light information to obtain raw values on the four RGBC channels. The co-driver layer of the SCP processor transmits the raw values on the four RGBC channels to the ambient light sensor application of the co-application layer of the SCP processor, see step E3.
The ambient light sensor application of the co-application layer of the SCP processor sends the raw values on the four RGBC channels and other relevant data (e.g., the start time and end time of each collection of initial ambient light by the ambient light sensor) to the HWC of the AP processor via a first inter-core communication (communication between the ambient light sensor application of the SCP processor and the HWC of the AP processor), see step E4.
After the HWC in the AP processor obtains the initial ambient light data reported by the SCP processor, it may send the initial ambient light data to the noise algorithm library, see step A6.
As described above, the noise algorithm library may calculate the image noise at the image refresh time and the backlight noise at the brightness adjustment time, and store them in the noise memory of the noise algorithm library. After the acquisition start time and the acquisition end time of the initial ambient light are obtained, the integral noise between these two times may be obtained from the image noise and backlight noise stored in the noise memory. The noise algorithm library deducts the integral noise between the acquisition start time and the acquisition end time from the initial ambient light to obtain the target ambient light.
As can be understood from the above description, the noise algorithm library includes a plurality of calculation models, for example: a first algorithm model for obtaining the fusion noise from the target image and the brightness; a second algorithm model for obtaining the integral noise between the acquisition start time and the acquisition end time of the initial ambient light from the fusion noise at each moment; and a third algorithm model for obtaining the target ambient light from the initial ambient light and the integral noise. In practical applications, the noise algorithm library may further include other calculation models; for example, if the raw values on the four channels of the initial ambient light are filtered in the process of obtaining the target ambient light based on the target image, the brightness and the initial ambient light, there is also a model for filtering those raw values. The embodiment of the present application does not enumerate them all.
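For illustration only, a minimal sketch of the second and third algorithm models under the assumption that the fusion noise is piecewise-constant between timestamped samples; the sample format and names are assumptions, not the patent's models:

    def integral_noise(samples, t_start, t_end):
        # samples: noise dicts (see the sketch above) sorted by "ts".
        total, prev_ts, prev_noise = 0.0, t_start, None
        for s in samples:
            if s["ts"] <= t_start:
                prev_noise = s["noise"]            # last value before window
                continue
            if s["ts"] >= t_end:
                break
            if prev_noise is not None:
                total += prev_noise * (s["ts"] - prev_ts)
            prev_ts, prev_noise = s["ts"], s["noise"]
        if prev_noise is not None:
            total += prev_noise * (t_end - prev_ts)
        return total / (t_end - t_start)           # time-averaged noise

    def target_light_raw(initial_raw, samples, t_start, t_end):
        # Third model: target = initial - integral noise over the window.
        return initial_raw - integral_noise(samples, t_start, t_end)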
The inputs to the noise algorithm library include: the target image and brightness acquired by the HWC at various times, and the initial-ambient-light-related data acquired by the HWC from the SCP processor. The output of the noise algorithm library is the raw value of the target ambient light, which may be recorded as a second value. In the embodiment of the present application, the process of the HWC sending the target image, the brightness and the initial ambient light to the noise algorithm library is denoted as step A6.
After obtaining the target ambient light, the noise calculation library also needs to return the target data to the HWC; this process is denoted as step A7. In practical applications, the output of the noise algorithm library is the raw values on the four channels of the target ambient light.
The HWC in the AP processor sends the raw values on the four channels of the target ambient light returned by the noise algorithm library to the ambient light sensor application in the co-application layer of the SCP processor via first inter-core communication, see step A8.
After the ambient light sensor application of the co-application layer of the SCP processor obtains the raw values on the four channels of the target ambient light, it stores them in the ambient light memory of the co-driver layer, see step E5.
The co-driver layer of the SCP processor is provided with a calculation module that obtains the raw values on the four channels of the target ambient light from the memory, see step E6. When each integration ends, the ambient light sensor generates an integration interrupt signal and sends it to the ambient light sensor driver; the driver calls the calculation module, triggering it to obtain the raw values on the four channels of the target ambient light from the memory.
The ambient light sensor driver triggers the calculation module to acquire the raw value of the target ambient light after the current integration ends, so what is acquired at that moment is the raw value of the target ambient light of the previous integration period.
Taking the embodiment shown in FIG. 8 as an example, after the integration ends at time t1, the ambient light sensor obtains the initial ambient light from time t0 to time t1. The SCP processor sends the initial ambient light from t0 to t1 to the AP processor, and the AP processor calculates the raw value of the target ambient light from t0 to t1. The AP processor sends this raw value to the SCP processor, and the SCP processor stores the raw value of the target ambient light from t0 to t1 into the memory of the SCP processor.
After the integration ends at time t3, the ambient light sensor obtains the initial ambient light from time t2 to time t3, and the SCP processor sends it to the AP processor. Each time an integration of the ambient light sensor ends, an integration interrupt signal is generated; the ambient light sensor sends the integration interrupt signal to the ambient light sensor driver, the driver calls the calculation module, and the calculation module is triggered to obtain the currently stored raw value of the target ambient light from t0 to t1 from the memory. Since this happens after time t3, the calculation module calculates, after t3, the lux value of the target ambient light from the obtained raw value of the target ambient light from t0 to t1. That is, the lux value of the target ambient light that the SCP processor calculates during period T2 is the lux value of the real ambient light of period T1.
As previously mentioned, the ambient light sensor in the SCP processor generates an integration interrupt signal after the integration ends (time t3) and gives it to the ambient light sensor driver; after time t3, the initial ambient light of period T2 is sent to the AP processor, the AP processor calculates the target ambient light and sends it back to the SCP processor, and the SCP processor stores the target ambient light of period T2 in the memory. If the SCP processor were to calculate the lux value using the raw value of the target ambient light of period T2, it would have to wait, from the moment the ambient light sensor driver receives the integration interrupt signal, until the AP processor has transmitted the target ambient light to the memory of the SCP processor; only then could the ambient light sensor driver invoke the calculation module to retrieve the raw value of the target ambient light of period T2 from the memory. The waiting time includes at least the time for the SCP processor to transmit the initial ambient light to the AP processor, the time for the AP processor to calculate the target ambient light from the initial ambient light and other related data, and the time for the AP processor to transmit the target ambient light to the memory in the SCP processor; this time is relatively long and not fixed. Therefore, the ambient light sensor driver in the SCP processor may be configured to invoke the calculation module, after receiving the integration interrupt signal of the following (second) acquisition period, to retrieve the raw value of the target ambient light of the previous period from the memory, so as to calculate the lux value from the raw value of the target ambient light of the previous period. The lux value of the target ambient light may be recorded as a third value; the third value and the second value are the lux value and the raw value of the same target ambient light.
Taking the acquisition periods shown in fig. 8 as an example: the raw value of the initial ambient light collected in acquisition period T1 is the first value; the raw value of the target ambient light corresponding to acquisition period T1, obtained from the raw value of the initial ambient light collected in T1, is the second value; the lux value of the target ambient light corresponding to acquisition period T1, obtained from that raw value, is the third value; and the raw value of the initial ambient light collected in acquisition period T2 may be recorded as the fourth value. The fourth value is the initial ambient light collected in the acquisition period following the acquisition period corresponding to the first value (or to the second value, or to the third value).
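For illustration only, a minimal sketch of the one-period delay described above; the linear raw-to-lux conversion and its coefficients are purely hypothetical calibration assumptions, not the patent's conversion:

    def raw_to_lux(raw_rgbc, coeffs=(0.25, 0.25, 0.25, 0.25)):
        # Hypothetical linear conversion from RGBC raw values to lux.
        return sum(k * v for k, v in zip(coeffs, raw_rgbc))

    class AlsDriver:
        def __init__(self, memory):
            self.memory = memory  # SCP memory holding target-light raw values

        def on_integration_interrupt(self):
            # The memory still holds the target raw value of the *previous*
            # period, so the lux value is computed with a one-period delay
            # and no waiting on the AP round-trip is needed.
            raw = self.memory.get("target_raw")
            return raw_to_lux(raw) if raw is not None else None

    mem = {"target_raw": (100, 120, 90, 310)}
    print(AlsDriver(mem).on_integration_interrupt())  # 155.0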
The calculation module in the co-driver layer of the SCP processor obtains the lux value of the target ambient light from the raw values on the four channels of the target ambient light. The calculation module in the SCP processor sends the calculated lux value of the target ambient light to the ambient light sensor application of the co-application layer through the interface of the co-framework layer, see steps E7 and E8.
The ambient light sensor application of the co-application layer in the SCP processor transmits the lux value of the target ambient light to the light service (light service) of the raw framework layer in the AP processor through the second inter-core communication (communication of the SCP processor to the light service of the AP processor), see step E9.
A light service (light service) may send the lux value of the target ambient light to the display engine service. The display engine service may send the lux value of the target ambient light to the upper layer to facilitate an application in the application layer to determine whether to adjust the brightness. The display engine service can also send the lux value of the target ambient light to the kernel node so as to enable related hardware to adjust the brightness of the display screen according to the lux value of the target ambient light stored by the kernel node.
After describing the technical architecture on which the method of obtaining the target ambient light depends, the process of obtaining the target ambient light from the target image, the brightness, and the initial ambient light collected by the ambient light sensor will be described from the perspective of the collection period of the ambient light sensor.
As can be understood from the above examples, the target image and the brightness to be adjusted are both obtained by the HWC, so there is a sequential order in the HWC's acquisition of the target image and of the brightness to be adjusted. After the HWC acquires the target image or the brightness to be adjusted, it sends it to the noise algorithm library, so the sending also has a sequential order, and correspondingly the times at which the noise algorithm library receives the target image and the brightness to be adjusted have a sequential order as well. However, even so, the timestamps of the target image and the brightness to be adjusted may be the same, since the HWC may acquire both within the same unit of time measurement. As an example, within the same millisecond (the 5th millisecond), the HWC first acquires the brightness to be adjusted and then acquires the target image. Although there is a sequence in the HWC's execution, the timestamps of the target image and of the brightness to be adjusted are both the 5th millisecond.
Referring to fig. 8, the ambient light sensor collects ambient light periodically: from t0 to t2 (acquisition period T1), from t2 to t4 (acquisition period T2), and from t4 to t6 (acquisition period T3) are each one acquisition period. During acquisition period T1, the time when the ambient light sensor actually performs acquisition is t0 to t1; from t1 to t2 the ambient light sensor may be in a sleep state. The embodiment of the present application is described taking as an example that the collection period of the ambient light is fixed (i.e., the values of T1, T2 and T3 are the same) and the duration of the integration period is fixed.

As an example, 350 ms (t2 - t0) may be taken as one acquisition period. If the actual acquisition time of the ambient light sensor in one acquisition period is 50 ms (t1 - t0), the ambient light sensor is in a sleep state for 300 ms (t2 - t1) of each acquisition period. The values 350 ms, 50 ms and 300 ms above are examples only and are not intended to be limiting.

For ease of description, the time period in which the ambient light sensor actually collects (e.g., t0 to t1) may be recorded as the integration period, and the time period in which the ambient light sensor does not perform acquisition (e.g., t1 to t2) as the non-integration period.
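For illustration only, a minimal sketch classifying a timestamp against this cycle layout, using the example durations above (which, as stated, are not limiting):

    def in_integration_period(ts_ms, period_ms=350, integration_ms=50):
        # First `integration_ms` of each `period_ms` cycle is integration.
        return (ts_ms % period_ms) < integration_ms

    print(in_integration_period(40))   # True:  within t0..t1
    print(in_integration_period(120))  # False: within t1..t2 (sleep)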
The image displayed on the display screen of the electronic device is refreshed at a certain frequency. Taking 60 Hz as an example, the display screen of the electronic device is refreshed 60 times per second, i.e., the image is refreshed every 16.7 ms. When the display screen of the electronic device displays images, image refreshes occur during the acquisition period of the ambient light sensor. When the image displayed on the display screen is refreshed, the AP processor performs steps A1 to A6 (sending the target image) in the technical architecture shown in fig. 7. The HWC in the AP processor controls CWB write-back continuously from time t0, i.e., the above steps are repeated as long as there is an image refresh.
Note that, in the present embodiment, a refresh rate of 60 Hz is taken as an example; in practice, the refresh rate may be 120 Hz or another refresh rate. In the embodiment of the present application, steps A1 to A6 (sending the target image) need to be repeated for every refreshed frame; in practical applications, steps A1 to A6 (sending the target image) may also be repeated every other frame (or every two frames, etc.).
Brightness adjustment has no fixed periodicity, so a brightness adjustment may also occur during the acquisition period of the ambient light sensor. When the brightness is adjusted, the HWC also performs steps A5' to A6 (sending the brightness to be adjusted) in the technical architecture shown in fig. 7.
After each integration of the ambient light sensor ends (i.e., after t1, after t3, after t5, ...), the SCP processor reports the data of the initial ambient light collected in the current integration process (e.g., the raw values on the four channels of the initial ambient light and the integration start time and integration end time of the current integration process) to the HWC of the AP processor, and the HWC of the AP processor sends the relevant data of the initial ambient light to the noise algorithm library, which calculates the target ambient light.
Referring to FIG. 9 and taking one acquisition period as an example, time t01 (the same as t0), time t03, time t04 and time t11 are all image refresh times, while time t02 and time t12 are brightness adjustment times. Thus, the AP processor may compute in real time the image noise at t01, the backlight noise at t02, the image noise at t03, the image noise at t04, the image noise at t11 and the backlight noise at t12. At the end of the current integration (time t1), the noise memory of the AP processor stores: the image noise at t01, the backlight noise at t02, the image noise at t03 and the image noise at t04.
At the end of the current integration (time t1), the ambient light sensor obtains the initial ambient light of the current integration and the current integration period. The SCP processor reports the data of the initial ambient light to the AP processor, and the noise calculation module in the AP processor obtains from the noise memory, according to the start time and end time of the current integration period, the image noise at t01, the backlight noise at t02, the image noise at t03 and the image noise at t04. The noise calculation library then calculates the target ambient light from the initial ambient light collected in the integration period and the image noise and backlight noise affecting that period.
During the non-integration period (t1 to t2), since the HWC always controls CWB write-back, the HWC also mats the image refreshed at time t11 to obtain a target image, and the noise algorithm library also calculates the image noise at time t11. The brightness changes at time t12 in the non-integration period, and the noise algorithm library also calculates the backlight noise at time t12. However, when the target ambient light is calculated, the required fusion noise is the fusion noise that interferes with the initial ambient light obtained in the current integration period; therefore, the target ambient light of the current integration period can be obtained without the image noise at t11 and the backlight noise at t12. In practical applications, after the noise algorithm library calculates the image noise at t11 and the backlight noise at t12, it still needs to store them in the noise memory.
The above examples describe the process of acquiring the target ambient light from the perspective of the technical architecture of fig. 7 and from the perspective of the acquisition cycle of the ambient light sensor of fig. 9, respectively. The timing diagram for acquiring the target ambient light provided by the embodiment shown in fig. 10 will be described below with reference to the technical architecture shown in fig. 7 and one acquisition cycle of the ambient light sensor shown in fig. 9.
As can be understood from the above description, the process in which image refresh triggers the AP processor to calculate image noise, the process in which brightness adjustment triggers the AP processor to calculate backlight noise, and the process in which the SCP processor controls the underlying ambient light sensor hardware to collect the initial ambient light are performed independently, with no fixed chronological order among them. The noise calculation library of the AP processor processes the target image, the brightness and the initial ambient light obtained in these three independent processes to obtain the target ambient light.
The same reference numbers for steps in the embodiment of fig. 10 and steps in the technical architecture of fig. 7 indicate that the same steps are performed. In order to avoid repetitive description, the contents detailed in the embodiment shown in fig. 7 will be briefly described in the embodiment shown in fig. 10.
In connection with fig. 9, the image is refreshed starting from time t0. At the same time, the ambient light sensor enters an integration period and begins collecting the initial ambient light.
Accordingly, in fig. 10, in step E1, the ambient light sensor in the co-hardware layer of the SCP processor enters an integration period and collects the initial ambient light starting from time t0 (t01).
In step A1, the image is refreshed at time t0 (t01), and the SurfaceFlinger in the native framework layer of the AP processor sends the display parameters of each interface to the HWC in the hardware abstraction layer of the AP processor. The HWC may send the received display parameters of each layer interface to the hardware at the bottom of the HWC, which synthesizes the image from each layer interface according to those display parameters and returns the synthesized image to the HWC.
In step A2, the HWC in the hardware abstraction layer of the AP processor sends the synthesized image to the OLED driver in the kernel layer of the AP processor. In step A3, the OLED driver in the kernel layer of the AP processor sends the synthesized image to the display subsystem in the hardware layer of the AP processor.
In step a4, the display subsystem in the hardware layer of the AP processor stores the image before display in the CWB memory in the kernel layer of the AP processor.
In the embodiment of the present application, the HWC waits for a successful store signal from the display subsystem after sending the synthesized image to the OLED driver.
The display subsystem sends a signal to the HWC that the pre-display image was successfully stored in the CWB memory. After receiving this signal, the HWC performs the matting operation on the pre-display image stored in the CWB memory in the kernel layer to obtain the target image.
In step a5, the HWC in the hardware abstraction layer of the AP processor abstracts the target image from the pre-rendering image stored in the CWB memory in the kernel layer.
In step A6, after the HWC in the hardware abstraction layer of the AP processor obtains the target image, it sends the target image to the noise algorithm library of the same layer; after receiving the target image, the noise algorithm library calculates the image noise at time t01 from the target image and the cached current brightness information. During the execution of steps A1 through A6, the ambient light sensor in the co-hardware layer of the SCP processor remains in the integration process of one acquisition cycle.
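The wait-for-storage handshake between the display subsystem and the HWC described in steps A4 to A6 can be sketched as follows. This is a minimal illustration assuming a simple condition-variable signal; all class and method names are hypothetical:

```cpp
#include <condition_variable>
#include <mutex>

// Illustrative sketch of the HWC waiting for the display subsystem's
// "stored successfully" signal before matting the target image from the
// CWB memory; all names are hypothetical.
class CwbStoreWaiter {
public:
    // Called from the display subsystem side once the pre-display image
    // has been written to the CWB memory.
    void onStoreSuccess() {
        std::lock_guard<std::mutex> lock(mutex_);
        stored_ = true;
        cv_.notify_one();
    }

    // Called by the HWC after sending the synthesized image to the OLED
    // driver; blocks until the CWB memory holds the pre-display image.
    void waitForStore() {
        std::unique_lock<std::mutex> lock(mutex_);
        cv_.wait(lock, [this] { return stored_; });
        stored_ = false;  // re-arm for the next refresh frame
    }

private:
    std::mutex mutex_;
    std::condition_variable cv_;
    bool stored_ = false;
};
```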
In conjunction with fig. 9, at time t02, the ambient light sensor is still in the integration period, collecting the initial ambient light. At time t02, the brightness of the display screen changes, triggering the execution of step B1 in fig. 10.
In fig. 10, in step B1 (step A5' in the architecture shown in fig. 7), the HWC of the hardware abstraction layer of the AP processor obtains the brightness information of time t02 from a kernel node in the kernel layer of the AP processor.
In step B2 (step A6), the HWC of the hardware abstraction layer of the AP processor sends the brightness information of time t02 to the noise algorithm library, and the noise algorithm library calculates the backlight noise at time t02 from the brightness information of time t02 and the cached currently displayed target image.
During the execution of steps B1 through B2, the ambient light sensor in the co-hardware layer of the SCP processor is always in the integration process for one acquisition period.
After step B2, the noise memory of the noise algorithm library stores the image noise at t01 and the backlight noise at t02.
In conjunction with fig. 9, at time t03, the ambient light sensor is still in the integration period, collecting the initial ambient light. At time t03, the image is refreshed once.
In fig. 10, since the image is refreshed, steps C1 to C6 are then performed; for steps C1 to C6, refer to the descriptions of A1 to A6, which are not repeated here.
During the execution of steps C1 through C6, the ambient light sensor in the co-hardware layer of the SCP processor is still in the integration process for one acquisition period.
After step C6, the noise memory of the noise algorithm library stores the image noise at t01, the backlight noise at t02 and the image noise at t03.
Referring to fig. 9, at time t04, the ambient light sensor is still in the integration period, collecting the initial ambient light. At time t04, the image is refreshed once.
In fig. 10, since the image is refreshed, steps D1 to D6 are then performed; for steps D1 to D6, refer to the descriptions of A1 to A6, which are not repeated here.
During the execution of steps D1 through D6, the ambient light sensor in the co-hardware layer of the SCP processor is still in the integration process for one acquisition period.
After step D6, the noise memory of the noise algorithm library stores the image noise at t01, the backlight noise at t02, the image noise at t03 and the image noise at t04.
In conjunction with fig. 9, at time t1 the current integration of the ambient light sensor ends. When the integration ends (time t1), the ambient light sensor obtains the initial ambient light, and in fig. 10 the SCP processor starts to execute step E2, step E3 and step E4, transmitting the relevant data of the initial ambient light (the raw values on the RGBC four channels, the integration start time and the integration end time) to the HWC of the hardware abstraction layer of the AP processor.
In conjunction with fig. 9, during the non-integration period the image may also be refreshed (e.g., the image refresh at t11) and the brightness may also change (e.g., the brightness change at t12). Therefore, in the non-integration period, steps F1 to F6 still exist in fig. 10 (steps F1 to F5 are omitted in fig. 10; refer to steps A1 to A5), so that the image noise at t11 is stored in the noise memory of the noise algorithm library. In the non-integration period, steps G1 to G2 also still exist (step G1 is omitted in fig. 10; refer to step B1), so that the backlight noise at t12 is stored in the noise memory of the noise algorithm library.
In step A6', the HWC in the hardware abstraction layer of the AP processor sends the data of the initial ambient light to the noise algorithm library. The noise algorithm library calculates the target ambient light from the data of the initial ambient light and the image noise and backlight noise that interfere with the initial ambient light.
As can be understood from fig. 10, the integration start time and the integration end time of the ambient light sensor are controlled by the sensor's own clock; the process in which the AP processor calculates the image noise is driven by image refreshes; and the process in which the AP processor calculates the backlight noise is driven by backlight adjustments. Therefore, the execution of step A1 (or step C1, step D1, step F1) is triggered by an image refresh, and the execution of step B1 (or step G1) is triggered by a brightness adjustment. The integration start time and the integration end time of the ambient light sensor are determined entirely by the preset acquisition period and the duration of each integration. Thus, the execution of step E2 is triggered by the event that the ambient light sensor integration ends.
From the triggering event perspective, these three processes are completely independent. However, the results obtained by the three processes (image noise, backlight noise, and initial ambient light) are correlated by the denoising process after the ambient light sensor integration period is over. The initial ambient light fused in the denoising process is the initial ambient light collected by the ambient light sensor in the current collection period, and the image noise and the backlight noise removed in the denoising process are image noise and backlight noise which can cause interference on the initial ambient light collected in the current collection period.
By analyzing the structure of the under-screen ambient light, the embodiment of the application finds that the factors disturbing the ambient light collected by the ambient light sensor include the display content of the display area directly above the photosensitive area of the ambient light sensor and of the display area directly above a certain region around the photosensitive area. The display content comprises two parts: the RGB pixel information and the luminance information of the displayed image. Therefore, the noise calculation library in the embodiment of the present application obtains the fusion noise by fusing the RGB pixel information and the luminance information of the target image, and then obtains the integral noise of the integration period of the initial ambient light from the fusion noise. The target ambient light is obtained by removing, from the initial ambient light obtained over the integration period of the ambient light sensor, the integral noise that interferes with it. Because the interfering part is removed, accurate target ambient light can be obtained, and the method has strong universality.
In addition, since the AP processor of the electronic device can obtain the target image and the luminance information, the AP processor accordingly obtains the image noise and the backlight noise, while the SCP processor obtains the initial ambient light. Thus, the SCP processor may send the initial ambient light to the AP processor, where the initial ambient light and the fusion noise are processed to obtain the target ambient light. This avoids the AP processor frequently sending the target image (or image noise) and the brightness information (or backlight noise) to the SCP processor, which would make inter-core communication too frequent and consume more power.
Furthermore, the DSS in the AP processor may store the pre-display image (the image to be displayed in the current refresh of the display screen) in the CWB memory. The HWC in the AP processor extracts the target image from the pre-display image stored in the CWB memory in order to calculate the fusion noise; the fusion noise obtained in this way is accurate, and the power consumption is low.
It should be noted that when the display screen displays an image, the brightness of the display screen needs to be adjusted according to the target ambient light; when the display screen displays no image, there is no need to adjust the brightness according to the target ambient light. Therefore, the AP processor also needs to monitor screen-on and screen-off events of the display screen. When the screen is on, the ambient light detection method provided by the embodiment of the application is executed. When the screen is off, the AP processor may skip steps A4 through A6; similarly, the SCP processor may control the ambient light sensor to stop collecting the initial ambient light, and may skip steps E2 to E5.
To provide a clearer understanding of the execution inside the AP processor, a timing diagram between the various modules inside the AP processor will be described, taking as an example the acquisition of the image noise at time t01 and the backlight noise at time t02 in the embodiment shown in fig. 10.
In the embodiment shown in fig. 11, when refreshing an image, the respective modules in the AP processor perform the following steps:
In step 1100, after the display engine service obtains the display parameters of the interface to be displayed from the application in the application layer, the display engine service sends the display parameters of the interface to be displayed to the SurfaceFlinger.
In step 1101, after the SurfaceFlinger obtains the display parameters of the interface to be displayed of application A from the display engine service, it sends the display parameters (e.g., memory address, color, etc.) of each interface (the interface to be displayed of application A, the status bar interface, etc.) to the HWC through interfaces such as setLayerBuffer and setLayerColor.
In step 1102, after the HWC receives the display parameters of each interface, the hardware at the bottom layer of the HWC synthesizes an image according to those display parameters.
In step 1103, the HWC obtains the image synthesized by the hardware on the bottom layer, and sends the synthesized image to the OLED driver.
And step 1104, after receiving the synthesized image sent by the HWC, the OLED driver sends the synthesized image to the display subsystem.
Step 1105, after the display subsystem receives the synthesized image, it performs a secondary processing on the synthesized image to obtain the image before display.
At step 1106, the display subsystem stores the pre-display image in the CWB memory.
It should be noted that, since the OLED screen needs to refresh the image, the display subsystem also needs to send the pre-display image to the display screen for display.
In the embodiment of the application, the step in which the display subsystem sends the pre-display image to the display screen and the step in which the display subsystem stores the pre-display image in the CWB memory are two independent steps with no strict order between them.
In step 1107, after the display subsystem successfully stores the pre-display image in the CWB memory, it may send a signal to the HWC that the storage was successful.
In step 1108, after receiving the signal that the storage succeeded, the HWC performs matting on the pre-display image stored in the CWB memory to obtain the target image, and the time at which the HWC starts to obtain the target image is used as the timestamp of the target image.
In step 1109, the HWC sends the target image and the timestamp to the noise algorithm library after acquiring the target image and the timestamp.
In step 1110, the noise algorithm library calculates the image noise at the refresh time corresponding to the target image (the image noise at time t01). The timestamp of the image noise is the timestamp of the target image from which the image noise is obtained. The noise algorithm library stores the image noise and its timestamp.
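As an illustration of how such timestamped noise entries might be represented, a minimal sketch follows; the types and names are assumptions, and the noise memory is stood in for by a standard FIFO container:

```cpp
#include <cstdint>
#include <deque>

// Hypothetical representation of one fusion-noise entry in the noise
// memory: the noise is kept as raw values on four channels, together
// with the timestamp of the target image (or brightness) it was
// computed from.
struct NoiseEntry {
    double  raw[4];      // fusion noise as RGBC raw values
    int64_t timestampNs; // timestamp of the target image / brightness
};

// The noise memory described in this embodiment behaves like a FIFO,
// which a std::deque can stand in for in this sketch.
using NoiseMemory = std::deque<NoiseEntry>;
```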
During brightness adjustment, each submodule in the AP processor executes the following steps:
In step 1111, after the display engine service obtains the brightness to be adjusted from application A in the application layer, the display engine service sends the brightness to be adjusted to the kernel node.
In step 1112, after detecting that the data in the kernel node has changed, the HWC acquires the brightness to be adjusted from the kernel node. The time at which the HWC starts to retrieve the brightness to be adjusted from the kernel node is the timestamp of the brightness to be adjusted.
In practical applications, the HWC always listens to the kernel node for data changes.
In step 1113, the HWC sends the brightness to be adjusted and its timestamp to the noise algorithm library.
In step 1114, the noise algorithm library calculates the backlight noise at the adjustment time of the brightness to be adjusted (the backlight noise at time t02). The timestamp of the backlight noise is the timestamp of the brightness to be adjusted from which the backlight noise is obtained. The noise algorithm library stores the backlight noise and its timestamp.
After the end of an integration period, the SCP processor sends the initial ambient light collected during the integration period to the HWC in the AP processor.
In step 1115, the HWC of the AP processor receives the initial ambient light sent by the SCP processor and the integration start time and the integration end time of the initial ambient light.
In step 1116, after receiving the initial ambient light and its integration start time and integration end time from the SCP processor, the HWC sends them to the noise algorithm library.
In step 1117, the noise algorithm library calculates the integral noise according to the image noise and its timestamp, the backlight noise and its timestamp, and the integration start time and integration end time of the initial ambient light. The noise algorithm library then calculates the target ambient light according to the integral noise and the initial ambient light.
The embodiment of the application mainly describes a sequential logic diagram among modules when the AP processor obtains target ambient light.
In the above embodiments, the example is that the AP processor calculates the fusion noise after acquiring the target image and the luminance information; the SCP processor sends the initial ambient light to the AP processor after collecting it; the AP processor processes the fusion noise to obtain the integral noise of the integration time period of the initial ambient light, and then obtains the target ambient light according to the initial ambient light and the integral noise.
In practical application, the AP processor may also send the target image and the brightness information to the SCP processor after obtaining the target image and the brightness information. The SCP processor fuses the target image and the brightness information to obtain fusion noise and integral noise of an integral time period of the initial ambient light, and then obtains the target ambient light according to the fusion noise and the initial ambient light.
In practical application, after the AP processor acquires the target image and the brightness information, the AP processor calculates the fusion noise and sends the fusion noise obtained by calculation to the SCP processor. The SCP processor obtains integral noise of an integral time period according to the received fusion noise, and obtains target ambient light according to the integral noise of the integral time period and initial ambient light collected by the ambient light sensor.
Referring to fig. 12, in this embodiment of the present disclosure the fusion noise is calculated at the AP processor, while the integral noise is calculated at the SCP processor, and the target ambient light is obtained according to the integral noise and the initial ambient light.
As mentioned above, the process of obtaining the target ambient light can be briefly described as follows:
step 1, calculating image noise according to a target image.
And 2, calculating backlight noise according to the brightness.
And 3, calculating target ambient light (raw values on four channels) according to the image noise, the backlight noise and the initial ambient light.
In the technical architecture shown in fig. 7, step 3, calculating the target ambient light according to the image noise, the backlight noise and the initial ambient light, is implemented in the noise algorithm library of the AP processor. The noise algorithm library of the AP processor can calculate the image noise and the backlight noise, while the initial ambient light comes from the ambient light sensor driver of the SCP processor. Therefore, the noise algorithm library of the AP processor needs to obtain the relevant data of the initial ambient light reported by the SCP processor (steps E3 to E4). The AP processor finally returns the calculated raw values on the four channels of the target ambient light to the SCP processor, which obtains the lux value of the target ambient light (step A8, step E5, step E6).
In the technical architecture shown in fig. 12, step 3, calculating the target ambient light according to the image noise, the backlight noise and the initial ambient light, is implemented in the denoising module of the SCP processor. The image noise and the backlight noise are acquired by the AP processor, and the initial ambient light is acquired by the ambient light sensor driver of the SCP processor. Therefore, the denoising module of the SCP processor needs to acquire the image noise and backlight noise transmitted by the AP processor (step A8, step E5, step E6), and also needs the initial ambient light transmitted by the ambient light sensor driver of the SCP processor (step E3).
In view of the above analysis, in the technical architecture shown in fig. 7, the calculations of step 1 to step 3 need to be implemented in the noise algorithm library of the AP processor. In the technical architecture shown in fig. 12, step 1 and step 2 need to be implemented in the noise algorithm library of the AP processor, and step 3 needs to be implemented in the denoising module of the SCP processor.
For a clearer understanding of the process of obtaining the target ambient light corresponding to the technical architecture shown in fig. 12, reference is made to the timing chart shown in fig. 13. In connection with the events at the various times in fig. 9, the image is refreshed starting from time t0. At the same time, the ambient light sensor enters an integration period and begins collecting the initial ambient light.
Accordingly, in fig. 13, in step E1, the ambient light sensor in the co-hardware layer of the SCP processor enters an integration period and collects the initial ambient light starting from time t0 (t01).
Steps A1 through A6 refer to the description of steps A1 through A6 in the example of FIG. 7.
In step A7, the noise algorithm library in the hardware abstraction layer of the AP processor sends the image noise at time t01 to the HWC of the same layer.
In step A8, after the AP processor calculates the image noise at time t01, the image noise at time t01 is sent to the ambient light sensor application in the co-application layer of the SCP processor.
In step A9 (step E5 in the architecture shown in fig. 12), the ambient light sensor application in the co-application layer of the SCP processor sends the image noise at time t01 to the noise memory in the co-driving layer of the SCP processor.
Steps B1 through B2 refer to the description of steps B1 through B2 in the embodiment of FIG. 7.
In step B3, the noise algorithm library in the hardware abstraction layer of the AP processor sends the backlight noise at time t02 to the HWC of the same layer.
In step B4, after the AP processor calculates the backlight noise at time t02, the backlight noise at time t02 is sent to the ambient light sensor application in the co-application layer of the SCP processor.
In step B5 (step E5 in the architecture shown in fig. 12), the ambient light sensor application in the co-application layer of the SCP processor sends the backlight noise at time t02 to the noise memory in the co-driving layer of the SCP processor.
Steps C1 to C9 and steps D1 to D9 refer to the descriptions of steps A1 to A9, which are not repeated here.
After the ambient light sensor integration is over, the SCP processor is triggered to perform step E2; for step E2, refer to the description in the embodiment shown in fig. 7.
In steps E3 to E6, the denoising module in the co-driving layer of the SCP processor takes the fusion noise out of the noise memory of the same layer, obtains the raw values on the four channels of the initial ambient light from the ambient light sensor of the same layer, and calculates the target ambient light according to the raw values on the four channels of the initial ambient light and the image noise and backlight noise that interfere with it. During the non-integration period, the image may also be refreshed (e.g., the image refresh at t11) and the brightness may also change (e.g., the brightness change at t12). Therefore, in the non-integration period, steps F1 to F9 still exist in fig. 13 (steps F1 to F5 are omitted in fig. 13; refer to steps A1 to A5 in fig. 13), so that the image noise at t11 is stored in the noise memory of the SCP processor. In the non-integration period, steps G1 to G5 also still exist (step G1 is omitted in fig. 13; refer to step B1 in fig. 13), so that the backlight noise at t12 is stored in the noise memory.
The process by which the noise algorithm library in the embodiment shown in fig. 7 calculates the target ambient light from the target image, the brightness and the initial ambient light will be described below.
Step one: when the noise calculation library obtains a target image, it calculates the image noise at the refresh time of the target image according to the target image and the brightness of the display screen at that refresh time; when the noise calculation library obtains a brightness, it calculates the backlight noise at the brightness adjustment time according to the brightness and the target image displayed at that time.
Although image noise and backlight noise have different names, both are calculated from one frame of target image and one luminance value.
The target image is composed of a plurality of pixel points. First, a weighted sum operation is performed on the RGB value of each pixel point with the weighting coefficient of each pixel point to obtain the weighted RGB value of the target image. The weighting coefficient of each pixel point is determined according to the distance between the coordinates of the pixel point and the reference coordinates of the target image; the coordinates of the center point of the photosensitive area of the ambient light sensor may be used as the reference coordinates.
Step two: the noise calculation library obtains the fusion noise according to the weighted RGB value and the brightness of the target image. The fusion noise may be obtained by a table lookup (the table stores the fusion noise corresponding to each weighted RGB value and luminance) or by a preset functional relationship (the independent variables are the weighted RGB value of the target image and the luminance, and the dependent variable is the fusion noise). The fusion noise obtained here is a raw value on four channels.
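A minimal sketch of steps one and two follows. The inverse-distance weighting curve and the linear stand-in for the calibration table are both illustrative assumptions; the actual weighting function and lookup table are not disclosed at this level of detail:

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

struct Pixel { int x, y; uint8_t r, g, b; };

// Weight that decays with distance to the reference coordinates, e.g. the
// center of the photosensitive area of the ambient light sensor. The decay
// curve is illustrative, not the one used by this application.
static double weight(int x, int y, int refX, int refY) {
    double d = std::hypot(double(x - refX), double(y - refY));
    return 1.0 / (1.0 + d);
}

// Step one: weighted sum of the RGB values of all pixels of the target image.
static void weightedRgb(const std::vector<Pixel>& img, int refX, int refY,
                        double out[3]) {
    out[0] = out[1] = out[2] = 0.0;
    for (const Pixel& p : img) {
        double w = weight(p.x, p.y, refX, refY);
        out[0] += w * p.r;
        out[1] += w * p.g;
        out[2] += w * p.b;
    }
}

// Step two: map the weighted RGB value and the brightness to fusion noise
// (raw values on four channels). A real device would use a calibration
// table or fitted function; the linear form here is purely illustrative.
static void fusionNoise(const double rgb[3], double brightness, double raw[4]) {
    double luma = 0.30 * rgb[0] + 0.59 * rgb[1] + 0.11 * rgb[2];
    for (int ch = 0; ch < 4; ++ch)
        raw[ch] = brightness * luma * (ch + 1) * 1e-6;  // per-channel scale: an assumption
}
```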
Step three: the noise calculation library calculates the integral noise within the integration time period of the initial ambient light according to the fusion noise at each time.
It should be noted that image noise is not generated by the image refresh process itself. Within the integration time period, during the time period before an image refresh, the interference with the initial ambient light is the image noise corresponding to the image before the refresh; during the time period after the image refresh, the interference is the image noise corresponding to the refreshed image.
Similarly, backlight noise is not generated by the brightness adjustment process itself. Within the integration time period, during the time period before a brightness adjustment, the interference with the initial ambient light is the backlight noise corresponding to the brightness before the adjustment; during the time period after the brightness adjustment, the interference is the backlight noise corresponding to the adjusted brightness.
As described above, the noise memory stores the image noise and the backlight noise at each time point calculated by the noise algorithm library. The noise stored in the noise memory is collectively referred to as fusion noise or first noise.
Step a1: the first processor takes a first noise out of the exit position of the noise memory through the noise algorithm library, and updates the exit position of the noise memory or the first noise at the exit position;
Step B1: if the timestamp of the currently taken-out first noise is at or before the first time, the first processor continues to execute step a1 through the noise algorithm library until the timestamp of the currently taken-out first noise is after the first time;
Step B2: if the timestamp of the currently taken-out first noise is after the first time, the first processor performs the following steps through the noise algorithm library:
Step C1: if the currently taken-out first noise is the first one whose timestamp is after the first time, and its timestamp is before the second time, the integral noise between the first time and the time corresponding to that timestamp is calculated according to the previously taken-out first noise, and execution continues from step a1;
Step C2: if the currently taken-out first noise is the first one whose timestamp is after the first time, and its timestamp is at or after the second time, the integral noise between the first time and the second time is calculated according to the previously taken-out first noise, and execution continues from step D1;
Step C3: if the currently taken-out first noise is not the first one whose timestamp is after the first time, and its timestamp is before the second time, the integral noise between the time corresponding to the timestamp of the previously taken-out first noise and the time corresponding to the timestamp of the currently taken-out first noise is calculated according to the previously taken-out first noise, and execution continues from step a1;
Step C4: if the currently taken-out first noise is not the first one whose timestamp is after the first time, and its timestamp is at or after the second time, the integral noise between the time corresponding to the timestamp of the previously taken-out first noise and the second time is calculated according to the previously taken-out first noise, and execution continues from step D1;
Step D1: the second value is obtained according to the integral noise between the first time and the second time and the first value. A sketch of this traversal is given below.
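A minimal C++ sketch of this traversal follows, under the assumption that the first noise records carry RGBC raw values and nanosecond timestamps; steps a1, B1/B2 and C1 to C4 map onto the loop as marked in the comments:

```cpp
#include <array>
#include <cstdint>
#include <deque>

struct FirstNoise {
    std::array<double, 4> raw;  // fusion noise as RGBC raw values
    int64_t timestampNs;        // timestamp of the underlying image/brightness
};

// Every branch C1 to C4 reduces to "fusion noise times sub-period duration".
static void accumulate(std::array<double, 4>& total,
                       const std::array<double, 4>& noise,
                       int64_t fromNs, int64_t toNs) {
    double dt = double(toNs - fromNs) * 1e-9;  // seconds
    for (int ch = 0; ch < 4; ++ch) total[ch] += noise[ch] * dt;
}

static std::array<double, 4> integralNoise(std::deque<FirstNoise>& fifo,
                                           int64_t firstTimeNs,
                                           int64_t secondTimeNs) {
    std::array<double, 4> total{};
    FirstNoise last{};            // the most recently taken-out first noise
    bool haveLast = false;        // false until any record has been taken out
    int64_t segmentStart = firstTimeNs;

    while (!fifo.empty()) {
        FirstNoise cur = fifo.front();  // step a1: take from the exit position
        fifo.pop_front();
        if (cur.timestampNs <= firstTimeNs) {  // step B1: keep taking
            last = cur;
            haveLast = true;
            continue;
        }
        // Step B2: the current timestamp is after the first time.
        if (cur.timestampNs < secondTimeNs) {  // steps C1 and C3
            if (haveLast) accumulate(total, last.raw, segmentStart, cur.timestampNs);
            segmentStart = cur.timestampNs;
            last = cur;
            haveLast = true;
        } else {                               // steps C2 and C4
            if (haveLast) accumulate(total, last.raw, segmentStart, secondTimeNs);
            return total;  // hand the total to step D1; a real implementation
                           // would retain cur for the next integration period
        }
    }
    // FIFO exhausted: the last noise interferes until the second time.
    if (haveLast) accumulate(total, last.raw, segmentStart, secondTimeNs);
    return total;
}
```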
The noise memory may be a FIFO (First In First Out) memory. A FIFO memory is a first-in first-out dual-port buffer: one of its two ports is the input port and the other is the output port. In this memory structure, the data that entered the memory first is shifted out first; correspondingly, the order of the shifted-out data is consistent with the order of the input data. The exit position of the FIFO memory is the storage address corresponding to its output port.
When the FIFO memory shifts out one datum, the process is as follows: the fusion noise stored at the exit position (the first position) is removed from the exit position, then the data at the second position from the exit is moved to the exit position, the data at the third position is moved to the second position, and so on.
Of course, in practical applications, after the fusion noise stored at the first position (A1) is removed from the exit position, the exit position of the memory may instead be updated to the second position (A2); after the fusion noise stored at the current exit position (A2) is removed, the exit position is updated to the third position (A3), and so on.
The process of obtaining the second value based on the above-described calculation may refer to the embodiment described with reference to fig. 14 to the embodiment shown in fig. 16.
Referring to fig. 14, fig. 14 shows the process by which the noise calculation library in the AP processor provided in the embodiment of the present application calculates the integral noise from the image noise and the backlight noise. The various times in the process correspond to the descriptions of the times in the embodiments shown in fig. 9 and fig. 10: the image is refreshed at t01 to obtain the image noise at t01; the brightness is adjusted at t02 to obtain the backlight noise at t02; the image is refreshed at t03 to obtain the image noise at t03; the image is refreshed at t04 to obtain the image noise at t04.
From time t01 to time t02, the displayed image is the image refreshed at t01, and the brightness of the display screen is the brightness at t01 (the brightness value most recently stored in the noise algorithm library before t01); the image noise at t01 is the noise produced when the image refreshed at t01 is displayed at the brightness of t01. Thus, the initial ambient light includes the image noise with duration "t02-t01" and timestamp t01.
From time t02 to time t03, the brightness of the display screen is the brightness adjusted at t02, and the displayed image is the image refreshed at t01; the backlight noise at t02 is the noise produced when the brightness adjusted at t02 is applied while the image refreshed at t01 is displayed. Thus, the initial ambient light includes the backlight noise with duration "t03-t02" and timestamp t02.
From time t03 to time t04, the displayed image is the image refreshed at t03, and the brightness of the display screen is the brightness adjusted at t02; the image noise at t03 is the noise produced when the image refreshed at t03 is displayed at the brightness adjusted at t02. Thus, the initial ambient light includes the image noise with duration "t04-t03" and timestamp t03.
From time t04 to time t1, the displayed image is the image refreshed at t04, and the brightness of the display screen is the brightness adjusted at t02; the image noise at t04 is the noise produced when the image refreshed at t04 is displayed at the brightness adjusted at t02. Thus, the initial ambient light includes the image noise with duration "t1-t04" and timestamp t04.
Based on the above understanding, when the AP processor calculates the integral noise:
the image noise at t01 interferes with the initial ambient light from t01 to t02;
the backlight noise at t02 interferes with the initial ambient light from t02 to t03;
the image noise at t03 interferes with the initial ambient light from t03 to t04;
the image noise at t04 interferes with the initial ambient light from t04 to t1.
Thus, the integral noise from t01 to t02, the integral noise from t02 to t03, the integral noise from t03 to t04 and the integral noise from t04 to t1 can be calculated separately.
The integral noise from t01 to t02 is: F(t01) × (t02 - t01);
the integral noise from t02 to t03 is: F(t02) × (t03 - t02);
the integral noise from t03 to t04 is: F(t03) × (t04 - t03);
the integral noise from t04 to t1 is: F(t04) × (t1 - t04);
wherein F(t01), F(t02), F(t03) and F(t04) represent the fusion noise at times t01, t02, t03 and t04, respectively, each held constant over its sub-period.
The sum of the integral noises of the sub-periods within the integration period (t01 to t02, t02 to t03, t03 to t04, t04 to t1) is the integral noise of the whole integration period.
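As a purely illustrative arithmetic example with hypothetical numbers (none of which come from this application): suppose that, on one channel, the fusion noise is 10 counts/ms from t01 to t02 (a 20 ms sub-period), 8 counts/ms from t02 to t03 (30 ms), 12 counts/ms from t03 to t04 (25 ms) and 9 counts/ms from t04 to t1 (25 ms). The integral noise of that channel over the whole integration period is then 10×20 + 8×30 + 12×25 + 9×25 = 200 + 240 + 300 + 225 = 965 counts, and this is the amount removed from the raw value collected on that channel.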
The start time of the integration period in the above example is just the time of image refresh, i.e., the image noise at the start time of the integration period can be obtained.
In practical applications, it is possible that the start time of the integration period is neither an image refresh time nor a backlight adjustment time. In this case, it is necessary to acquire the fusion noise corresponding to the latest change time (image refresh time or backlight adjustment time) before the start of the current integration period.
Referring to fig. 15, for the integration time period (t0 to t1) obtained by the noise calculation library in the AP processor provided in the embodiment of the present application, t01 is not the start time of the current integration time period but an image refresh time within it. The latest change time (image refresh time or brightness adjustment time) before the start of the current integration time period is t-1, which is an image refresh time.
Referring to fig. 16, if the latest change time before the start of the current integration period is an image refresh time, the image noise corresponding to that refresh time interferes with the initial ambient light from t0 to t01.
Of course, if the latest change time is a brightness adjustment time, the backlight noise corresponding to that adjustment time interferes with the initial ambient light from t0 to t01.
In the embodiment shown in fig. 16, the integral noise corresponding to each sub-period in the integration period is:
the integral noise from t0 to t01 is: F(t-1) × (t01 - t0);
the integral noise from t01 to t02 is: F(t01) × (t02 - t01);
the integral noise from t02 to t03 is: F(t02) × (t03 - t02);
the integral noise from t03 to t04 is: F(t03) × (t04 - t03);
the integral noise from t04 to t1 is: F(t04) × (t1 - t04);
wherein F(t-1), F(t01), F(t02), F(t03) and F(t04) represent the fusion noise at times t-1, t01, t02, t03 and t04, respectively.
As can be understood from the above example, the obtained integral noise is also a raw value on four channels.
The timestamps in the above examples are all different. In practical applications, the HWC may perform both the process of acquiring the target image and the process of acquiring the brightness to be adjusted within one time measurement unit (e.g., within 1 ms); in that case, the timestamp of the target image and the timestamp of the brightness to be adjusted are the same.
If a target image and a brightness value have the same timestamp and the noise algorithm library receives the target image first, the noise algorithm library calculates the image noise from the target image and the latest brightness value before it; when calculating the backlight noise corresponding to that brightness value, it calculates the backlight noise from the target image and the brightness value with the same timestamp.
If a target image and a brightness value have the same timestamp and the noise algorithm library receives the brightness value first, the noise algorithm library calculates the backlight noise from the brightness value and the latest target image before it; when calculating the image noise corresponding to that target image, it calculates the image noise from the target image and the brightness value with the same timestamp.
For example, if the noise algorithm library receives the target image first, it calculates the image noise first and stores the image noise in the noise memory first. The fusion noise stored in the noise memory is kept in time order: before storing, it is judged whether the timestamp of the fusion noise to be stored is after the timestamp of the most recently stored fusion noise; if so, the fusion noise is stored, and if it is at or before that timestamp, the noise to be stored is discarded. Therefore, the backlight noise calculated later, carrying the same timestamp, is discarded.
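The time-ordering rule can be sketched as a small guard on the store operation; the types and names are illustrative:

```cpp
#include <array>
#include <cstdint>
#include <deque>

struct StoredNoise {
    std::array<double, 4> raw;  // fusion noise as RGBC raw values
    int64_t timestampNs;
};

// A new fusion-noise entry is stored only if its timestamp is strictly
// later than the timestamp of the most recently stored entry; otherwise
// it is discarded, as described above.
static bool storeNoise(std::deque<StoredNoise>& memory, const StoredNoise& n) {
    if (!memory.empty() && n.timestampNs <= memory.back().timestampNs)
        return false;  // same or earlier timestamp: discard
    memory.push_back(n);
    return true;
}
```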
In practical applications, the timestamp of the target image may be the time at which the HWC starts to take the target image out of the CWB write-back memory, and the timestamp of the brightness value may be the time at which the HWC starts to retrieve the brightness value from the kernel node. The HWC may switch to obtaining the brightness value while it is obtaining the target image. In that case, the HWC starts capturing the target image first and the brightness value afterwards, so the timestamp of the brightness value is later than the timestamp of the target image; yet the HWC may finish obtaining the brightness value first and send it to the noise algorithm library, which calculates and stores the backlight noise, and only afterwards obtain the target image and send it to the noise algorithm library, which calculates and stores the image noise. This results in the timestamp of the image noise about to be stored being before the timestamp of the most recently stored backlight noise.
Step four: the noise algorithm library removes the integral noise of the whole integration time period from the initial ambient light to obtain the target ambient light.
In the embodiment of the present application, the initial ambient light sent by the SCP processor to the HWC of the AP processor is initial ambient light data in the form of RGBC four-channel raw values, and the HWC sends that data, still in the form of RGBC four-channel raw values, to the noise algorithm library. The raw values on the four channels of the integral noise were obtained in step three. Therefore, in this step, the raw values on the four channels of the target ambient light can be obtained by operating on the four-channel raw values of the initial ambient light and the four-channel raw values of the integral noise.
After calculating and obtaining raw values on four channels of the target ambient light, the noise algorithm library can send the raw values on the four channels of the target ambient light to the SCP processor, and the SCP processor calculates and obtains the lux value of the target ambient light according to the raw values on the four channels of the target ambient light.
As an example, the lux value may be obtained as a weighted sum, multiplying the raw value of each channel by a per-channel coefficient (which may be provided by the manufacturer of the ambient light sensor).
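Step four and the lux conversion can be sketched as follows; the channel coefficients shown are placeholders rather than any manufacturer's real values, and the clamping to zero is an added assumption:

```cpp
#include <array>

// Step four: remove the integral noise channel by channel from the
// initial ambient light to obtain the target ambient light raw values.
static std::array<double, 4> removeNoise(const std::array<double, 4>& initialRaw,
                                         const std::array<double, 4>& integralNoise) {
    std::array<double, 4> target{};
    for (int ch = 0; ch < 4; ++ch) {
        double v = initialRaw[ch] - integralNoise[ch];
        target[ch] = v > 0.0 ? v : 0.0;  // clamp: an assumption for robustness
    }
    return target;
}

// Lux as a weighted sum of the four channel raw values; the coefficients
// are hypothetical stand-ins for manufacturer-provided values.
static double toLux(const std::array<double, 4>& raw) {
    const double coeff[4] = {0.21, 0.72, 0.07, 0.05};
    double lux = 0.0;
    for (int ch = 0; ch < 4; ++ch) lux += coeff[ch] * raw[ch];
    return lux;
}
```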
As described above, each time the electronic device refreshes an image (or every other frame, every two frames, etc.), the DSS in the AP processor stores the pre-display image (which may be understood as the image to be refreshed in the current refresh process, or the image after the current refresh) in the CWB memory. The HWC in the AP processor performs matting on the pre-display image stored in the CWB memory to obtain the target image and then sends the target image to the noise algorithm library. For convenience of description, the step in which the DSS in the AP processor stores the pre-display image in the CWB memory and the HWC obtains the target image from the CWB memory is denoted as the CWB write-back function.
In the started state of the CWB write-back function of the AP processor, the DSS stores the pre-display image in the CWB memory each time the electronic device refreshes the image (or every other frame, every two frames, etc.). After the DSS stores the pre-display image in the CWB memory, it sends a storage-success message to the HWC module. After receiving this message, the HWC may obtain the target image from the CWB memory and send it to the noise algorithm library.
That is, in the started state of the CWB write-back function of the AP processor, the AP processor performs steps A1 through A6 in fig. 7 each time the electronic device refreshes the image (or every other frame, every two frames, etc.).
In the stopped state of the CWB write-back function of the AP processor, each time the image is refreshed (or every other frame, every two frames, etc.), the electronic device performs steps A1 to A3 of the technical architecture shown in fig. 7 according to the refresh display flow. However, the DSS no longer stores the pre-display image in the CWB memory, and accordingly the AP processor no longer performs the subsequent related steps.
That is, in the stopped state of the CWB write-back function of the AP processor, the AP processor performs steps A1 to A3 in fig. 7 each time the electronic device refreshes the image (or every other frame, every two frames, etc.), but no longer performs steps A4 to A6 of the technical architecture shown in fig. 7.
As mentioned above, after receiving a target image, the noise algorithm library calculates the image noise from it. Therefore, when the electronic device refreshes the image while the CWB write-back function of the AP processor is started, the corresponding image noise is obtained; when the electronic device refreshes the image while the CWB write-back function of the AP processor is stopped, the corresponding image noise is not obtained.
Taking fig. 9 and fig. 14 as an example, after the end of the integration (time t1), the fusion noise used by the noise algorithm library to calculate the target ambient light for t0 to t1 is: the image noise at t01, the backlight noise at t02, the image noise at t03 and the image noise at t04. The fusion noise that is not needed includes at least the image noise at t11 and the backlight noise at t12. That is, the image noise and backlight noise corresponding to the time period from the start time (inclusive) to the end time of the integration time period interfere with the initial ambient light acquired by the current integration, while the image noise and backlight noise corresponding to the non-integration time period do not interfere with the initial ambient light acquired by the current integration.
Therefore, to reduce power consumption, the HWC of the AP processor may control the CWB write-back function to be started during the integration period of the ambient light sensor, so that the AP processor performs steps A4 to A6, and control the CWB write-back function to be stopped during the non-integration period, so that the AP processor no longer performs steps A4 to A6.
Referring to fig. 17, fig. 17 is a schematic diagram of a start-stop method of a CWB write-back function according to an embodiment of the present application.
As shown in fig. 17, during the integration periods of the ambient light sensor (from t0 to t1, from t2 to t3, from t4 to t5), the HWC controls the CWB write-back function to start; during the non-integration periods (from t1 to t2, from t3 to t4, from t5 to t6), the HWC controls the CWB write-back function to stop. In this way, the image noise in each integration process can be obtained while the power consumption of the AP processor is reduced.
The embodiment of the application will focus on the start-stop method of the CWB write-back function. In this method, the HWC in the AP processor may monitor whether the data in the kernel node changes during both the integration period and the non-integration period. When a change occurs, the HWC obtains the brightness to be adjusted, and accordingly the noise algorithm library can calculate the backlight noise for the entire acquisition period.
In the subsequent embodiments of the present application, it is taken as an example that the HWC in the AP processor monitors changes of the data in the kernel node during both the integration period and the non-integration period; when the data stored in the kernel node changes, the HWC acquires the brightness to be adjusted from the kernel node and transmits it to the noise algorithm library to calculate the backlight noise.
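As an illustration of this kind of kernel-node monitoring, the following sketch uses the standard Linux pattern of polling a sysfs attribute with POLLPRI; the node path is hypothetical, and the actual node used by this application is not named here:

```cpp
#include <fcntl.h>
#include <poll.h>
#include <unistd.h>
#include <cstdio>

int main() {
    // Hypothetical brightness node; a sysfs attribute signals changes via
    // sysfs_notify(), which user space observes as POLLPRI | POLLERR.
    const char* node = "/sys/class/leds/lcd-backlight/brightness";
    int fd = open(node, O_RDONLY);
    if (fd < 0) return 1;

    char buf[32];
    for (;;) {
        // Read the current value; the node must be re-read from offset 0
        // before poll() can report the next change.
        lseek(fd, 0, SEEK_SET);
        ssize_t n = read(fd, buf, sizeof(buf) - 1);
        if (n > 0) { buf[n] = '\0'; printf("brightness to adjust: %s\n", buf); }

        struct pollfd pfd = {fd, POLLPRI | POLLERR, 0};
        if (poll(&pfd, 1, -1) < 0) break;  // blocks until the node changes
    }
    close(fd);
    return 0;
}
```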
In addition, since the integration process of the ambient light sensor is controlled by the SCP processor side, the SCP processor needs to send time-related parameters during the process of acquiring the initial ambient light by the ambient light sensor to the AP processor.
As described above, each time an integration ends, the SCP processor may transmit the initial ambient light and the time parameters of the current integration (for example, the current integration end time and the current integration duration, or the current integration start time and the current integration end time, etc.) to the HWC in the AP processor. The SCP processor may also send the integration start time at which the ambient light sensor is next ready to collect the initial ambient light (or a time somewhat before it) to the HWC in the AP processor as the time at which the CWB write-back function is started, and the integration end time of the next collection (or a time somewhat after it) as the time at which the CWB write-back function is stopped. That is, the start and stop times of the CWB write-back function are sent by the SCP processor to the HWC in the AP processor.
In practical application, when the acquisition period of the ambient light sensor is fixed, the SCP processor may send the integration start time of the next acquisition of the initial ambient light to the AP processor as the time at which the CWB write-back function is started, and the AP processor calculates the time at which the CWB write-back function stops according to the received start time and the acquisition period. Alternatively, the SCP processor reports the start time of the first integration, the integration duration, the sampling period, and so on, and the AP processor determines the start time and stop time of the CWB write-back function based on these data.
It should be noted that, in the embodiment of the present application, the parameter related to the time during the process of acquiring the initial ambient light by the ambient light sensor, which is sent by the SCP processor to the AP processor, is not limited. As long as the AP processor can obtain the next starting time of the CWB write back function according to the received time-dependent parameter.
The start time of the CWB write-back function need not exactly coincide with the integration start time, and the stop time of the CWB write-back function need not exactly coincide with the integration end time; the time period corresponding to the started state of the CWB write-back function only needs to include the integration time period in the acquisition cycle. Taking one acquisition cycle as an example, the start time of the CWB write-back function is earlier than or equal to the start time of the integration period, and the stop time of the CWB write-back function is later than or equal to the end time of the integration period.
As another example, after an integration ends, the SCP processor may further send the initial ambient light, the integration duration corresponding to the initial ambient light (or the current integration start time), the current integration end time, and the sleep duration of the CWB write-back function to the HWC in the AP processor. For convenience of description, the information transmitted after the end of the integration may be collectively referred to as the first information. The first information is not limited to the above items and may include more or less information.
As described above, the noise algorithm library calculates and obtains the target ambient light based on the initial ambient light, the integration duration (or the current integration start time) corresponding to the initial ambient light, the current integration end time, and the corresponding fusion noise. The detailed process refers to the description of the above embodiments and is not repeated.
The HWC in the AP processor needs to determine the time to initiate the CWB writeback function based on the sleep duration of the AP processor.
Of course, the first message may be split into a plurality of sub-messages, which are respectively sent to the HWC of the AP processor. The embodiment of the present application does not limit this.
Taking the case where the above information is sent to the AP processor together as an example, the SCP processor may also transmit, together with the first information, the time at which the SCP processor sends it.
After receiving the first information, the HWC in the AP processor first controls the CWB write-back function to stop. The HWC in the AP processor then obtains, from part of the received first information, the start time of the CWB write-back function or the duration it still needs to wait before starting it.
Since the start time of the CWB write-back function only needs to be before the integration start time of the next cycle, it does not need to be controlled strictly at a certain point in time. Therefore, the start time of the CWB write-back function can be obtained in any of the following ways or in other ways not shown in the embodiments of the present application.
The HWC in the AP processor obtains the inter-core communication delay from the time at which the SCP processor sent the first information and the time at which the AP processor received it. Then, the HWC in the AP processor obtains, from the inter-core communication delay and the sleep duration, either the duration still to wait until the start time of the CWB write-back function (the sleep duration minus the inter-core communication delay) or the start time itself (that waiting duration added to the time at which the AP processor received the first information through the HWC). When the start time arrives, the HWC in the AP processor starts the CWB write-back function.
As an example, in the case where the total duration of the non-integration period is 300ms, the sleep duration may be 240ms, 250ms, 260ms, 270ms, 280ms, or the like. Thus, even if there is an inter-core communication delay (e.g., 1ms), the CWB write back function can be guaranteed to start before the next integration starts.
Certainly, in practical applications, the AP processor may also use the HWC to calculate, from the integration end time and the sleep duration sent by the SCP processor, either the start time of the CWB write-back function (the integration end time plus the sleep duration) or the time it should still wait before that start time (the integration end time plus the sleep duration, minus the time at which the AP processor received the first information through the HWC). In this example, the start time of the CWB write-back function is recorded as the first time. The first time is also the time reached when the second duration has elapsed after the moment the matting flag is set to the first character. The sleep duration in the first information may be recorded as the first duration. The second duration is the sleep duration minus the delay duration, where the delay duration is the time at which the HWC module received the first information minus the integration end time; the integration end time may be recorded as the second time. As described above, the first information received by the AP processor may further include the integration start time, the integration end time (the second time), the initial ambient light (the first value), the sleep duration (the first duration), and the like.
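The two calculations described above can be sketched as follows, assuming the AP and SCP clocks are aligned; the function names are invented for the example:

```cpp
#include <algorithm>
#include <cstdint>

// Variant 1: measure the inter-core communication delay and subtract it
// from the sleep duration to get the remaining time to wait.
int64_t remainingWaitMs(int64_t sleepDurationMs,
                        int64_t scpSendTimeMs,
                        int64_t apReceiveTimeMs) {
    const int64_t interCoreDelayMs = apReceiveTimeMs - scpSendTimeMs;
    return std::max<int64_t>(0, sleepDurationMs - interCoreDelayMs);
}

// Variant 2: derive the start time (the "first time") directly from the
// integration end time (the "second time") plus the sleep duration.
int64_t cwbStartTimeMs(int64_t integrationEndMs, int64_t sleepDurationMs) {
    return integrationEndMs + sleepDurationMs;
}
```

In either variant the HWC simply sleeps for the computed duration and then sets the matting flag to the second character.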
Since the matting flag represents the start and stop of the CWB write-back function, the length of time for which the CWB write-back function is stopped can also be understood as the length of time for which the matting flag is set to the first character.
As described above, the start time of the CWB write-back function is not strictly fixed to a particular moment in the embodiment of the present application; other calculation methods may therefore also be adopted, as long as the start time of the CWB write-back function is guaranteed to fall before the integration start time. For example, the AP processor may simply use the sleep duration carried in the received first information as the stop duration of the CWB write-back function, ignoring the communication delay.
The above examples assume that the clocks of the AP processor and the SCP processor are aligned. If they are not aligned, the time difference between the two processors must additionally be taken into account in the times and durations obtained by the above calculations.
As mentioned above, the stop time of the CWB write-back function is at or after the time the AP processor receives the first information through the HWC. Therefore, after the CWB write-back function has been started, the HWC stops the currently started CWB write-back function at or after the time it next receives the first information sent by the SCP processor.
As can be appreciated in the manner described above, the turn-off time of the CWB write back function is after the integration ends. The starting time of the CWB write back function may be determined according to the sleep duration of the AP processor.
Referring to fig. 18, on the AP processor side, each time the electronic device refreshes an image, SurfaceFlinger sends the display parameters of the interface to the HWC (refer to step A1 in the embodiment shown in fig. 7), and the HWC obtains a synthesized image based on the display parameters. The HWC then queries the matting flag; if the matting flag indicates the enabled state, the HWC enables the CWB write-back function, and the AP processor may execute steps A1 to A6 in the technical architecture shown in fig. 7. That is, with the CWB write-back function enabled, the noise algorithm library can calculate the image noise and the backlight noise that occur while the CWB write-back function is active.
In this example, the matting flag may also be referred to as the write-back flag.
The matting flag in the above embodiments may take the form of an identifier. After the HWC receives the first information sent by the SCP processor, the HWC sets the identifier to a first character (e.g., 0, False); after waiting for the sleep duration, the HWC sets the identifier to a second character (e.g., 1, True). If the identifier is the first character (e.g., 0, False), the HWC controls the CWB write-back function to stop. If the identifier is the second character (e.g., 1, True), the HWC controls the CWB write-back function to start.
In this example, the first character may be recorded as a first mark and the second character as a second mark.
By querying the identifier, the HWC controls whether the CWB write-back function is enabled or stopped.
As an example, each time before performing step A2 in the technical architecture shown in fig. 7, the HWC may query whether the identifier is currently the first character or the second character. If it is the second character, indicating that the CWB write-back function is in the enabled state, the HWC transmits the matting information when executing step A2; after receiving the synthesized image and the matting information passed down by the HWC, the display subsystem stores the image to be displayed in the CWB memory. If it is the first character, indicating that the CWB write-back function is in the stopped state, the HWC does not transmit the matting information when executing step A2 (or transmits information indicating that no matting is needed); after receiving the synthesized image passed down by the HWC without the matting information (or with information indicating that no matting is needed), the display subsystem does not store the image to be displayed in the CWB memory, and the HWC does not retrieve the target image.
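A compressed sketch of this query, with invented names, might look as follows; the display-subsystem call stands in for step A2 of fig. 7:

```cpp
#include <atomic>

// true corresponds to the second character (CWB enabled),
// false to the first character (CWB stopped).
std::atomic<bool> g_mattingFlag{false};

struct ComposedFrame { /* handle to the synthesized image, omitted */ };

// Stub for the handoff to the display subsystem: when needsMatting is
// true, the display subsystem stores the to-be-displayed image in the
// CWB memory; otherwise it does not.
void sendToDisplaySubsystem(const ComposedFrame& frame, bool needsMatting) {
    (void)frame;
    (void)needsMatting;
}

void onFrameComposed(const ComposedFrame& frame) {
    // Before step A2, query the flag; attach the matting information
    // only while the CWB write-back function is enabled.
    sendToDisplaySubsystem(frame, g_mattingFlag.load());
}
```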
As an example, when the matting flag is the second mark, if the electronic device refreshes an image (which may be referred to as the fifth image), SurfaceFlinger transmits the display parameter of the interface (which may be referred to as the fourth display parameter) to the HWC, and after receiving the fourth display parameter the HWC may call the underlying hardware to synthesize the image. The HWC may transmit the synthesized image (which may be the fifth image, or an image whose processing yields the fifth image) to the display subsystem together with the matting information (which may be referred to as the third information). On receiving the fifth image and the third information, the display subsystem may store the fifth image, a partial image of the fifth image (which may be referred to as the sixth image), or a target image on the fifth image (which may be referred to as the third target image) in the CWB memory. The HWC obtains the target image from the CWB memory and sends it to the noise algorithm library. The noise algorithm library may derive image noise (which may be referred to as second image noise) from the target image.
As another example, when the matting flag is the first mark, if the electronic device refreshes an image (which may be referred to as the first image), SurfaceFlinger transmits the display parameter of the interface (which may be referred to as the fifth display parameter) to the HWC, and after receiving the fifth display parameter the HWC may call the underlying hardware to synthesize the image. The HWC does not transmit the matting information when transmitting the synthesized image (which may be the first image, or an image whose processing yields the first image) to the display subsystem. The display subsystem receives the first image but does not store the first image, a partial image of the first image (which may be referred to as the second image), or a target image on the first image (which may be referred to as the first target image) in the CWB memory. Accordingly, the HWC cannot obtain the target image from the CWB memory and does not send a target image to the noise algorithm library, and the noise algorithm library does not derive image noise from a target image.
As shown in fig. 18, on the SCP processor side, after the SCP processor is started, the driving of the ambient light sensor is initialized, and then the ambient light integration is started according to a preset acquisition period.
After the ambient light integration ends, the on/off-screen state can also be monitored: after the HWC on the AP processor side detects an on/off-screen event, the AP processor sends related information to the SCP processor to trigger the change of the on/off-screen state on the SCP processor side.
In the bright screen state, the SCP processor needs to send the acquired initial ambient light to the HWC of the AP processor, the HWC in the AP processor sends the initial ambient light to the noise algorithm library, and the noise algorithm library calculates a raw value of the target ambient light according to the received initial ambient light. The AP processor sends the raw value of the target ambient light to the ambient light memory of the SCP processor. Of course, in practical applications, the AP processor may also calculate the lux value of the target ambient light according to the raw value of the target ambient light. The AP processor sends the lux value of the target ambient light to the SCP processor.
Referring to fig. 18, in the bright screen state, the SCP processor needs to send the end time of the integration and the sleep duration to the AP processor.
The AP processor receives the integration end time and the sleep duration reported by the SCP processor. The HWC sets the matting flag to the first character, and while the matting flag is the first character, the HWC keeps the CWB write-back function stopped.
After receiving the first information, a matting thread in the HWC in the AP processor (a thread that executes the acquisition of the target image) sets the matting flag to the first character. It then calculates the duration for which it should sleep and calls a sleep function with that duration; for example, if the matting thread passes 270 ms to the sleep function, the matting thread sleeps for 270 ms. When the sleep ends, the matting thread sets the matting flag to the second character, and the CWB write-back function is started.
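The behavior of the matting thread can be pictured with a short sketch; the 270 ms figure is the example from the text, and the names are invented:

```cpp
#include <atomic>
#include <chrono>
#include <thread>

// true = second character (CWB enabled), false = first character (stopped).
std::atomic<bool> g_mattingFlag{true};

// Called on the matting thread each time the first information arrives.
void onFirstInformation(std::chrono::milliseconds sleepFor /* e.g. 270 ms */) {
    g_mattingFlag.store(false);             // set flag to first character: CWB stops
    std::this_thread::sleep_for(sleepFor);  // sleep for the computed duration
    g_mattingFlag.store(true);              // set flag to second character: CWB restarts
}
```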
The SCP processor also needs to calculate to obtain the lux value of the target ambient light from the raw value of the target ambient light. In addition, ambient light integration continues to be initiated at the beginning of the next integration.
When the screen is off, the initial ambient light collected by the ambient light sensor is the real ambient light, and the SCP processor no longer needs to report the lux value of the initial ambient light collected in the integration period to the AP processor. Since the CWB write-back function does not need to be enabled to pick up the associated noise in the off-screen state, the SCP processor also no longer computes the time for the next activation of the CWB write-back function.
Certainly, in some scenarios the electronic device still needs ambient light in the off-screen state. For example, during face unlocking with the screen off, the electronic device needs to know whether the current environment is dark, because the face must be supplemented with light in dark ambient light; the electronic device therefore needs the current lux value of the real ambient light. So even in the off-screen state, the ambient light sensor still collects ambient light, and when the SCP processor receives a face unlocking request issued by the AP processor, it reports the lux value of the collected ambient light to the AP processor, so that the AP processor can decide from the received lux value whether to supplement light.
The embodiment of the application focuses on how the HWC controls the start and stop of the CWB write back function in the AP processor. Other details not shown may refer to the description in any of the embodiments above.
As mentioned above, the start time and the stop time of the CWB write-back function in the AP processor are determined by the data reported by the SCP processor, and inter-core communication between the AP processor and the SCP processor may introduce data transmission delays. It is therefore possible to provide that after the AP processor determines that the display screen is lit, the HWC in the AP processor keeps the CWB write-back function normally open, and only after the HWC receives the first information reported by the SCP processor does it start to cycle the CWB write-back function on and off according to the start-stop scheme described in any embodiment above.
Taking the embodiment shown in fig. 9 as an example, if the start-stop method of the CWB write-back function shown in fig. 17 is adopted:
the HWC can obtain the target image at time t01, and the noise algorithm library also calculates the image noise at time t01;
the HWC can obtain the brightness value to be adjusted at time t02, and the noise algorithm library also calculates the backlight noise at time t02;
the HWC can obtain the target image at time t03, and the noise algorithm library also calculates the image noise at time t03;
the HWC can obtain the target image at time t04, and the noise algorithm library also calculates the image noise at time t04;
the HWC no longer obtains the target image at time t11, and the noise algorithm library does not calculate the image noise at time t11;
the HWC can obtain the brightness value to be adjusted at time t12, but the noise algorithm library does not calculate the backlight noise at time t12.
If the electronic device refreshes images at a frequency of 60 Hz and the CWB write-back function is kept normally open in a scenario where the electronic device plays a video, the HWC would acquire the target image about 18 times in the non-integration period (taking 300 ms as an example) of one acquisition cycle (taking 350 ms as an example), and the noise algorithm library would calculate and store image noise about 18 times.
By adopting the CWB write-back-function start-stop scheme shown in fig. 17, those 18 target-image acquisitions by the HWC and 18 image noise calculations by the noise algorithm library within one acquisition cycle (350 ms) can be eliminated. Obviously, the start-stop scheme shown in fig. 17 reduces power consumption.
However, in the embodiments shown in fig. 15 and fig. 16, suppose time t-1 is in the non-integration period of the previous acquisition cycle. Since the CWB write-back function is stopped during the non-integration period (steps A4 to A6 are no longer performed), the display subsystem no longer stores the image to be refreshed in the CWB memory; accordingly, the HWC does not obtain the target image at time t-1, and the noise algorithm library obtains neither the target image nor the image noise at time t-1. There is therefore no image noise at time t-1 in the noise memory. When the noise algorithm library then calculates the integral noise of each sub-period, it is missing the fusion noise that interferes with the initial ambient light from time t0 to time t01. Missing that fusion noise, the noise algorithm library uses the fusion noise from before time t-1 stored in the noise memory as the fusion noise interfering with the initial ambient light from time t0 to time t01, which makes the finally calculated target ambient light inaccurate.
To solve this problem, the CWB write-back function may be controlled to start before the start time of each integration period, and after it is started, the image is forcibly refreshed once. This ensures that the display subsystem stores the forcibly refreshed image in the CWB memory and that the HWC can obtain the corresponding target image from the CWB memory; correspondingly, the noise algorithm library also calculates the image noise corresponding to the moment of the forced refresh. The embodiment of the present application records the forcibly refreshed image as the third image.
As mentioned above, before the image is forcibly refreshed, the CWB write-back function is already enabled, that is, the matting flag is already set to the second mark, so the HWC module transmits the matting information (which may be recorded as the second information) together with the forcibly refreshed image when sending it to the display subsystem. Accordingly, what the display subsystem stores in the CWB memory may be the forcibly refreshed image, a partial image of it (recorded as the fourth image), or a target image (recorded as the second target image). As mentioned above, corresponding image noise (which may be referred to as first image noise) can be obtained from the target image.
An interface for forcing an image refresh is present in the HWC. When the HWC determines that a forced refresh is needed, it calls this interface, and the electronic device performs one forced image refresh. When the HWC calls the interface, a first signal is sent through it to SurfaceFlinger; on receiving the first signal, SurfaceFlinger takes the most recently cached display parameter from its cache of display parameters, recorded as the first display parameter. SurfaceFlinger sends this display parameter to the HWC module, the HWC calls the underlying hardware based on it to obtain a synthesized image (this image is the third image), and, seeing that the matting flag is the second mark, the HWC carries the matting information when sending the synthesized image to the display subsystem.
In practical applications, the most recently cached display parameter in SurfaceFlinger's cache is the display parameter that corresponded to the previous image refresh. If the image the electronic device refreshed before the forced refresh was the first image, the most recently cached display parameter may be the fifth display parameter used to generate the first image; the third image may then be the same as the first image. In other words, the image refreshed by the forced refresh may be the image currently displayed on the display screen (after the first image was last refreshed, the display screen has kept displaying the first image). The process of a forced refresh is the same as that of a normal refresh: both go through SurfaceFlinger, the HWC, the OLED driver, and the display subsystem for display. For the specific process, reference may be made to the description of the above embodiments, which is not repeated here.
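As an illustration only, the forced-refresh interaction might be condensed as below; the classes and method names are invented and merely mimic the roles of SurfaceFlinger and the HWC described above:

```cpp
// The "first signal" asks SurfaceFlinger to replay its most recently
// cached display parameters, so the currently displayed image is
// composed again and can be written back to the CWB memory.
struct DisplayParams { /* layer list, geometry, ... omitted */ };

class SurfaceFlingerStub {
public:
    void cacheParams(const DisplayParams& p) { latest_ = p; }
    DisplayParams onFirstSignal() const { return latest_; } // first display parameter
private:
    DisplayParams latest_{};
};

class HwcStub {
public:
    explicit HwcStub(SurfaceFlingerStub& sf) : sf_(sf) {}
    void forceRefresh() {
        const DisplayParams p = sf_.onFirstSignal();
        // The matting flag is the second mark here, so the composed image
        // (the third image) is sent down together with the matting information.
        composeAndSend(p, /*needsMatting=*/true);
    }
private:
    void composeAndSend(const DisplayParams& p, bool needsMatting) {
        (void)p; (void)needsMatting; // stub: compose via underlying hardware, then send
    }
    SurfaceFlingerStub& sf_;
};
```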
In the embodiment of the application, the purpose of forcibly refreshing the image is to display the image currently displayed on the display screen, and the image currently displayed on the display screen is the image refreshed last time on the display screen. In practical application, before the image to be displayed is sent to the display subsystem, a frame of image may be cached, and the frame of image may be understood as an image currently displayed on the display screen or an image refreshed on the display screen at the last time. The HWC retrieves the image from the cache and then passes the image down to the display subsystem along with the information that needs to be scratched. The display subsystem may store the image (or a region image of the image, or a target image corresponding to the image) in the CWB memory, and the HWC performs the step of retrieving the target image from the CWB memory.
As mentioned above, if the HWC needs to matte a refreshed image to obtain a target image, the HWC carries the matting information when passing the synthesized image down. If the HWC does not need to matte the image currently being refreshed, the HWC may omit the matting information (or carry information indicating that no matting is needed). The display subsystem uses whether the received image carries the matting information as the basis for storing it in the CWB memory: if the received image carries the matting information, steps A4 to A6 in the technical architecture shown in fig. 7 are performed; if it does not (or carries information indicating that no matting is needed), steps A4 to A6 in the technical architecture shown in fig. 7 are not performed.
The image is forced to be refreshed after the CWB write-back function is started, so the transferred data carries the information to be scratched when the AP processor executes steps a2 to A3 in the technical architecture shown in fig. 7.
Referring to fig. 19, there is provided a start-stop scheme for forcibly refreshing an image after a CWB write-back function is started at a first preset time before integration starts according to an embodiment of the present application. In this embodiment and the following embodiments, in order to facilitate drawing, the time when the CWB write-back function is activated and the time when the image is forcibly refreshed are set to be the same time. For drawing convenience, the stop time of the CWB write back function and the integral end time are set to be the same time, and in practical application, the stop time of the CWB write back function may be later than the integral end time.
As shown in fig. 19, at the time (t1n, t3n, t5n) corresponding to a first preset time (t2−t1n, t4−t3n, t6−t5n) before the integration start of each acquisition cycle, the CWB write-back function is started and the image is then forcibly refreshed once. This can equally be understood as: at the time (t1n, t3n, t5n) corresponding to a second preset time (t1n−t1, t3n−t3, t5n−t5) after the start of the non-integration period of each acquisition cycle, the CWB write-back function is started and the image is then forcibly refreshed once. The sum of the first preset time and the second preset time is the duration of the non-integration period.
Taking the first acquisition cycle as an example, at time t1n in the non-integration period of the first acquisition cycle (T1), the HWC in the AP processor controls the CWB write-back function to start and forces one image refresh after the start. The HWC can obtain the target image corresponding to the forcibly refreshed image at time t1n, the noise algorithm library can calculate the image noise at time t1n, and the noise algorithm library stores the image noise at time t1n in the noise memory. Other acquisition cycles follow this example and are not described again.
To verify that the start-stop scheme of the CWB write-back function shown in fig. 19 does not lose the fusion noise that interferes with the initial ambient light collected during the integration period, see the embodiment of fig. 20. In fig. 20, at time t1n in the non-integration period before the integration start time of the second acquisition cycle (T2), the HWC in the AP processor controls the CWB write-back function to start and forces one image refresh after the start. The HWC can obtain the target image corresponding to the forcibly refreshed image at time t1n, and the noise algorithm library can cache the target image at time t1n, calculate the image noise at time t1n, and store the image noise at time t1n in the noise memory.
From time t1n to the start of the integration period (t2), there is neither brightness adjustment nor image refresh.
Suppose there is only one brightness adjustment during the second acquisition cycle (T2): the brightness adjustment at time t21. The noise algorithm library may obtain the backlight noise at time t21 from the target image corresponding to the image refreshed at time t1n and the adjusted brightness at time t21. The noise algorithm library sends the backlight noise at time t21 to the noise memory.
After the integration period of the second acquisition cycle ends (time t3), the noise memory holds the image noise at time t1n and the backlight noise at time t21.
Referring to fig. 21, the integral noise interfering with the initial ambient light of the second acquisition cycle is:
for the duration from time t2 to time t21, the image noise at time t1n;
for the duration from time t21 to time t3, the backlight noise at time t21.
As can be understood from the embodiment shown in fig. 20, if a start-stop scheme that forces an image refresh before integration begins is used:
there is no image refresh between the moment of the forced refresh and the next integration start, yet the noise algorithm library can still obtain the fusion noise affecting the first sub-period of this integration period (from time t2 to time t21), namely the fusion noise at time t1n;
and if there is a brightness adjustment between the moment of the forced refresh and the next image refresh (the brightness adjustment at time t21), the target image corresponding to that brightness adjustment moment (the target image corresponding to time t1n) can be obtained, so the correct backlight noise corresponding to the brightness adjustment moment (time t21) is obtained.
When the electronic device plays a video through the display screen, the image displayed on the display screen may be refreshed at a frequency of 60 Hz, i.e., every 16.7 ms. The acquisition period of the ambient light sensor may be set to 350 ms, the integration period to 50 ms, and the non-integration period to 300 ms. The CWB write-back function is started only a first preset time before the integration period begins (e.g., t2−t1n = 20 ms), and the image is forcibly refreshed once after the CWB write-back function is started. Compared with keeping the function normally open, this saves about (300−20)/16.7 ≈ 16.8 target-image acquisitions by the HWC, and the corresponding image noise calculations by the noise algorithm library, in each acquisition cycle.
In the above embodiment, t2−t1n = 20 ms; in practice, t2−t1n may also take other duration values. The embodiment of the present application sets the duration corresponding to t2−t1n so as to ensure that the noise algorithm library can obtain the target image once and the image noise once before integration begins. The above embodiment can therefore reduce the power consumption of the processor while still obtaining accurate target ambient light.
In practical applications, when a display screen of an electronic device is on, the display screen may not be in a refresh state all the time, and may also be in an idle state for a long time.
When the display screen is lit, it is in one of two states: an idle state or a refresh state. In practical application, the time at which the display screen last refreshed an image can be obtained, and whether the display screen is currently in the refresh state or the idle state is judged from the difference between the current time and the time of the last image refresh. A threshold may be preset: when the difference between the current time and the time of the last image refresh is smaller than the threshold, the display screen is currently in the refresh state; when the difference is greater than or equal to the threshold, the display screen is currently in the idle state.
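This threshold test is trivial to state in code; here as a one-line helper with invented names:

```cpp
#include <cstdint>

// Refresh state if the last image refresh happened less than thresholdMs
// ago; idle state otherwise. The threshold is an empirical value.
bool isInRefreshState(int64_t nowMs, int64_t lastRefreshMs, int64_t thresholdMs) {
    return (nowMs - lastRefreshMs) < thresholdMs;
}
```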
The embodiments of the present application do not intend to strictly distinguish between the refresh state and the idle state; the point is only that when the image displayed on the display screen of the electronic device does not change for a long time (the idle state), the displayed image is always the image from the last refresh.
As another example, when a user views an interface of the electronic device and performs no operation for a long time while the interface contains no animation, the display screen is in the idle state before it is turned off. When the display screen of the electronic device plays a video, it may refresh the image at a frequency of 60 Hz and is then in the refresh state. In the embodiment of the present application, when the display screen is in the refresh state, the displayed image may or may not change. The displayed content can remain unchanged in the refresh state because the pre-refresh image obtained by the AP processor performing steps A1 to A3 in the technical architecture shown in fig. 7 is exactly the same as the post-refresh image obtained by performing those steps. The displayed image does not change in the idle state because the AP processor does not perform steps A1 to A3 in the technical architecture shown in fig. 7, and the display subsystem simply keeps sending, at the preset refresh frequency, the image most recently obtained by the AP processor performing steps A1 to A3 to the display screen of the electronic device.
Referring to fig. 22, a schematic diagram of a refresh state and an idle state provided in an embodiment of the present application is shown. In fig. 22, the display screen is always on.
In the TS0 period, the modules in the AP processor cooperate to synthesize image 1 to be refreshed on the display screen during the TS1 period.
In the TS1 period, the display subsystem sends image 1 to the display screen, and the display screen displays image 1, which the modules in the AP processor cooperatively synthesized during the TS0 period; at the same time, the modules in the AP processor cooperate to synthesize image 2, to be refreshed on the display screen in the TS2 period.
In the TS2 period, the display subsystem sends image 2 to the display screen, and the display screen displays image 2, which the modules in the AP processor cooperatively synthesized during the TS1 period; at the same time, the modules in the AP processor cooperate to synthesize image 3, to be refreshed on the display screen in the TS3 period.
In the TS3 period, the display subsystem sends image 3 to the display screen, and the display screen displays image 3, which the modules in the AP processor cooperatively synthesized during the TS2 period; at the same time, the modules in the AP processor cooperate to synthesize image 4, to be refreshed on the display screen in the TS4 period.
From the start time of the TS4 cycle, the display enters an idle state.
In the TS4 period, the display subsystem sends the image 4 to the display screen, the display screen displays the image 4 which is synthesized by matching each module in the AP processor in the TS3 period, and the AP processor does not synthesize the image to be refreshed any more.
During the TS5 period, the display subsystem sends image 4 to the display screen, which continues to display image 4 and the AP processor no longer synthesizes the image to be refreshed.
During the TS6 period, the display subsystem sends image 4 to the display screen, the display screen continues to display image 4, and the AP processor no longer synthesizes an image to be refreshed.
In the above flow, the periods TS0 to TS3 are the refresh state of the display screen, and the periods TS4 to TS6 are the idle state of the display screen. From the TS4 period onward, the electronic device performs no image refresh operation and the display screen enters the idle state. After the display screen enters the idle state, the display subsystem still sends image 4, the image most recently synthesized by the AP processor, to the display screen for display at the preset frequency (this frequency is the refresh frequency of the display screen); the displayed image (image 4) is the last image refreshed before the display screen switched to the idle state. Although the display subsystem keeps sending image 4 to the display screen at the preset frequency, the AP processor does not perform steps A1 to A2 of the technical architecture described in fig. 7.
Of course, in practical applications, the periods TS0 to TS4 may be recorded as the refresh state of the display screen, and the periods TS5 to TS6 may be recorded as the idle state of the display screen.
In the CWB write-back-enabled state, if the display screen is in the refresh state, the HWC can extract the target image corresponding to the currently refreshed image and, likewise, the corresponding image noise can be obtained. If the display screen is idle, then even with the CWB write-back function enabled, the AP processor does not perform the cooperative image synthesis of steps A1 to A3 in the embodiment shown in fig. 7. Accordingly, the AP processor does not perform steps A4 to A6 in the embodiment shown in fig. 7 either, and while the display screen is idle the noise algorithm library receives no target image and obtains no image noise.
Suppose the display screen is idle for a long time, e.g., 1 minute, during which it does not need to refresh the image. With a start-stop scheme that forces one image refresh before each integration start of the ambient light sensor, a forced refresh would be required every 350 ms throughout that minute, i.e., about 60000 ms / 350 ms ≈ 171.4 additional image refreshes in 1 minute. When the display screen is idle for a long time, this undoubtedly increases power consumption again.
To understand more clearly why the start-stop scheme that forces an image refresh before each integration start can increase the processor's power consumption when the display screen is idle for a long time, fig. 23 gives an example.
Referring to fig. 23, the image is refreshed at time t01 in the integration period of the 1st acquisition cycle; correspondingly, the noise algorithm library stores the image at time t01 and the image noise at time t01.
Referring to fig. 23, after time t01 the display screen is refreshed again only at time t(2M)1 in the integration period of the (M+1)-th acquisition cycle; this example ignores brightness adjustment.
Referring to fig. 23, after the CWB write-back function is enabled at time t1n before the integration period of the 2nd acquisition cycle, the image is forcibly refreshed once (for convenience of description, the forced-refresh time and the CWB enable time fall within the same time unit, e.g., within 1 ms). The AP processor performs steps A4 to A6, and the noise algorithm library obtains the target image at time t1n and the image noise at time t1n.
Referring to fig. 24, the integral noise for the integration period of the 2nd acquisition cycle is: the image noise at time t1n, lasting the whole integration duration.
Referring to fig. 23, after the CWB write-back function is started at time t3n before the integration period of the 3rd acquisition cycle, the image is forcibly refreshed once, the AP processor performs steps A4 to A6 once, and the noise algorithm library obtains the target image at time t3n and the image noise at time t3n.
Referring to fig. 24, the integral noise for the integration period of the 3rd acquisition cycle is: the image noise at time t3n, lasting the whole integration duration.
……
Referring to fig. 23, after the CWB write-back function is started at time t(2M-1)n before the integration period of the (M+1)-th acquisition cycle, the image is forcibly refreshed once, the AP processor performs steps A4 to A6 once, and the noise algorithm library obtains the target image at time t(2M-1)n and the image noise at time t(2M-1)n.
Referring to fig. 23, the image is refreshed at time t(2M)1 in the integration period of the (M+1)-th acquisition cycle, the AP processor performs steps A1 to A6 once, and the noise algorithm library obtains the target image at time t(2M)1 and the image noise at time t(2M)1.
Referring to fig. 24, the integral noise for the integration period of the (M+1)-th acquisition cycle is: for the duration from t2M to t(2M)1, the image noise at time t(2M-1)n; and for the duration from t(2M)1 to t2M+1, the image noise at time t(2M)1.
If the start-stop scheme of the embodiment shown in fig. 19 is followed, then in the embodiments shown in fig. 23 and fig. 24, from time t0 to time t2M (M acquisition cycles), the image is forcibly refreshed M times in total.
Suppose instead the image is not forcibly refreshed after the CWB write-back function is turned on (at t1n, t3n, …, t(2M-1)n). Then, referring to fig. 25, the integral noise for the 2nd acquisition cycle is: the image noise at time t01, lasting the integration duration; the integral noise for the 3rd acquisition cycle is: the image noise at time t01, lasting the integration duration; …; and the integral noise for the (M+1)-th acquisition cycle is: for the duration from t2M to t(2M)1, the image noise at time t01, and for the duration from t(2M)1 to t2M+1, the image noise at time t(2M)1.
As mentioned above, the process of forcing an image refresh does not change the image displayed on the display screen: the image refreshed at time t01 is the same as the image refreshed at each forced-refresh moment, and correspondingly the target image at time t01 and the target image at a forced-refresh moment are also the same. If brightness adjustment is ignored, the image noise at time t01 and the image noise at a forced-refresh moment are likewise the same. If there is a brightness adjustment, the target image used at the brightness adjustment moment is unchanged. It is therefore unnecessary to force an image refresh in some scenarios.
As can be understood from the above analysis, when the display screen is in the idle state for a long time, image noise that may interfere with the integration period may not be lost even if the image is not forcibly refreshed.
Of course, in the above embodiment, if time t01 fell in the non-integration period of the first acquisition cycle, the noise algorithm library would not obtain the target image and image noise at time t01, and the image would then have to be forcibly refreshed at time t1n.
In combination with the above various embodiments, the embodiments of the present application provide the technical solution shown in fig. 26. The embodiment shown in fig. 26 comprises the following steps:
At step 2601, the HWC in the AP processor starts the CWB write-back function at a first preset time before integration starts, and checks the time at which the image was last refreshed on the display screen.
In the embodiment of the application, whether the display screen needs to be forcibly refreshed or not, the CWB write-back function needs to be started at a first preset time before the integration starts, and then other factors are combined to determine whether the image needs to be forcibly refreshed or not.
For convenience of description, referring to fig. 27, the time corresponding to a first preset time before the integration start of one acquisition cycle (T2) is selected as the reference time, namely t3n. The embodiment of the present application starts the CWB write-back function at the time (t3n) corresponding to the first preset time before integration starts, and checks the time at which the image was last refreshed on the display screen.
For convenience of description, the time at which the image was last refreshed on the display screen may be recorded as tk.
As an example, when the last image refresh is no longer within the current non-integration period, the electronic device does not force a refresh; it simply waits for an upper-layer application to transmit the display parameters of the interface through the display engine service, SurfaceFlinger, the HWC, and so on. For the HWC, this means waiting for a display parameter sent by SurfaceFlinger (which may be recorded as the second display parameter).
The HWC module receives the display parameters sent by the SurfaceFlinger module of the electronic device and stores the time at which each display parameter was received, as well as the sixth display parameter.
The HWC module can thus obtain the time at which the electronic device last refreshed the image, which may be the time at which it last received a display parameter from SurfaceFlinger. The sixth display parameter may be taken to be the last display parameter the HWC module acquired before that time; accordingly, the time of the last image refresh is the time at which the HWC module received the sixth display parameter.
Step 2602, if the time of the last image refresh on the display screen is within the current non-integration period, wait a second preset time (which may be recorded as the second duration).
In the embodiment of the present application, if the time difference between the last image refresh and the current time is greater than a difference threshold, the display screen can be understood to have entered the idle state; if the difference is less than or equal to the difference threshold, the display screen can be understood to have not yet entered the idle state. The difference threshold may be determined from empirical values.
The focus of the embodiment of the present application is to obtain the time of the last image refresh (or, when the HWC performs matting, the start time of the matting performed on the last refresh) and to use it to decide whether the image needs to be forcibly refreshed.
The focus of the embodiment of the present application is not to determine the current state of the display screen; the state of the display screen merely helps explain why power consumption increases in the idle state in the above embodiments.
Referring to the embodiment shown in fig. 27, if the time tk at which the image was last refreshed is within the current non-integration period (between t3 and t3n), then the image displayed on the display screen at time t3n is the image refreshed at time tk. Time tk lies between time t3 and time t3n, the period during which the CWB write-back function is stopped; that is, the HWC did not acquire the target image at time tk, and correspondingly the noise algorithm library did not obtain the image noise at time tk either. If the image is not forcibly refreshed at this point, the following may occur:
(1) After time t3n, the first change of the content displayed on the display screen (image refresh or brightness adjustment) is a brightness adjustment at time tb.
If tb is between time t3n and time t4, and there is no image refresh or brightness adjustment between tb and t4: the backlight noise at time tb interferes with the initial ambient light of the integration period corresponding to times t4 to t5. When the backlight noise at time tb is calculated, the latest target image buffered by the HWC is not the target image at time tk but a target image from before time tk, so the calculated backlight noise at time tb is wrong, and the target ambient light calculated by the noise algorithm library for the integration period corresponding to times t4 to t5 is inaccurate.
If tb is at time t4: the backlight noise at time tb interferes with the initial ambient light of the integration period corresponding to times t4 to t5. When the backlight noise at time tb is calculated, the latest target image buffered by the HWC is not the target image at time tk, so the calculated backlight noise at time tb is wrong, and the target ambient light calculated by the noise algorithm library for the integration period corresponding to times t4 to t5 is inaccurate.
If tb is between time t4 and time t5: the initial ambient light between time t4 and time tb is interfered with by the image noise at time tk, but the HWC did not obtain the image noise at time tk, so during integration a fusion noise from before time tk (possibly backlight noise, possibly image noise) is used as the fusion noise interfering with the initial ambient light between times t4 and tb, and the target ambient light calculated by the noise algorithm library for the integration period corresponding to times t4 to t5 is inaccurate. In addition, the backlight noise at time tb also interferes with the initial ambient light of that integration period; because the latest target image buffered by the HWC is not the target image at time tk, the calculated backlight noise is wrong, again making the target ambient light calculated by the noise algorithm library for the integration period corresponding to times t4 to t5 inaccurate.
(2) After time t3n, the first change of the content displayed on the display screen (image refresh or brightness adjustment) is an image refresh at time tb. If tb is between time t4 and time t5: the initial ambient light between time t4 and time tb is interfered with by the image noise at time tk, but the noise algorithm library did not calculate the image noise at time tk. When calculating the target ambient light for times t4 to t5, the noise algorithm library uses a fusion noise from before time tk stored in the noise memory as the fusion noise interfering with the first sub-period of the integration period, so the target ambient light calculated for the integration period corresponding to times t4 to t5 is inaccurate.
It can be understood from the above analysis that if the time of the last image refresh on the display screen is within the current non-integration period, the image needs to be forcibly refreshed to obtain the target image corresponding to the image currently displayed on the display screen and the image noise corresponding to that target image; these can of course also be understood as the target image and image noise at time t3n.
As another embodiment of the present application, if the time when the image is refreshed on the display screen last time is not within the current non-integration time period, the image is not forced to be refreshed.
In the embodiment of the present application, if the time tk at which the image was last refreshed on the display screen is not between time t3 and time t3n, it may be within the integration period of the current acquisition cycle, or within the previous or an earlier acquisition cycle.
If tk is within the integration period of the current acquisition cycle, then, since the CWB write-back function is started during that integration period, the HWC can obtain the target image at time tk and the noise algorithm library can also obtain the target image and the image noise at time tk; the image therefore does not need to be forcibly refreshed.
If tk is within the previous acquisition cycle or an earlier one, then the procedure of the embodiment shown in fig. 26 was already executed during that cycle, so whether to force a refresh need not be considered. The present application subsequently verifies, by means of fig. 28 to fig. 30, that this case (tk in the previous or an earlier acquisition cycle) need not be a concern for forced refreshing; reference is made specifically to the description of fig. 28 to fig. 30.
In the embodiment of the present application, the method of judging whether the time tk at which the image was last refreshed is within the non-integration period of the current cycle may refer to the manner shown in fig. 27.
The first method: compare the magnitudes of T22 (t3n − tk) and T21 (t3n − t3). If T22 (t3n − tk) is less than T21 (t3n − t3), the time of the last image refresh is within the current non-integration period; otherwise it is not. In this embodiment, T22 may be recorded as the first difference and T21 as the second difference.
The second method: compare the magnitudes of tk and t3. If tk is greater than t3 and less than t3n, the time of the last image refresh is within the current non-integration period; otherwise it is not. In the embodiment of the present application, if an image refresh occurs at time t3, the CWB write-back function has not yet stopped at that time, so the HWC can retrieve the target image at time t3 and the noise algorithm library can also acquire the target image and the noise at time t3. Therefore, the case where T22 (t3n − tk) equals T21 (t3n − t3) (equivalently, where tk equals t3) may be treated as the last refresh falling within the integration period.
The third method: check whether the difference between the time of the last image refresh and the time of the last matting is less than a certain threshold (because there may be a gap between the image refresh time and the time at which the HWC starts to fetch the target image). If it is less than the threshold, the last refreshed image has already been matted by the HWC, indicating that the last refresh is not within the current non-integration period. If it is greater than or equal to the threshold, the last refreshed image has not been matted by the HWC, indicating that the last refresh is within the current non-integration period. The threshold is set according to the actual situation and may be recorded as the first threshold in this embodiment.
In the embodiment of the present application, the time at which the HWC acquires the display parameters of the interface from SurfaceFlinger may be taken as the time of this image refresh, the time at which the HWC obtains the synthesized image through the underlying hardware may be taken as the time of this image refresh, or the time at which the display subsystem sends the image for display may be taken as the image refresh time. Whichever is chosen, there may be a slight difference, e.g., 0.5 ms, 0.8 ms, or 1 ms, between the time of this image refresh and the time at which the HWC starts to matte the refreshed image; of course, the two times may also be equal. Between adjacent image refreshes, the times usually differ by one refresh period: taking a refresh frequency of 60 Hz as an example, the refresh period is 1000 ms / 60 ≈ 16.7 ms; taking 120 Hz as an example, it is 1000 ms / 120 ≈ 8.3 ms. The threshold in this example can therefore be a value that is small relative to the refresh period, e.g., 2 ms.
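The three checks can be sketched as follows; the names are invented, and method 3 assumes the last-matting timestamp trails the matched refresh by less than the first threshold:

```cpp
#include <cstdint>

// Method 1: compare T22 = t3n - tk (first difference) with
// T21 = t3n - t3 (second difference).
bool inNonIntegrationByDifference(int64_t t3n, int64_t tk, int64_t t3) {
    return (t3n - tk) < (t3n - t3);
}

// Method 2: compare tk directly against the period boundaries.
bool inNonIntegrationByBounds(int64_t tk, int64_t t3, int64_t t3n) {
    return tk > t3 && tk < t3n;
}

// Method 3: if the gap between the last refresh and the last matting is
// below the first threshold (e.g. 2 ms), the last refreshed image was
// already matted, i.e. the refresh was NOT in the non-integration period.
bool inNonIntegrationByMatting(int64_t lastRefreshMs,
                               int64_t lastMattingMs,
                               int64_t firstThresholdMs) {
    return (lastRefreshMs - lastMattingMs) >= firstThresholdMs;
}
```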
Step 2603, if the electronic device refreshes the image while waiting for the second preset time, the electronic device does not force a refresh of the image.
When the electronic device refreshes the image, the HWC receives the display parameter sent by SurfaceFlinger (this display parameter is recorded as the fourth display parameter), and the currently refreshed image may be recorded as the fifth image.
Step 2603', if the electronic device does not refresh the image while waiting for the second preset time period, the image is forcibly refreshed.
In the embodiment of the present application, if the electronic device is refreshing images continuously, there will be a refresh action within the second preset time period (e.g., 17 ms) and the HWC will already have acquired the latest target image. Deferring the decision on whether to force a refresh by the second preset time therefore avoids an unnecessary forced refresh and further reduces power consumption.
The HWC module waits for the second preset time period; if it does not receive a display parameter sent by SurfaceFlinger (which may be recorded as the third display parameter) within that time, the image needs to be forcibly refreshed.
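A sketch of the overall decision in steps 2601 to 2603', condensed with invented names and stubbed side effects:

```cpp
#include <chrono>
#include <cstdint>

struct Timing {
    int64_t lastRefreshMs;          // time of the last image refresh (tk)
    int64_t nonIntegrationStartMs;  // start of the current non-integration period (t3)
};

// Stub: would block for up to `window` and report whether a normal refresh arrived.
bool waitForRefresh(std::chrono::milliseconds window) { (void)window; return false; }
void startCwbWriteBack() {}  // stub: set the matting flag to the second character
void forceRefreshImage() {}  // stub: call the HWC's forced-refresh interface

// Runs a first preset time before each integration start.
void onFirstPresetTimeBeforeIntegration(const Timing& t) {
    startCwbWriteBack();  // step 2601
    const bool lastRefreshInNonIntegration =
        t.lastRefreshMs > t.nonIntegrationStartMs;  // simplified form of method 2
    if (lastRefreshInNonIntegration) {
        // Step 2602: wait a second preset time (17 ms is the example above).
        if (!waitForRefresh(std::chrono::milliseconds(17))) {
            forceRefreshImage();  // step 2603': no refresh arrived during the wait
        }
        // Step 2603: otherwise a normal refresh arrived, no forced refresh needed.
    }
}
```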
In the following, three examples (all assuming the HWC performs matting every time the display screen refreshes an image) are used to verify that, when the display screen is in the idle state for a long time (in the above embodiment, when tk is in the previous acquisition cycle or an earlier one), all image noise and backlight noise interfering with the initial ambient light acquired during the integration period can still be obtained.
Referring to fig. 28, the image was last refreshed during the integration period of the previous acquisition cycle and is never refreshed again. The previous acquisition cycle is the first acquisition cycle and the current acquisition cycle is the second acquisition cycle. In this example, the CWB write-back function is started at each of times t1n, t2n, and t3n.
In the first acquisition cycle, the ambient light sensor starts to collect the initial ambient light at time t0, and at time t0 the CWB write-back function has already been enabled. The image is refreshed for the last time at time tk, and the noise algorithm library obtains the image noise at time tk.
At time t1n, the time of the last image refresh is not within the current non-integration period. The image displayed at time t1n is the image displayed at time tk; ignoring backlight adjustment, the image noise at time t1n is the image noise at time tk. Since the noise algorithm library has already acquired the image noise at time tk, the image is not forcibly refreshed at time t1n.
In the second acquisition cycle, there is no image refresh and brightness adjustment is ignored. After the integration of the second acquisition cycle ends, the fusion noises interfering with the integration period are all the image noise at time tk.
At time t3n, the time of the last image refresh is not within the current non-integration period. The image displayed at time t3n is the image displayed at time tk; ignoring backlight adjustment, the image noise at time t3n is the image noise at time tk. Since the HWC has already obtained the image noise at time tk, the image is not forcibly refreshed at time t3n.
In the third acquisition cycle, there is no image refresh and brightness adjustment is ignored. Processing continues according to the processing procedure in the second acquisition cycle.
As this example shows, if the last image refresh occurs in the integration period of the last acquisition cycle and the image is never refreshed again, the image noise interfering with each integration period can be obtained without forcibly refreshing the image.
Referring to FIG. 29, the last time the image was refreshed on the display screen falls within the non-integration period of the previous acquisition cycle, before time t1n, and the image is never refreshed again. The last acquisition cycle is the first acquisition cycle, and the current acquisition cycle is the second acquisition cycle. In this example, the CWB write-back function is enabled at each of times t1n, t2n, and t3n.
In the first acquisition cycle, the ambient light sensor starts to collect the initial ambient light at time t0, and at t0 the CWB write-back function has already been enabled. The display screen refreshes the image for the last time at time tk; at that moment the CWB write-back function is in a stopped state, so the HWC does not obtain the image noise at time tk.
At time t1n, the moment of the last image refresh on the display screen falls within the non-integration time period, so the image needs to be forcibly refreshed (the forced refresh becomes the new last-refresh moment tk') in order to obtain the image noise at time t1n.
Of course, in practical applications, even after a decision is made to forcibly refresh the image, the method may first wait for a certain time period. If the display screen refreshes the image at its refresh frequency during that wait, the image does not need to be forcibly refreshed; if no refreshed image is observed after the wait, the image may then be forcibly refreshed. This example assumes a forced refresh is performed.
In the second acquisition cycle, there is no image refresh, and brightness adjustment is ignored. After the integration of the second acquisition cycle finishes, the fusion noise interfering with the integration time period of the second acquisition cycle is entirely the image noise at time t1n.
At time t3n, the moment of the last image refresh on the display screen is not within the non-integration time period. The image displayed on the display screen at time t3n is the image displayed at time t1n; ignoring backlight adjustment, the image noise at time t3n is the image noise at time t1n. Since the noise algorithm library has already acquired the noise at time t1n, the image is not forcibly refreshed at time t3n.
In the third acquisition cycle, there is no image refresh and brightness adjustment is ignored. Processing continues according to the processing procedure in the second acquisition cycle.
As this example shows, if the last image refresh on the display screen occurs in the non-integration period of the last acquisition cycle before the CWB write-back function is started, and the image is never refreshed again, the image noise that interferes with the initial ambient light of each integration period can be obtained by forcibly refreshing the image once.
Referring to FIG. 30, the last time the image was refreshed on the display screen falls within the non-integration period of the previous acquisition cycle, after time t1n and before time t2, and the image is never refreshed again. The last acquisition cycle is the first acquisition cycle, and the current acquisition cycle is the second acquisition cycle. In this example, the CWB write-back function is enabled at each of times t1n, t2n, and t3n.
In the first acquisition cycle, the CWB write-back function is enabled at time t1n.
At time tk, the display screen refreshes the image, so the noise algorithm library can acquire the image noise of the image displayed at time tk.
In the second acquisition cycle, there is no image refresh, and brightness adjustment is ignored. After the integration of the second acquisition cycle finishes, the fusion noise interfering with the integration time period is entirely the image noise at time tk.
At time t3n, the moment of the last image refresh on the display screen is not within the non-integration time period. The image displayed on the display screen at time t3n is the image displayed at time tk; ignoring backlight adjustment, the image noise at time t3n is the image noise at time tk. Since the noise algorithm library has already acquired the noise at time tk, the image is not forcibly refreshed at time t3n.
In the third acquisition cycle, there is no image refresh and brightness adjustment is ignored. Processing continues according to the processing procedure in the second acquisition cycle.
As this example shows, if the last image refresh on the display screen occurs after the CWB write-back function is started in the non-integration period of the last acquisition cycle, and the image is never refreshed again, the image noise that interferes with the initial ambient light of each integration period can be obtained without forcibly refreshing the image.
As can be understood from the examples of fig. 28 to 30: if the moment of the last image refresh on the display screen is not within the non-integration time period, the image does not need to be forcibly refreshed.
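One way to encode this rule, sketched below in C++ under the assumption that what distinguishes FIG. 29 from FIGS. 28 and 30 is whether the last refresh happened while the CWB write-back function was stopped (the type names and millisecond timestamps are hypothetical):

```cpp
// Sketch only: force a refresh only when the last display refresh landed in
// the window where CWB write-back was stopped, so no target image for that
// refresh was ever captured.
#include <cstdint>

struct CwbStoppedWindow {
    int64_t startMs;  // CWB write-back stopped
    int64_t endMs;    // CWB write-back re-enabled
};

bool needsForcedRefresh(int64_t lastRefreshMs, const CwbStoppedWindow& w) {
    // A refresh inside this window was never written back, so the noise
    // algorithm library lacks the matching image noise (the FIG. 29 case).
    return lastRefreshMs >= w.startMs && lastRefreshMs < w.endMs;
}
```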
By adopting the flow shown in fig. 28, the embodiment of the present application, while reducing the power consumption of the processor, avoids failing to obtain the image noise that interferes with the integration time period, avoids failing to obtain the target image needed when computing the backlight noise that interferes with the integration time period, and also avoids negative gain arising when the display screen stays in an idle state for a long time.
As described above, the HWC in the AP processor may monitor changes of the data in the kernel node in both the integration period and the non-integration period; when the data stored in the kernel node changes, the HWC acquires the brightness to be adjusted from the kernel node and transmits it to the noise algorithm library, which calculates the backlight noise.
In practice, the method can be performed as follows.
While the CWB write-back function is stopped, when the HWC monitors a change of the data in the kernel node, the HWC may still obtain the brightness to be adjusted; however, it no longer transmits the to-be-adjusted brightness value to the noise algorithm library, and accordingly the noise algorithm library no longer calculates the corresponding backlight noise.
When the CWB write-back function is about to be enabled (e.g., 1 ms before it is enabled) or has just been enabled, if the HWC did not monitor any change of the data stored in the kernel node while the CWB write-back function was stopped, this indicates that the brightness value of the display screen has not changed. The HWC then performs the start-stop method of the CWB write-back function provided in any of the embodiments described above.
When the CWB write-back function is about to be enabled or has just been enabled, if the HWC monitored a change of the data stored in the kernel node while the CWB write-back function was stopped, this indicates that the brightness value of the display screen has changed. The HWC needs to send the latest monitored brightness value to the noise algorithm library, and then performs the start-stop method of the CWB write-back function provided in any of the above embodiments. If the brightness was adjusted multiple times while the CWB write-back function was stopped, the noise algorithm library only needs the value after the last adjustment; that is, the HWC sends the to-be-adjusted brightness corresponding to the most recent brightness change of the display screen to the noise algorithm library. This prevents the noise algorithm library from frequently calculating backlight noise for every to-be-adjusted brightness, thereby reducing power consumption.
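The following C++ sketch (hypothetical names; a simplification of the behaviour described above that ignores thread safety) coalesces brightness changes while CWB write-back is stopped and forwards only the latest value when write-back is about to restart:

```cpp
// Sketch only: coalesce brightness updates seen while CWB write-back is off.
#include <cstdint>
#include <optional>

class BrightnessCoalescer {
public:
    // Kernel-node listener: called on every brightness change.
    void onKernelNodeChanged(int32_t brightness) {
        if (cwbStopped_) {
            pending_ = brightness;           // remember only the latest value
        } else {
            sendToNoiseLibrary(brightness);  // normal path while CWB is on
        }
    }

    void onCwbStopping() { cwbStopped_ = true; }

    // Called when the CWB write-back function is about to be (re-)enabled.
    void onCwbStarting() {
        cwbStopped_ = false;
        if (pending_) {                      // brightness changed while stopped
            sendToNoiseLibrary(*pending_);   // forward only the last adjustment
            pending_.reset();
        }
    }

private:
    void sendToNoiseLibrary(int32_t brightness) {
        // Placeholder for the real hand-off to the noise algorithm library.
        (void)brightness;
    }

    bool cwbStopped_ = false;
    std::optional<int32_t> pending_;
};
```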
As one example, after the HWC module sets the matting flag to the first character, the HWC module monitors whether data in a kernel node of the electronic device changes, wherein the kernel node stores a brightness value;
in response to a change of the data in the kernel node of the electronic device, the HWC module acquires a first brightness value from the kernel node;
after the HWC module acquires the first brightness value from the kernel node, in response to a further change of the data in the kernel node of the electronic device, the HWC module acquires a second brightness value from the kernel node;
in response to reaching a first time, the HWC module sends the second brightness value to the noise algorithm library.
When calculating the image noise corresponding to the forcibly refreshed image, the noise algorithm library calculates the first image noise based on the target image corresponding to the forcibly refreshed image and the second brightness value.
After the HWC module sets the matting flag to the second character, the HWC module monitors whether data in the kernel node of the electronic device changes, wherein the kernel node stores a brightness value;
in response to a change of the data in the kernel node of the electronic device, the HWC module acquires a third brightness value from the kernel node;
the HWC module sends the third brightness value to the noise algorithm library;
after the HWC module sends the third brightness value to the noise algorithm library, in response to a change of the data in the kernel node of the electronic device, the HWC module acquires a fourth brightness value from the kernel node;
the HWC module sends the fourth brightness value to the noise algorithm library.
In the above embodiments, the HWC may or may not force the image to be refreshed.
If the HWC forcibly refreshes the image, the image noise corresponding to the forcibly refreshed image is calculated from the brightness value most recently passed to the noise algorithm library and the target image corresponding to the forcibly refreshed image. The backlight noise corresponding to the brightness adjustment moment then does not interfere with the initial ambient light collected in the next integration time period. The fusion noise interfering with the initial ambient light collected in the next integration time period may be the image noise corresponding to the forcibly refreshed image; in this case the value of the image noise is correct, so no error arises in the target ambient light of the next integration period.
If the HWC does not force a refresh, this indicates that the noise algorithm library already stores the target image of the image currently displayed on the display screen (the latest frame of target image stored by the noise algorithm library).
If the brightness adjustment moment is earlier than the refresh moment of the image currently displayed on the display screen, the backlight noise corresponding to the brightness adjustment moment does not interfere with the initial ambient light collected in the next integration time period. The fusion noise interfering with the initial ambient light collected in the next integration period may be the image noise corresponding to the image currently displayed on the display screen; the value of that image noise is correct, so no error arises in the target ambient light of the next integration period.
If the brightness adjustment moment is later than the refresh moment of the image currently displayed on the display screen, the image noise corresponding to that refresh moment cannot interfere with the initial ambient light collected in the next integration time period. The fusion noise interfering with the initial ambient light collected in the next integration period may then be the backlight noise at the latest brightness adjustment moment, which is generated from the latest target image acquired from the display screen and the most recently adjusted brightness value.
Thus, whenever the CWB write-back function is enabled, and whether or not the image is forcibly refreshed, if the HWC detects a brightness change while the CWB write-back function is stopped, the HWC sends the latest monitored brightness value to the noise algorithm library when the CWB write-back function is about to be started, the noise algorithm library calculates the backlight noise, and the HWC continues to perform the start-stop method of the CWB write-back function provided by any of the embodiments described above.
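The case analysis above reduces to comparing two timestamps; a minimal sketch, assuming millisecond timestamps and hypothetical type names:

```cpp
// Sketch only: which noise interferes with the next integration period
// depends on whether the last image refresh or the last brightness
// adjustment happened later.
#include <cstdint>

enum class FusionNoiseSource { kImageNoise, kBacklightNoise };

FusionNoiseSource selectFusionNoise(int64_t lastRefreshMs,
                                    int64_t lastBrightnessAdjustMs) {
    // Brightness adjusted after the last refresh: backlight noise, built from
    // the latest target image and the newly adjusted brightness, applies.
    if (lastBrightnessAdjustMs > lastRefreshMs) {
        return FusionNoiseSource::kBacklightNoise;
    }
    // Otherwise the image noise of the currently displayed image applies.
    return FusionNoiseSource::kImageNoise;
}
```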
As another embodiment of the present application, while the CWB write-back function is enabled, the HWC may retrieve the target image from the CWB write-back memory once every other frame.
As an example, when the display screen is refreshed at a frequency of 90 Hz, i.e., the image is refreshed every 1000 ms/90 ≈ 11.11 ms, the HWC fetching the target image from the CWB write-back memory every other frame proceeds as follows:
When the electronic device refreshes the image for the i-th time (taking the case where the i-th refreshed image is matted as an example), after the HWC obtains the synthesized image, it checks that the matting flag is the second character, determines that the currently refreshed image is a matting frame, and continues to execute the subsequent steps according to the above embodiment.
When the image is refreshed for the (i+1)-th time, after the HWC obtains the synthesized image, it checks the matting flag, which is the second character (while the CWB write-back function is enabled, the matting flag is the second character). The HWC obtains the time difference between the moment it last determined a matting frame (the moment the i-th refreshed image was determined to be a matting frame) and the current moment. The time difference is smaller than the matting-frame difference threshold (which may be 11.11 ms or another value, e.g., 11.5 ms, 11.8 ms, or 12 ms), indicating that the image refreshed immediately before the (i+1)-th refresh was already a matting frame, so the (i+1)-th refreshed image is not matted.
When the image is refreshed for the (i+2)-th time, after the HWC obtains the synthesized image, it checks that the matting flag is the second character (while the CWB write-back function is enabled, the matting flag is the second character). The HWC obtains the time difference between the moment it last determined a matting frame (the i-th refreshed image was determined to be a matting frame) and the current moment; the time difference is greater than or equal to the matting-frame difference threshold (which may be 11.11 ms or another value, e.g., 11.5 ms, 11.8 ms, or 12 ms), indicating that the image refreshed for the (i+2)-th time is a matting frame.
In the above example, the time difference is the difference between the moment the HWC last determined a matting frame and the current moment. In practice, the difference between the moment the HWC last transmitted the synthesized image (carrying the to-be-matted information) to the OLED driver and the current moment may be used instead, or the difference between the moment the HWC last started matting and the current moment. These ways of obtaining the time difference are only examples; in practical applications other ways may be used, and different ways of obtaining the time difference correspondingly lead to different matting-frame difference thresholds.
As an example, the interval between two image refreshes is theoretically 11.1 ms, and the current moment is the moment at which the HWC checks that the matting flag is the second character. If the time difference is the difference between the moment a matting frame was last determined and the current moment, it is theoretically 11.1 ms (the previous frame was a matting frame) or 22.2 ms (the previous frame was not a matting frame); thus the matting-frame difference threshold may be any value between 11.1 and 22.2 ms. If the time difference is the difference between the moment the HWC last transmitted the synthesized image to the OLED driver and the current moment, it is theoretically (11.1 − t) ms (the previous frame was a matting frame) or (22.2 − t) ms (the previous frame was not), where t is the difference between the moment the HWC determines a matting frame and the moment it transmits the synthesized image to the OLED driver; thus the matting-frame difference threshold may be any value between (11.1 − t) and (22.2 − t) ms.
If the display screen is refreshed at a 120 Hz rate, the 11.1 ms in the above example would need to be changed to 1000 ms/120 ≈ 8.3 ms. Thus, in inter-frame matting, the matting-frame difference threshold is also related to the current refresh frequency of the display screen.
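A short C++ sketch of this every-other-frame decision (illustrative only; the threshold of 1.5 refresh periods is one assumption that lands between the one-period and two-period gaps discussed above):

```cpp
// Sketch only: a refreshed frame becomes a matting frame when enough time has
// passed since the last matting frame; otherwise it is skipped.
#include <cstdint>

class InterFrameMatting {
public:
    explicit InterFrameMatting(double refreshHz)
        : thresholdMs_(1.5 * 1000.0 / refreshHz) {}  // midpoint of 1x and 2x gaps

    // Called once per refreshed frame while the matting flag is set.
    bool isMattingFrame(int64_t nowMs) {
        if (nowMs - lastMattingMs_ >= thresholdMs_) {
            lastMattingMs_ = nowMs;  // this frame gets matted
            return true;
        }
        return false;                // previous frame was already a matting frame
    }

private:
    double thresholdMs_;
    int64_t lastMattingMs_ = INT64_MIN / 2;  // far past, so the first frame is matted
};
```

At 90 Hz this threshold is about 16.7 ms, separating the ~11.1 ms gap (previous frame was a matting frame, skip) from the ~22.2 ms gap (it was not, so matte this one).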
Of course, the above-mentioned time points are only used as examples and do not limit the present application in any way.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
In the embodiment of the present application, the electronic device may be divided into functional units according to the above method examples; for example, each functional unit may correspond to one function, or two or more functions may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit. It should be noted that the division of units in the embodiment of the present application is schematic and is only a division by logical function; there may be other division manners in actual implementation. The following description takes the case where each functional unit corresponds to one function:
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the steps in the above-mentioned method embodiments may be implemented.
Embodiments of the present application further provide a computer program product, which when run on an electronic device, enables the electronic device to implement the steps in the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the embodiments described above may be implemented by a computer program, which may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to a first device, including a recording medium, a computer memory, a read-only memory (ROM), a random-access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example, a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, the computer-readable medium may not be an electrical carrier signal or a telecommunications signal.
An embodiment of the present application further provides a chip system, where the chip system includes a processor, the processor is coupled to the memory, and the processor executes a computer program stored in the memory to implement the steps of any of the method embodiments of the present application. The chip system may be a single chip or a chip module composed of a plurality of chips.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (36)

1. A noise monitoring method, applied to an electronic device, the electronic device comprising: a HWC module, a display subsystem, and a noise algorithm library, the method comprising:
in response to receiving the first information, the HWC module sets a write back flag to a first flag;
in response to receiving the first image, the HWC module queries the write-back flag as a first flag;
the HWC module sends the first image to the display subsystem based on the first flag;
the display subsystem stops storing, to a write-back memory of the electronic device, a second image comprising a first target image on the first image, wherein the first target image is an image in a first area;
in response to reaching a first time, the HWC module sets the write back flag to a second flag;
the HWC module acquires a third image;
the HWC module queries the write-back flag as the second flag;
the HWC module sends the third image and second information to the display subsystem based on the second flag, the second information instructing the display subsystem to store a fourth image on the third image that includes a second target image in a write-back memory of the electronic device;
In response to receiving the third image and the second information, the display subsystem stores, to a write-back memory of the electronic device, a fourth image that includes a second target image on the third image, the second target image being an image within the first region;
the HWC module acquires the second target image from the write-back memory;
the HWC module sends the second target image to a noise algorithm library;
and the noise algorithm library calculates and obtains first image noise based on the second target image.
2. The method of claim 1, wherein the first information comprises a first time length, the first time length being a time length for which the display subsystem stops storing images to the write-back memory; and the first time is: the time at which the first time length has elapsed after the moment the write-back flag was set to the first flag;
or the first information comprises the first time length, a first value, and a second time, the first time length being a time length for which the display subsystem stops storing images to the write-back memory, and the second time being the end time at which the ambient light sensor of the electronic device acquires the first value; the first time is the time at which a second time length has elapsed after the moment the write-back flag was set to the first flag, the second time length being the first time length minus a delay time length, and the delay time length being the time at which the HWC module received the first information minus the second time.
3. The method of claim 1 or 2, wherein the HWC module acquiring the third image comprises:
the HWC module sends a first signal to a SurfaceFlinger of the electronic device;
in response to receiving the first signal, the SurfaceFlinger acquires a cached first display parameter and sends the first display parameter to the HWC module, wherein the first display parameter is the most recently cached display parameter among the display parameters cached by the SurfaceFlinger;
the HWC module derives the third image based on the first display parameter.
4. The method of any of claims 1 to 3, wherein the HWC module, after setting the write back flag to the second flag and before the HWC module acquires a third image, further comprises:
the HWC module acquires the moment when the image is refreshed on the electronic equipment last time;
and if the moment when the electronic equipment refreshes the image last time meets a first preset condition, the HWC module acquires the third image.
5. The method of claim 4, wherein the HWC module, after obtaining the time the image was last refreshed by the electronic device, further comprises:
and if the time when the electronic device last refreshed the image does not satisfy the first preset condition, the HWC module waits for a SurfaceFlinger module of the electronic device to send a second display parameter.
6. The method of claim 4, wherein the HWC module acquiring the third image if the time when the electronic device last refreshed the image satisfies the first preset condition comprises:
if the time when the electronic device last refreshed the image satisfies the first preset condition, the HWC module waits for a second time length;
and if the HWC module does not receive a third display parameter sent by the SurfaceFlinger within the second time length, the HWC module acquires the third image.
7. The method of claim 6, wherein the method further comprises:
if the HWC module receives a fourth display parameter sent by the SurfaceFlinger within the second time length, the HWC module acquires a fifth image based on the fourth display parameter;
the HWC module queries the write-back flag as the second flag;
the HWC module sends the fifth image and third information to the display subsystem based on the second flag;
In response to receiving the fifth image and the third information, the display subsystem stores a sixth image comprising a third target image on the fifth image in a write-back memory of the electronic device, the third target image being an image within the first region;
the HWC module acquires the third target image from the write-back memory;
the HWC module sends the third target image to a noise algorithm library;
and the noise algorithm library calculates and obtains second image noise based on the third target image.
8. The method of any of claims 4 to 7, wherein the first information comprises a first value and a second time, the second time being an end time when an ambient light sensor of the electronic device acquired the first value;
the electronic equipment meeting a first preset condition at the moment of last image refreshing comprises the following steps:
the moment when the electronic equipment refreshes the image last time is later than the second time;
the moment when the electronic equipment refreshes the image last time does not meet a first preset condition comprises the following steps:
the moment when the electronic equipment refreshes the image last time is earlier than or equal to the second time.
9. The method according to any one of claims 4 to 7, wherein the first information further includes a first value and a second time, the second time being an end time when an ambient light sensor of the electronic device acquires the first value, and the time when the electronic device last refreshed the image satisfying the first preset condition comprises:
a first difference between the time when the electronic device last refreshed the image and the current time being smaller than a second difference between the second time and the current time;
and the time when the electronic device last refreshed the image not satisfying the first preset condition comprises:
a first difference between the time when the electronic device last refreshed the image and the current time being greater than or equal to a second difference between the second time and the current time.
10. The method of any one of claims 4 to 7, wherein the time when the electronic device last refreshed the image satisfying the first preset condition comprises:
the difference between the time when the electronic device last refreshed the image and the time when the HWC module last obtained the target image being smaller than a first threshold, wherein the target image is the image displayed in an area of the display screen above an ambient light sensor of the electronic device;
and the time when the electronic device last refreshed the image not satisfying the first preset condition comprises:
the difference between the time when the electronic device last refreshed the image and the time when the HWC module last obtained the target image being greater than or equal to the first threshold.
11. The method of any one of claims 1 to 10, further comprising:
after the HWC module sets the write-back flag to the first flag, the HWC module monitors whether data in a kernel node of the electronic device changes, wherein the kernel node stores a brightness value;
in response to monitoring a change of the data in the kernel node of the electronic device, the HWC module acquires a first brightness value from the kernel node;
after the HWC module acquires the first brightness value from the kernel node, in response to monitoring a change of the data in the kernel node of the electronic device, the HWC module acquires a second brightness value from the kernel node;
and in response to reaching the first time, the HWC module sends the second brightness value to the noise algorithm library.
12. The method of claim 11, wherein the method further comprises:
after the HWC module sets the write-back flag to the second flag, the HWC module monitors whether data in the kernel node of the electronic device changes, wherein the kernel node stores a brightness value;
in response to monitoring a change of the data in the kernel node of the electronic device, the HWC module acquires a third brightness value from the kernel node;
the HWC module sends the third brightness value to the noise algorithm library;
after the HWC module sends the third brightness value to the noise algorithm library, in response to monitoring a change of the data in the kernel node of the electronic device, the HWC module acquires a fourth brightness value from the kernel node;
and the HWC module sends the fourth brightness value to the noise algorithm library.
13. The method of claim 11 or 12, wherein the calculating, by the noise algorithm library, of the first image noise based on the second target image comprises:
calculating, by the noise algorithm library, the first image noise based on the second target image and the second brightness value.
14. The method of any of claims 1 to 13, wherein the HWC module receiving the first image comprises:
the HWC module receives a fifth display parameter sent by a SurfaceFlinger module of the electronic device;
the HWC module derives the first image based on the fifth display parameter.
15. The method of any of claims 1-14, wherein the first area is an area on a display screen of the electronic device that is located above an ambient light sensor of the electronic device.
16. The method of claim 4, wherein before the HWC module acquires the time when the electronic device last refreshed the image, the method comprises:
the HWC module receiving a sixth display parameter sent by a SurfaceFlinger module of the electronic device;
the HWC module storing the time at which the HWC module received the sixth display parameter;
and the HWC module acquiring the time when the electronic device last refreshed the image comprises:
the HWC module acquiring the stored time of receiving the sixth display parameter, wherein the time of receiving the sixth display parameter is the reception time of a display parameter most recently stored by the HWC module before the time when the electronic device last refreshed the image is acquired.
17. The method of claim 3, wherein the first display parameter comprises: one or more of the position, size, color, and storage address of the interface used to synthesize the third image on the display screen of the electronic device.
18. A method for monitoring noise, applied to an electronic device including a first processor, the method comprising:
The first processor receives first information;
after the first processor receives the first information, in response to receiving a first image, the first processor stops acquiring a first target image from the first image, wherein the first target image is an image in a first area;
after the first time is reached, the first processor acquires a third image;
the first processor acquires a second target image from the third image, wherein the second target image is an image in the first area.
19. The method of claim 18, wherein the method further comprises:
in response to receiving the first information, the first processor sets, through a HWC module of the electronic device, a write-back flag to a first flag;
the stopping, by the first processor in response to receiving the first image, of acquiring the first target image from the first image comprises:
in response to receiving the first image, the first processor queries, through the HWC module, that the write-back flag is the first flag;
the first processor sends, through the HWC module, the first image to a display subsystem of the electronic device based on the first flag;
and the first processor stops storing, through the display subsystem, a second image comprising a first target image on the first image to a write-back memory of the electronic device, wherein the first target image is an image in a first area;
the method further comprises:
in response to reaching a first time, the first processor setting, through the HWC module, the write-back flag to a second flag;
and the first processor acquiring a third image and acquiring a second target image from the third image, the second target image being an image in the first area, comprises:
the first processor obtaining a third image through the HWC module;
the first processor querying, by the HWC module, the write-back flag as the second flag;
the first processor sending, by the HWC module, the third image and second information to the display subsystem based on the second flag, the second information instructing the display subsystem to store a fourth image on the third image that includes a second target image in a write-back memory of the electronic device;
in response to receiving the third image and the second information, the first processor stores, by the display subsystem, a fourth image including a second target image on the third image to a write-back memory of the electronic device, where the second target image is an image in the first area;
The first processor retrieves the second target image from the write-back memory through the HWC module;
the method further comprises the following steps:
the first processor sending, by the HWC module, the second target image to a noise algorithm library;
the first processor obtains first image noise through the noise algorithm library based on the second target image calculation.
20. The method of claim 19, wherein the first information comprises a first time length, the first time length being a time length for which the display subsystem stops storing images to the write-back memory; and the first time is: the time at which the first time length has elapsed after the moment the write-back flag was set to the first flag;
or the first information comprises the first time length, a first value, and a second time, the first time length being a time length for which the display subsystem stops storing images to the write-back memory, and the second time being the end time at which the ambient light sensor of the electronic device acquires the first value; the first time is the time at which a second time length has elapsed after the moment the write-back flag was set to the first flag, the second time length being the first time length minus a delay time length, and the delay time length being the time at which the HWC module received the first information minus the second time.
21. The method of claim 19 or 20, wherein the first processor acquiring the third image through the HWC module comprises:
the first processor sending, through the HWC module, a first signal to a SurfaceFlinger of the electronic device;
in response to receiving the first signal, the SurfaceFlinger acquiring a cached first display parameter and sending the first display parameter to the HWC module, wherein the first display parameter is the most recently cached display parameter among the display parameters cached by the SurfaceFlinger;
and the HWC module deriving the third image based on the first display parameter.
22. The method of any of claims 19 to 21, wherein after the first processor sets the write-back flag to the second flag through the HWC module and before the first processor acquires the third image through the HWC module, the method further comprises:
the first processor acquiring, through the HWC module, the time when the electronic device last refreshed the image;
and if the time when the electronic device last refreshed the image satisfies a first preset condition, the first processor acquiring the third image through the HWC module.
23. The method of claim 22, wherein after the first processor obtains, via the HWC module, a time at which the electronic device last refreshed the image, further comprising:
and if the time when the electronic device last refreshed the image does not satisfy the first preset condition, the first processor waits, through the HWC module, for a SurfaceFlinger module of the electronic device to send a second display parameter.
24. The method of claim 22, wherein the first processor acquiring the third image through the HWC module if the time when the electronic device last refreshed the image satisfies the first preset condition comprises:
if the time when the electronic device last refreshed the image satisfies the first preset condition, the first processor waiting for a second time length through the HWC module;
and if the HWC module does not receive a third display parameter sent by the SurfaceFlinger within the second time length, the first processor acquiring the third image through the HWC module.
25. The method of claim 24, wherein the method further comprises:
if the HWC module receives a fourth display parameter sent by the SurfaceFlinger within the second time length, the first processor acquires, through the HWC module, a fifth image based on the fourth display parameter;
The first processor querying, by the HWC module, the write-back flag as the second flag;
the first processor sending, by the HWC module, the fifth image and third information to the display subsystem based on the second flag;
in response to receiving the fifth image and the third information, the first processor stores, by the display subsystem, a sixth image including a third target image on the fifth image in a write-back memory of the electronic device, the third target image being an image within the first region;
the first processor retrieves the third target image from the write-back memory through the HWC module;
the first processor sending, by the HWC module, the third target image to a noise algorithm library of the electronic device;
and the first processor calculates and obtains second image noise based on the third target image through the noise algorithm library.
26. The method of any of claims 22 to 25, wherein the first information comprises a first value and a second time, the second time being an end time when an ambient light sensor of the electronic device acquired the first value;
the time when the electronic device last refreshed the image satisfying the first preset condition comprises:
the time when the electronic device last refreshed the image being later than the second time;
and the time when the electronic device last refreshed the image not satisfying the first preset condition comprises:
the time when the electronic device last refreshed the image being earlier than or equal to the second time;
or, the time when the electronic device last refreshed the image satisfying the first preset condition comprises:
a first difference between the time when the electronic device last refreshed the image and the current time being smaller than a second difference between the second time and the current time;
and the time when the electronic device last refreshed the image not satisfying the first preset condition comprises:
a first difference between the time when the electronic device last refreshed the image and the current time being greater than or equal to a second difference between the second time and the current time;
alternatively,
the time when the electronic device last refreshed the image satisfying the first preset condition comprises:
the difference between the time when the electronic device last refreshed the image and the time when the HWC module last obtained the target image being smaller than a first threshold, wherein the target image is the image displayed in an area of the display screen above an ambient light sensor of the electronic device;
and the time when the electronic device last refreshed the image not satisfying the first preset condition comprises:
the difference between the time when the electronic device last refreshed the image and the time when the HWC module last obtained the target image being greater than or equal to the first threshold.
27. The method of any one of claims 19 to 26, further comprising:
after the first processor sets the write-back flag to the first flag through the HWC module, the first processor monitors, through the HWC module, whether data in a kernel node of the electronic device changes, wherein the kernel node stores a brightness value;
in response to monitoring that the data in the kernel node of the electronic device has changed, the first processor acquires, through the HWC module, a first brightness value from the kernel node;
after the first processor acquires the first brightness value from the kernel node through the HWC module, in response to monitoring that the data in the kernel node of the electronic device has changed, the first processor acquires a second brightness value from the kernel node through the HWC module;
and in response to reaching the first time, the first processor sends the second brightness value to the noise algorithm library through the HWC module.
28. The method of claim 27, wherein the method further comprises:
after the first processor sets the write-back flag to the second flag through the HWC module, the first processor monitors, through the HWC module, whether data in the kernel node of the electronic device changes, wherein the kernel node stores a brightness value;
in response to monitoring that the data in the kernel node of the electronic device has changed, the first processor acquires, through the HWC module, a third brightness value from the kernel node;
the first processor sends the third brightness value to the noise algorithm library through the HWC module;
after the first processor sends the third brightness value to the noise algorithm library through the HWC module, in response to monitoring that the data in the kernel node of the electronic device has changed, the first processor acquires a fourth brightness value from the kernel node through the HWC module;
and the first processor sends the fourth brightness value to the noise algorithm library through the HWC module.
29. The method of claim 27, wherein the first processor calculating the first image noise based on the second target image through the noise algorithm library comprises:
the first processor calculating, through the noise algorithm library, the first image noise based on the second target image and the second brightness value.
30. The method of any of claims 19 to 29, wherein the first processor receiving, by the HWC module, the first image comprises:
the first processor receives, through the HWC module, a fifth display parameter sent by a SurfaceFlinger module of the electronic device;
the first processor obtains, by the HWC module, the first image based on the fifth display parameter.
31. The method of any of claims 18 to 30, wherein the first area is an area on a display screen of the electronic device that is located above an ambient light sensor of the electronic device.
32. The method of claim 22, wherein before the first processor acquires, through the HWC module, the time when the electronic device last refreshed the image, the method comprises:
the first processor receiving, through the HWC module, a sixth display parameter sent by a SurfaceFlinger module of the electronic device;
the first processor storing, through the HWC module, the time at which the HWC module received the sixth display parameter;
and the first processor acquiring, through the HWC module, the time when the electronic device last refreshed the image comprises:
the first processor acquiring, through the HWC module, the stored time of receiving the sixth display parameter, wherein the time of receiving the sixth display parameter is the reception time of a display parameter most recently stored by the HWC module before the time when the electronic device last refreshed the image is acquired.
33. The method of claim 21, wherein the first display parameter comprises: the position, size, color, and storage address of the interface used to synthesize the third image on the display screen of the electronic device.
34. An electronic device, characterized in that the electronic device comprises a first processor for executing a computer program stored in a memory, to cause the electronic device to implement the method of any of claims 1 to 17 or the method of any of claims 18 to 33.
35. A chip system comprising a first processor coupled to a memory, the first processor executing a computer program stored in the memory to implement the method of any of claims 18 to 33.
36. A computer-readable storage medium, in which a computer program is stored which, when run on a processor, implements the method of any one of claims 1 to 17 or the method of any one of claims 18 to 33.
CN202110606261.8A 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system Active CN113808030B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211137769.9A CN115564668A (en) 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system
CN202110606261.8A CN113808030B (en) 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110606261.8A CN113808030B (en) 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202211137769.9A Division CN115564668A (en) 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system

Publications (2)

Publication Number Publication Date
CN113808030A true CN113808030A (en) 2021-12-17
CN113808030B CN113808030B (en) 2022-09-30

Family

ID=78942437

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110606261.8A Active CN113808030B (en) 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system
CN202211137769.9A Pending CN115564668A (en) 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202211137769.9A Pending CN115564668A (en) 2021-05-31 2021-05-31 Noise monitoring method, electronic equipment and chip system

Country Status (1)

Country Link
CN (2) CN113808030B (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1326166A (en) * 2000-05-31 2001-12-12 三星电子株式会社 Method for expressing mode repeatability of images
US20120056091A1 (en) * 2008-09-25 2012-03-08 Apple Inc. Ambient light sensor with reduced sensitivity to noise from infrared sources
US20140166850A1 (en) * 2012-12-13 2014-06-19 Apple Inc. Electronic Device With Display and Low-Noise Ambient Light Sensor
CN108885775A (en) * 2016-04-05 2018-11-23 华为技术有限公司 A kind of display methods and terminal
CN106610879A (en) * 2016-12-23 2017-05-03 盛科网络(苏州)有限公司 Method for improving CPU (Central Processing Unit) noise test efficiency of chip
US20180308204A1 (en) * 2017-04-24 2018-10-25 Intel Corporation Fragmented graphic cores for deep learning using led displays
CN207165238U (en) * 2017-05-17 2018-03-30 西安紫光国芯半导体有限公司 A kind of memory of the write-back when carrying out read operation
CN107945747A (en) * 2017-11-22 2018-04-20 广东欧珀移动通信有限公司 Environment light detection method, device, storage medium and electronic equipment
US20200294468A1 (en) * 2019-03-13 2020-09-17 Apple Inc. Electronic Devices With Ambient Light Sensor Systems
CN112312031A (en) * 2019-07-30 2021-02-02 辉达公司 Enhanced high dynamic range imaging and tone mapping
US20210086364A1 (en) * 2019-09-20 2021-03-25 Nvidia Corporation Vision-based teleoperation of dexterous robotic system
CN110677596A (en) * 2019-11-04 2020-01-10 深圳市灵明光子科技有限公司 Ambient light adjusting device, ambient light adjusting method, image sensor and electronic device
CN111754954A (en) * 2020-07-10 2020-10-09 Oppo(重庆)智能科技有限公司 Screen brightness adjusting method and device, storage medium and electronic equipment
CN112229507A (en) * 2020-10-15 2021-01-15 Tcl通讯(宁波)有限公司 Ambient light detection method and device, storage medium and mobile terminal

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
KOBAYASHI K ET AL: "Correlation between noise-after-write and magnetic domain structure conversions in thin-film heads by electron microscopy", IEEE Translation Journal on Magnetics in Japan *
RUAN Yuanzhong et al.: "Design and Implementation of a Video Image Processing System Based on ZYNQ-7000", Software Guide (软件导刊) *

Also Published As

Publication number Publication date
CN113808030B (en) 2022-09-30
CN115564668A (en) 2023-01-03

Similar Documents

Publication Publication Date Title
CN113475057B (en) Video frame rate control method and related device
WO2022007862A1 (en) Image processing method, system, electronic device and computer readable storage medium
CN112119641B (en) Method and device for realizing automatic translation through multiple TWS (time and frequency) earphones connected in forwarding mode
CN113625860B (en) Mode switching method and device, electronic equipment and chip system
WO2022100685A1 (en) Drawing command processing method and related device therefor
CN113542580B (en) Method and device for removing light spots of glasses and electronic equipment
CN116991354A (en) Data processing method and related device
CN113804290B (en) Ambient light detection method, electronic device and chip system
CN111182140A (en) Motor control method and device, computer readable medium and terminal equipment
CN114095666A (en) Photographing method, electronic device and computer-readable storage medium
CN115463419A (en) Image prediction method, electronic device and storage medium
WO2022199613A1 (en) Method and apparatus for synchronous playback
CN114257920A (en) Audio playing method and system and electronic equipment
CN113852755A (en) Photographing method, photographing apparatus, computer-readable storage medium, and program product
CN113808030B (en) Noise monitoring method, electronic equipment and chip system
CN112469012A (en) Bluetooth communication method and related device
CN113923351B (en) Method, device and storage medium for exiting multi-channel video shooting
WO2022033344A1 (en) Video stabilization method, and terminal device and computer-readable storage medium
CN113837990B (en) Noise monitoring method, electronic equipment, chip system and storage medium
CN113820008B (en) Ambient light detection method, electronic device and chip system
CN114740986A (en) Handwriting input display method and related equipment
CN113391735A (en) Display form adjusting method and device, electronic equipment and storage medium
CN113672454B (en) Screen freezing monitoring method, electronic equipment and computer readable storage medium
CN115019803B (en) Audio processing method, electronic device, and storage medium
WO2023020420A1 (en) Volume display method, electronic device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant