CN115931115A - Detection method of ambient light, electronic equipment, chip system and storage medium - Google Patents
- Publication number: CN115931115A
- Application number: CN202110905134.8A
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
- Classification: Controls And Circuits For Display Device
Abstract
The application provides an ambient light detection method, an electronic device, a chip system and a storage medium, relates to the technical field of ambient light sensors, and can solve the problem of excessive power consumption of the electronic device. The method comprises the following steps: an ambient light sensor collects ambient light data in a normal sampling mode, and when the ambient light data satisfy a certain condition, the ambient light sensor is controlled to collect ambient light data in a slow sampling mode. The acquisition period of the slow sampling mode is longer than that of the normal sampling mode while the integration duration is the same, which is equivalent to lengthening the sleep duration of the ambient light sensor in the slow sampling mode; the power consumption of the electronic device in the slow sampling mode is therefore lower than in the normal sampling mode, reducing the power consumption of the electronic device. A fast sampling mode, whose sleep duration is shorter than that of the normal sampling mode, can also be set, so that when the ambient light intensity changes greatly, ambient light data are obtained quickly and the brightness of the display screen is adjusted quickly, improving the visual experience of the user.
Description
Technical Field
The present disclosure relates to the field of ambient light sensors, and in particular, to a method for detecting ambient light, an electronic device, a chip system, and a storage medium.
Background
The ambient light sensor may sense the light intensity of the surrounding environment. An electronic device equipped with an ambient light sensor can adjust the brightness of its display screen according to the ambient light data collected by the sensor. On the one hand, adjusting the display brightness to match the light intensity of the surrounding environment improves the user's visual experience when viewing the content displayed on the display screen; on the other hand, when the ambient light intensity is low, the display brightness is dimmed, reducing the power consumption of the electronic device.
When acquiring ambient light data, the ambient light sensor typically works in acquisition periods, where each acquisition period comprises an integration period and a non-integration period. The sensor collects ambient light data during the integration period and stops collecting during the non-integration period, i.e., the ambient light sensor is in a sleep state during the non-integration period. Even so, the power consumption of the electronic device remains high when the ambient light sensor collects ambient light data in this manner.
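For orientation only (this sketch is not part of the disclosure; the durations and power figures are hypothetical), the average sensor power over one acquisition period can be modeled as a duty cycle of the integration and sleep durations:

```python
def average_power_mw(integration_ms: float, period_ms: float,
                     p_active_mw: float, p_sleep_mw: float) -> float:
    """Average power over one acquisition period.

    The sensor draws p_active_mw while integrating and p_sleep_mw
    while asleep for the remainder of the period.
    """
    sleep_ms = period_ms - integration_ms
    return (integration_ms * p_active_mw + sleep_ms * p_sleep_mw) / period_ms

# Keeping the integration duration fixed while lengthening the period
# (the "slow" mode described later) lowers the average power.
normal = average_power_mw(integration_ms=50, period_ms=100,
                          p_active_mw=1.0, p_sleep_mw=0.01)
slow = average_power_mw(integration_ms=50, period_ms=400,
                        p_active_mw=1.0, p_sleep_mw=0.01)
```

With a fixed integration duration, lengthening the acquisition period shifts more of each period into the low-power sleep state, which is the effect the slow sampling mode exploits.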
Disclosure of Invention
The application provides an ambient light detection method, an electronic device, a chip system and a storage medium, which address the problem of high power consumption of the electronic device.
To achieve this, the following technical solutions are adopted:
in a first aspect, the present application provides a method for detecting ambient light, applied to an electronic device, where the electronic device includes a first processor and a second processor, and the method includes:
the first processor acquires a first value collected by an ambient light sensor on the electronic device in a first sampling mode;
the first processor sends the first value to the second processor;
the second processor receives the first value;
the second processor sends first information to the first processor based on the first value, the first information corresponding to a second sampling mode;
in response to receiving the first information, the first processor instructs the ambient light sensor to collect ambient light based on the second sampling mode, where the acquisition period of the second sampling mode is greater than that of the first sampling mode.
In the embodiments of the application, when the ambient light sensor collects ambient light data (including the first value) in the first sampling mode, the ambient light sensor can be switched from the first sampling mode to the second sampling mode. The acquisition period of the second sampling mode is longer than that of the first sampling mode while the integration durations are the same, which is equivalent to lengthening the sleep duration of the ambient light sensor in the second sampling mode; the power consumption of the electronic device in the second sampling mode is therefore lower than in the first sampling mode, reducing the power consumption of the electronic device. Moreover, in the case where the first processor determines whether to switch the sampling mode based on the ambient light data, the frequency at which the first processor reports ambient light data to the second processor is also reduced, further reducing power consumption.
In one implementation of the first aspect, the acquisition period comprises an integration duration and a sleep duration, and the integration duration of the second sampling mode is the same as that of the first sampling mode.
In one implementation of the first aspect, the sending, by the second processor, the first information to the first processor based on the first value includes:
the second processor determines the maximum value and the minimum value in first ambient light data, where the first ambient light data includes the first value and the ambient light data stored in a first storage space, and the ambient light data stored in the first storage space is the ambient light data that satisfies the storage condition and has been stored by the second processor since the first storage space was last emptied;
if the absolute value of the difference between the maximum value and the minimum value in the first ambient light data is smaller than a first threshold, the second processor stores the first value in the first storage space;
after storing the first value in the first storage space, if the quantity of ambient light data stored in the first storage space reaches a first quantity, the second processor sends the first information to the first processor.
In one implementation of the first aspect, the ambient light data satisfying the storage condition includes:
the first piece of ambient light data received after the second processor empties the first storage space satisfies the storage condition;
when the absolute value of the difference between the maximum value and the minimum value among the currently received ambient light data and the ambient light data currently stored in the first storage space is smaller than the first threshold, the currently received ambient light data satisfies the storage condition;
the condition for emptying the first storage space includes:
if the absolute value of the difference between the maximum value and the minimum value among the ambient light data currently received by the second processor and the ambient light data currently stored in the first storage space is not smaller than the first threshold and is smaller than a second threshold, the condition for emptying the first storage space is satisfied.
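The storage and emptying conditions above can be sketched as follows. This is an illustrative reading of the text only; the class name, return labels, threshold values and sample quantity are hypothetical:

```python
class StabilityBuffer:
    """Buffers ambient-light samples that satisfy the storage condition.

    A new sample is stored while the max-min spread of (buffer + sample)
    stays below threshold_1; once `first_quantity` samples are stored,
    the second processor would send the first information (switch to the
    slow sampling mode). A spread in [threshold_1, threshold_2) empties
    the buffer instead, so the first sample received afterwards always
    satisfies the storage condition (its spread is zero).
    """

    def __init__(self, threshold_1: float, threshold_2: float,
                 first_quantity: int):
        self.t1 = threshold_1
        self.t2 = threshold_2
        self.n = first_quantity
        self.buf: list[float] = []

    def offer(self, value: float) -> str:
        candidate = self.buf + [value]
        spread = max(candidate) - min(candidate)
        if spread < self.t1:
            self.buf.append(value)
            if len(self.buf) >= self.n:
                return "send_first_information"  # switch to slow mode
            return "stored"
        if spread < self.t2:
            self.buf.clear()                    # empty the first storage space
            return "emptied"
        return "large_change"                   # handled by the fast-mode path
```

A value whose spread exceeds the second threshold falls through to the fast-mode path described in the later implementations.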
In one implementation of the first aspect, after the first processor instructs the ambient light sensor to collect ambient light based on the second sampling mode, the method further comprises:
the first processor acquires a second value collected by the ambient light sensor in the second sampling mode;
the first processor sends the second value to the second processor;
the second processor receives the second value;
the second processor sends second information to the first processor based on the second value, the second information corresponding to the first sampling mode;
in response to receiving the second information, the first processor instructs the ambient light sensor to collect ambient light based on the first sampling mode.
In one implementation of the first aspect, the sending, by the second processor, the second information to the first processor based on the second value includes:
the second processor determines the maximum value and the minimum value in received second ambient light data, where the second ambient light data is the ambient light data collected after the ambient light sensor switches to the second sampling mode and includes the second value;
if the absolute value of the difference between the maximum value and the minimum value in the second ambient light data is greater than or equal to the first threshold and smaller than the second threshold, the second processor sends the second information to the first processor.
In one implementation of the first aspect, after the first processor instructs the ambient light sensor to collect ambient light based on the second sampling mode, the method further comprises:
the first processor acquires a third value collected by the ambient light sensor in the second sampling mode;
the first processor sends the third value to the second processor;
the second processor receives the third value;
the second processor sends third information to the first processor based on the third value, the third information corresponding to a third sampling mode;
in response to receiving the third information, the first processor instructs the ambient light sensor to collect ambient light based on the third sampling mode.
In one implementation of the first aspect, the sending, by the second processor, the third information to the first processor based on the third value includes:
the second processor determines the maximum value and the minimum value in received third ambient light data, where the third ambient light data is the ambient light data collected after the ambient light sensor switches to the second sampling mode and includes the third value;
if the absolute value of the difference between the maximum value and the minimum value in the third ambient light data is greater than or equal to the second threshold, the second processor sends the third information to the first processor.
In one implementation of the first aspect, after the first processor instructs the ambient light sensor to collect ambient light based on the third sampling mode, the method further comprises:
the first processor acquires a fourth value collected by the ambient light sensor in the third sampling mode;
the first processor sends the fourth value to the second processor;
the second processor receives the fourth value;
the second processor sends fourth information to the first processor based on the fourth value, the fourth information corresponding to the first sampling mode;
in response to receiving the fourth information, the first processor instructs the ambient light sensor to collect ambient light based on the first sampling mode.
In one implementation of the first aspect, the sending, by the second processor, the fourth information to the first processor based on the fourth value includes:
the second processor determines the maximum value and the minimum value in fourth ambient light data, where the fourth ambient light data includes the fourth value and the ambient light data stored in the first storage space, and the ambient light data stored in the first storage space is the ambient light data that satisfies the storage condition and has been stored by the second processor since the first storage space was last emptied;
if the absolute value of the difference between the maximum value and the minimum value in the fourth ambient light data is smaller than the second threshold, the second processor stores the fourth value in the first storage space;
after storing the fourth value in the first storage space, if the quantity of ambient light data stored in the first storage space reaches a second quantity, the second processor sends the fourth information to the first processor.
In one implementation of the first aspect, after the first processor instructs the ambient light sensor to collect ambient light based on the first sampling mode, the method further comprises:
the first processor acquires a fifth value collected by the ambient light sensor in the first sampling mode;
the first processor sends the fifth value to the second processor;
the second processor receives the fifth value;
the second processor sends fifth information to the first processor based on the fifth value, the fifth information corresponding to the third sampling mode;
in response to receiving the fifth information, the first processor instructs the ambient light sensor to collect ambient light based on the third sampling mode.
In one implementation of the first aspect, the sending, by the second processor, the fifth information to the first processor based on the fifth value includes:
the second processor determines the maximum value and the minimum value in fifth ambient light data, where the fifth ambient light data includes the fifth value and the ambient light data stored in the first storage space, and the ambient light data stored in the first storage space is the ambient light data that satisfies the storage condition and has been stored by the second processor since the first storage space was last emptied;
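Taken together, the first through fifth information describe a switching relationship between the sampling modes driven by the spread (|max − min|) of recent ambient light data. The following sketch is illustrative only; the mode labels and thresholds are hypothetical (the first, second and third sampling modes are rendered as "normal", "slow" and "fast"):

```python
def next_mode(current_mode: str, spread: float, t1: float, t2: float) -> str:
    """Return the target sampling mode for a given data spread.

    spread is |max - min| over the relevant ambient-light data; t1 and
    t2 are the first and second thresholds of the text.
    """
    if spread >= t2:
        return "fast"        # large change: third/fifth information
    if current_mode == "slow" and t1 <= spread < t2:
        return "normal"      # moderate change: second information
    if current_mode == "fast" and spread < t2:
        return "normal"      # change settled: fourth information
    if current_mode == "normal" and spread < t1:
        return "slow"        # stable: first information (after enough samples)
    return current_mode      # otherwise keep the current sampling mode
```

In the disclosed method the transitions toward the slow and normal modes additionally require a quantity of stored samples; the sketch omits that counting for brevity.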
in one implementation form of the first aspect, the method further comprises:
the first processor acquires a sixth value acquired by the ambient light sensor in any sampling mode, wherein the sixth value is ambient light data acquired by the ambient light sensor in the first gain value, and any sampling mode comprises a first sampling mode, a second sampling mode and a third sampling mode;
if the sixth value is not in the first range, the first processor adjusts the gain value of the ambient light sensor to be a second gain value;
the first processor instructs the ambient light sensor to collect ambient light based on a fourth sampling mode, the collection period of the fourth sampling mode being less than the collection period of the first sampling mode; the first processor acquires a seventh value acquired by the ambient light sensor in a fourth sampling mode and a second gain value;
if the seventh value is within the first range, the first processor instructs the ambient light sensor to collect the ambient light in the first sampling mode, and the first processor sends the seventh value and seventh information to the second processor;
the second processor receives a seventh value and seventh information, wherein the seventh information is used for indicating that the ambient light sensor is switched into the first sampling mode;
the second processor empties the first storage space based on the seventh information;
after emptying the first memory space, the second processor stores the seventh value in the first memory space.
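The gain-adjustment flow above can be sketched as follows. The gain ladder, range bounds and function name are hypothetical; the use of the fourth (fastest) sampling mode during retuning is noted in the docstring:

```python
# Hypothetical gain ladder and valid raw-count range ("first range").
GAINS = [1, 4, 16, 64]
VALID_RANGE = (100.0, 60000.0)

def retune_gain(raw: float, gain_index: int) -> int:
    """Pick a new gain index when the raw reading leaves the valid range.

    While retuning, the first processor would run the sensor in the
    fourth (fastest) sampling mode so a usable reading is obtained
    quickly, then return to the previous sampling mode.
    """
    low, high = VALID_RANGE
    if raw > high and gain_index > 0:
        return gain_index - 1    # too bright: step the gain down
    if raw < low and gain_index < len(GAINS) - 1:
        return gain_index + 1    # too dim: step the gain up
    return gain_index            # already within the first range
```

Once a reading falls back inside the range, the stability buffer is emptied and restarted from the new value, since data taken at different gains are not directly comparable.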
In one implementation of the first aspect, the storing, by the second processor, of the first value in the first storage space if the absolute value of the difference between the maximum value and the minimum value in the first ambient light data is smaller than the first threshold includes:
the second processor obtains, based on the first value, the brightness level in which the first value lies;
if the absolute value of the difference between the maximum value and the minimum value in the first ambient light data is smaller than the first threshold corresponding to the brightness level in which the first value lies, the second processor stores the first value in the first storage space.
In one implementation of the first aspect, the brightness levels include a first brightness level, a second brightness level, a third brightness level and a fourth brightness level; the critical value between the first brightness level and the second brightness level is a first critical value, the critical value between the second brightness level and the third brightness level is a second critical value, and the critical value between the third brightness level and the fourth brightness level is a third critical value, where the first critical value is smaller than the second critical value and the second critical value is smaller than the third critical value;
the obtaining, by the second processor based on the first value, of the brightness level in which the first value lies includes:
determining the relationship between the first value and the second critical value;
if the first value is equal to the second critical value, the brightness level in which the first value lies is the brightness level in which the second critical value lies;
if the first value is smaller than the second critical value, determining the relationship between the first value and the first critical value;
if the first value is smaller than the first critical value, the brightness level in which the first value lies is the first brightness level; if the first value is greater than the first critical value, it is the second brightness level; if the first value is equal to the first critical value, it is the brightness level in which the first critical value lies;
if the first value is greater than the second critical value, determining the relationship between the first value and the third critical value;
if the first value is smaller than the third critical value, the brightness level in which the first value lies is the third brightness level; if the first value is greater than the third critical value, it is the fourth brightness level; if the first value is equal to the third critical value, it is the brightness level in which the third critical value lies.
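The level lookup above is a middle-out comparison against the three critical values. A sketch with hypothetical critical values, assuming (where the text leaves it open) that a value equal to a critical value takes the lower adjacent level:

```python
# Hypothetical critical values (e.g. in lux) separating the four levels.
CRITICALS = (50.0, 500.0, 5000.0)   # c1 < c2 < c3

def brightness_level(value: float) -> int:
    """Return the brightness level (1-4) in which an ambient-light value lies.

    Mirrors the middle-out comparison in the text: compare against the
    second critical value first, then against the first or third. A value
    equal to a critical value is assigned the lower adjacent level here;
    the disclosure only requires it to take the level of that critical value.
    """
    c1, c2, c3 = CRITICALS
    if value <= c2:
        return 1 if value <= c1 else 2
    return 3 if value <= c3 else 4
```

Using a per-level first threshold then amounts to indexing a threshold table by the returned level, so that brighter environments may tolerate a larger absolute spread before the data are considered unstable.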
In one implementation of the first aspect, in response to the display screen of the electronic device switching to a screen-off state, the first processor instructs the ambient light sensor to collect ambient light in the second sampling mode.
In one implementation of the first aspect, when the display screen of the electronic device is in the screen-off state, the method further comprises:
the first processor acquires an eighth value collected by the ambient light sensor in the second sampling mode, where the eighth value is ambient light data collected by the ambient light sensor at a third gain value;
if the eighth value is not within the first range, the first processor adjusts the gain value of the ambient light sensor to a fourth gain value;
the first processor instructs the ambient light sensor to collect ambient light based on the fourth sampling mode, where the acquisition period of the fourth sampling mode is smaller than that of the second sampling mode; the first processor acquires a ninth value collected by the ambient light sensor in the fourth sampling mode at the fourth gain value;
if the ninth value is within the first range, the first processor instructs the ambient light sensor to collect ambient light in the second sampling mode, and the first processor sends the ninth value to the second processor.
In one implementation of the first aspect, the electronic device further includes an ambient light sensor driver, an ambient light sensor, a HWC module and a noise algorithm library, and the method includes:
the first processor acquires, through the ambient light sensor driver, the first value collected by the ambient light sensor in the first sampling mode;
the first processor sends the first value to the HWC module through the ambient light sensor driver;
the second processor receives the first value through the HWC module, and sends the first value to the noise algorithm library through the HWC module;
the second processor sends, through the noise algorithm library, the first information to the HWC module based on the first value;
the second processor sends the first information to the ambient light sensor driver through the HWC module, the first information corresponding to the second sampling mode;
in response to receiving the first information, the first processor instructs, through the ambient light sensor driver, the ambient light sensor to collect ambient light based on the second sampling mode, where the acquisition period of the second sampling mode is greater than that of the first sampling mode.
In a second aspect, the present application provides a method for detecting ambient light, applied to an electronic device, where the electronic device includes a second processor, and the method includes:
the second processor receives a first value, where the first value is ambient light data collected by an ambient light sensor of the electronic device in a first sampling mode;
the second processor determines the maximum value and the minimum value in first ambient light data, where the first ambient light data includes the first value and the ambient light data stored in a first storage space, and the ambient light data stored in the first storage space is the ambient light data that satisfies the storage condition and has been stored by the second processor since the first storage space was last emptied;
if the absolute value of the difference between the maximum value and the minimum value in the first ambient light data is smaller than a first threshold, the second processor stores the first value in the first storage space;
after storing the first value in the first storage space, if the quantity of ambient light data stored in the first storage space reaches a first quantity, the second processor sends first information to a first processor of the electronic device, where the first information instructs the first processor to control the ambient light sensor to collect ambient light based on a second sampling mode, and the acquisition period of the second sampling mode is greater than that of the first sampling mode.
In one implementation of the second aspect, the method further comprises:
the second processor receives a seventh value and seventh information, where the seventh information indicates that the ambient light sensor has switched from a gain adjustment mode to the first sampling mode, and the gain adjustment mode is the mode in which the first processor adjusts the gain value of the ambient light sensor so that the ambient light sensor collects ambient light data within a first range;
the second processor empties the first storage space based on the seventh information;
after emptying the first storage space, the second processor stores the seventh value in the first storage space.
In one implementation of the second aspect, the storing, by the second processor, of the first value in the first storage space if the absolute value of the difference between the maximum value and the minimum value in the first ambient light data is smaller than the first threshold includes:
the second processor obtains, based on the first value, the brightness level in which the first value lies;
if the absolute value of the difference between the maximum value and the minimum value in the first ambient light data is smaller than the first threshold corresponding to the brightness level in which the first value lies, the second processor stores the first value in the first storage space.
In one implementation of the second aspect, the brightness levels include a first brightness level, a second brightness level, a third brightness level and a fourth brightness level; the critical value between the first brightness level and the second brightness level is a first critical value, the critical value between the second brightness level and the third brightness level is a second critical value, and the critical value between the third brightness level and the fourth brightness level is a third critical value, where the first critical value is smaller than the second critical value and the second critical value is smaller than the third critical value;
the obtaining, by the second processor based on the first value, of the brightness level in which the first value lies includes:
determining the relationship between the first value and the second critical value;
if the first value is equal to the second critical value, the brightness level in which the first value lies is the brightness level in which the second critical value lies;
if the first value is smaller than the second critical value, determining the relationship between the first value and the first critical value;
if the first value is smaller than the first critical value, the brightness level in which the first value lies is the first brightness level; if the first value is greater than the first critical value, it is the second brightness level; if the first value is equal to the first critical value, it is the brightness level in which the first critical value lies;
if the first value is greater than the second critical value, determining the relationship between the first value and the third critical value;
if the first value is smaller than the third critical value, the brightness level in which the first value lies is the third brightness level; if the first value is greater than the third critical value, it is the fourth brightness level; if the first value is equal to the third critical value, it is the brightness level in which the third critical value lies.
In one implementation of the second aspect, the ambient light data satisfying the storage condition includes:
the first piece of ambient light data received after the second processor empties the first storage space satisfies the storage condition;
when the absolute value of the difference between the maximum value and the minimum value among the currently received ambient light data and the ambient light data currently stored in the first storage space is smaller than the first threshold, the currently received ambient light data satisfies the storage condition;
the condition for emptying the first storage space includes:
if the absolute value of the difference between the maximum value and the minimum value among the ambient light data currently received by the second processor and the ambient light data currently stored in the first storage space is not smaller than the first threshold and is smaller than a second threshold, the condition for emptying the first storage space is satisfied.
In a third aspect, the present application provides a method for detecting ambient light, applied to an electronic device, where the electronic device includes a first processor, and the method includes:
the first processor acquires a first value collected by an ambient light sensor on the electronic device in a first sampling mode;
the first processor determines the maximum value and the minimum value in first ambient light data, where the first ambient light data includes the first value and the ambient light data stored in a first storage space, and the ambient light data stored in the first storage space is the ambient light data that satisfies the storage condition and has been stored by the first processor since the first storage space was last emptied;
if the absolute value of the difference between the maximum value and the minimum value in the first ambient light data is smaller than a first threshold, the first processor stores the first value in the first storage space;
after storing the first value in the first storage space, if the quantity of ambient light data stored in the first storage space reaches a first quantity, the first processor instructs the ambient light sensor to collect ambient light based on a second sampling mode, where the acquisition period of the second sampling mode is greater than that of the first sampling mode.
In a fourth aspect, the present application provides an electronic device, including a processor, where the processor executes a computer program stored in a memory to implement the method of any one of the first aspect of the present application.
In a fifth aspect, the present application provides a chip system, including a processor coupled to a memory, where the processor executes a computer program stored in the memory to implement the method of any one of the second aspect or the third aspect of the present application.
In a sixth aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by one or more processors, implements the method of any one of the second aspect or the third aspect of the present application.
In a seventh aspect, the present application provides a computer program product which, when run on an apparatus, causes the apparatus to perform the method of any one of the second aspect or the third aspect of the present application.
It is to be understood that, for the beneficial effects of the second to seventh aspects, reference may be made to the relevant description of the first aspect, and details are not repeated herein.
Drawings
Fig. 1 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a diagram illustrating a positional relationship between an ambient light sensor and a display screen in an electronic device according to an embodiment of the present disclosure;
fig. 3 is a diagram illustrating another positional relationship between an ambient light sensor and a display screen in an electronic device according to an embodiment of the present disclosure;
FIG. 4 is a diagram illustrating another positional relationship between an ambient light sensor and a display screen in an electronic device according to an embodiment of the present application;
FIG. 5 is a diagram illustrating a positional relationship of a target area on a display screen according to an embodiment of the present disclosure;
FIG. 6 is a diagram illustrating a positional relationship between an ambient light sensor and a target area on a display screen according to an embodiment of the present disclosure;
fig. 7 is a technical architecture diagram of a method for detecting ambient light according to an embodiment of the present application;
fig. 8 is a schematic diagram of an acquisition cycle of the ambient light sensor for acquiring ambient light according to an embodiment of the present application;
FIG. 9 is a schematic diagram of time nodes for image refresh and backlight adjustment during an acquisition cycle in the embodiment of FIG. 8;
FIG. 10 is a timing flow diagram of a method for detecting ambient light based on the technical architecture shown in FIG. 7;
fig. 11 is a timing flow diagram between various modules in the AP processor provided by the embodiment of the present application in the embodiment shown in fig. 10;
FIG. 12 is a schematic diagram of calculating integral noise based on image noise and backlight noise at each time node provided by the embodiment shown in FIG. 9;
fig. 13 is a schematic diagram of each time node for performing image refreshing and backlight adjustment in an upward direction of a time axis in an acquisition period according to the embodiment of the present application;
FIG. 14 is a schematic diagram of calculating integral noise based on the image noise and backlight noise at each time node provided by the embodiment shown in FIG. 13;
fig. 15 is a diagram illustrating a sampling mode and a switching relationship of an ambient light sensor in a bright screen state according to an embodiment of the present application;
FIG. 16 is a timing diagram of the ambient light sensor when the current sampling mode is maintained and the sampling mode is switched according to an embodiment of the present disclosure;
FIG. 17 is a timing diagram illustrating the timing of the switching of the ambient light sensor to the gain adjustment mode and back to the normal sampling mode according to an embodiment of the present application;
fig. 18 is a diagram of a technical architecture after adding an intelligent adjustment sampling mode to the technical architecture shown in fig. 7 according to the embodiment of the present application;
fig. 19 is a schematic flowchart illustrating a process of determining whether a change in ambient light data is stable according to an embodiment of the present application;
fig. 20 is a schematic diagram of stable ambient light data obtained by using the determination flow shown in fig. 19 according to an embodiment of the present application;
fig. 21 is a schematic flowchart of another process for determining whether the ambient light data change is stable according to an embodiment of the present disclosure;
fig. 22 is a schematic diagram of stable ambient light data obtained by using the determination flow shown in fig. 21 according to an embodiment of the present application;
fig. 23 is a schematic flow chart illustrating a process of obtaining a threshold in a stability determination process according to an embodiment of the present application;
fig. 24 is a schematic flow chart illustrating a process of obtaining a threshold in another stability determination flow according to an embodiment of the present application;
fig. 25 is a diagram illustrating a sampling mode and a switching relationship of an ambient light sensor in a screen-off state according to an embodiment of the present application;
fig. 26 is a diagram of a sampling mode switching process of the ambient light sensor in a bright screen state according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that in the embodiments of the present application, "one or more" means one, two or more; "and/or" describes the association relationship of the associated objects, indicating that three relationships may exist; for example, a and/or B, may represent: a exists singly, A and B exist simultaneously, and B exists singly, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," "fourth," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The method for detecting the ambient light provided by the embodiment of the application can be applied to electronic equipment provided with a display screen and an ambient light sensor. The electronic device may be a tablet computer, a mobile phone, a wearable device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), and other electronic devices. The embodiment of the present application does not limit the specific type of the electronic device.
Fig. 1 shows a schematic structural diagram of an electronic device. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, a camera 193, a display screen 194, and a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a touch sensor 180K, an ambient light sensor 180L, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it can be called directly from memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, and an application program required by at least one function (such as a sound playing function, an image playing function, and the like).
In addition, the internal memory 121 may include a high speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a Universal Flash Storage (UFS), and the like.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
In other embodiments, the power management module 141 may be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may also be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio signals into analog audio signals for output, and also used to convert analog audio inputs into digital audio signals. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into a sound signal. When the electronic apparatus 100 receives a call or voice information, it is possible to receive voice by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking near the microphone 170C through the mouth. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to implement noise reduction functions in addition to listening to voice information. In other embodiments, the electronic device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on. For example, the microphone 170C may be used to collect voice information related to embodiments of the present application.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a variety of types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The camera 193 is used to capture still images or video. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic apparatus 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1.
The embodiment of the present application does not particularly limit the specific structure of the execution body of the ambient light detection method, as long as the execution body can run code in which the ambient light detection method provided by the embodiment of the present application is recorded, so as to perform the method. For example, the execution body of the ambient light detection method provided by the embodiment of the present application may be a functional module in the electronic device capable of calling and executing a program, or a communication apparatus, such as a chip, applied to the electronic device.
Fig. 2 is a front position relationship diagram of a display screen and an ambient light sensor in an electronic device according to an embodiment of the present application.
As shown in fig. 2, the projection of the ambient light sensor on the display screen of the electronic device is located in the upper half of the display screen. When a user holds the electronic device, the ambient light sensor located in the upper half can detect the light intensity and color temperature of the environment on the front side of the electronic device (the side the display screen faces), which are used to adjust the brightness and color temperature of the display screen so as to achieve a better visual effect. For example, the display screen is prevented from being so bright in a dark environment as to cause glare, or so dark in a bright environment as to impair visibility.
Fig. 3 is a side view of the display screen and the ambient light sensor in the electronic device. From top to bottom, the display screen of the electronic device comprises: a light-transmitting glass cover plate, a display module, and a protective film; the orientations "above" and "below" here all describe the case where the display screen of the electronic device is placed facing upward. Because the ambient light sensor needs to collect the ambient light above the display screen of the electronic device, a portion of the display module in the display screen may be cut out and the ambient light sensor placed in that portion; this is equivalent to placing the ambient light sensor below the glass cover plate in the display screen, on the same layer as the display module. Note that the detection direction of the ambient light sensor coincides with the orientation of the display screen in the electronic device (in fig. 3, the display screen faces the upper side of the drawing). Obviously, this arrangement of the ambient light sensor sacrifices a portion of the display area. When a high screen-to-body ratio is pursued, this arrangement of the ambient light sensor is not applicable.
Fig. 4 shows another arrangement of the ambient light sensor provided in the embodiments of the present application, in which the ambient light sensor is moved from below the glass cover plate to below the display module. For example, the ambient light sensor is located below the active area (AA) of the OLED display module, the AA area being the area of the display module in which image content can be displayed. This arrangement of the ambient light sensor does not sacrifice display area. However, the OLED screen is a self-luminous display screen: when the OLED screen displays an image, a user can see the image from above the display screen, and likewise the ambient light sensor located below the OLED screen also collects light corresponding to the image displayed by the OLED screen. Therefore, the ambient light collected by the ambient light sensor includes both the light emitted by the display screen and the actual external ambient light. If the external real ambient light is to be accurately obtained, the light emitted by the display screen needs to be determined in addition to the ambient light collected by the ambient light sensor.
As can be understood from fig. 4, since the ambient light sensor is located below the AA area, the AA area in the display module is not sacrificed due to the arrangement of the ambient light sensor. Therefore, the projection of the ambient light sensor on the display screen can be located in any area of the front of the display screen, and is not limited to the following arrangement: the projection of the ambient light sensor on the display screen is located at the top of the front of the display screen.
Regardless of which region below the AA area the ambient light sensor is located in, its projected area on the display screen is much smaller than the area of the display screen itself. Therefore, it is not the entire display screen that emits light interfering with the ambient light collected by the ambient light sensor; rather, only the light emitted from the display area directly above the ambient light sensor, and from the display area within a certain range around the ambient light sensor, interferes with the collected ambient light.
As an example, the light sensing area of the ambient light sensor has a light sensing angle, and the ambient light sensor may receive light within the light sensing angle but not light outside the light sensing angle. In fig. 5, light emitted from point a above the ambient light sensor (within the sensing angle) and light emitted from point B above a certain range around the ambient light sensor (within the sensing angle) both interfere with the ambient light collected by the ambient light sensor. While the light emitted from point C (located outside the light sensing angle) farther away from the ambient light sensor in fig. 5 will not interfere with the ambient light collected by the ambient light sensor. For convenience of description, a display area of the display screen that interferes with the ambient light collected by the ambient light sensor may be referred to as a target area. The location of the target area in the display screen is determined by the specific location of the ambient light sensor under the AA area. As an example, the target area may be a square area with a side length of a certain length (e.g., 80 microns, 90 microns, 100 microns) centered at a center point of the ambient light sensor. Of course, the target area may also be an area of other shape obtained by measurement that interferes with the light collected by the ambient light sensor.
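As a hedged illustration of the square target area described above, the following sketch derives the area's bounds from the sensor's photosensitive center point; the coordinate system, function name, and the example side length and center coordinates are our own assumptions, not values specified in this application:

```python
def target_area(sensor_cx, sensor_cy, side=100):
    """Return (left, top, right, bottom) of a square target area
    of the given side length, centered on the ambient light
    sensor's photosensitive center point."""
    half = side / 2
    return (sensor_cx - half, sensor_cy - half,
            sensor_cx + half, sensor_cy + half)

# A sensor centered at (540, 60) with a 100-unit side:
# target_area(540, 60) -> (490.0, 10.0, 590.0, 110.0)
```

In practice the target area could equally be a measured region of another shape, as noted above; the square form is merely the simplest parameterization.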
As another example, fig. 6 is a schematic front view of an OLED screen of an electronic device provided in an embodiment of the present application, and as shown in fig. 6, the electronic device includes a housing, an OLED screen of the electronic device displays an interface, a corresponding area of the display interface in the display screen is an AA area, and an ambient light sensor is located behind the AA area. The center point of the target area coincides with the center point of the ambient light sensor.
It should be noted that the ambient light sensor is a physical device, and its external shape may differ from manufacturer to manufacturer. The center point of the ambient light sensor in the embodiment of the present application is the center point of the photosensitive area where the ambient light sensor collects ambient light. In addition, the target area shown in fig. 6 is larger than the projection area of the ambient light sensor on the OLED screen. In practical application, the target area may also be less than or equal to the projection area of the ambient light sensor on the OLED screen. However, the target area is typically larger than the photosensitive area of the ambient light sensor. As mentioned above, the real ambient light from the outside is equal to the ambient light collected by the ambient light sensor minus the light emitted by the display screen, and the light emitted by the display screen has been determined above to be the light emitted from the target area. The light emitted by the target area is generated by the display content of the target area, and the interference of the display content with the ambient light collected by the ambient light sensor comes from two parts: the RGB pixel information of the displayed image and the luminance of the displayed image. It can be understood from the above analysis that the interference with the ambient light collected by the ambient light sensor is determined by: the RGB pixel information of the image displayed in the target area and the luminance information of the target area. As an example, if the pixel value of a pixel is (r, g, b) and the luminance is L, the normalized luminance of the pixel is: L×(r/255)^2.2, L×(g/255)^2.2, L×(b/255)^2.2.
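The per-pixel normalization above can be sketched as follows; the function name is ours, and the exponent 2.2 is the display gamma given in the example:

```python
def normalized_luminance(r, g, b, L, gamma=2.2):
    """Weight each 8-bit RGB channel by the display gamma and
    the panel brightness L, per the formula above."""
    return (L * (r / 255) ** gamma,
            L * (g / 255) ** gamma,
            L * (b / 255) ** gamma)

# A full-white pixel (255, 255, 255) at brightness L = 400
# contributes (400.0, 400.0, 400.0); a black pixel contributes zero.
```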
For convenience of description, an image corresponding to the target area may be denoted as a target image, and interference of RGB pixel information and luminance information of the target image on ambient light collected by the ambient light sensor may be denoted as fusion noise. The ambient light collected by the ambient light sensor can be recorded as initial ambient light, and the external real ambient light can be recorded as target ambient light.
From the above description it can be derived: the target ambient light is equal to the initial ambient light minus the fusion noise at each instant in the time period in which the initial ambient light was collected. In the embodiment of the present application, a process of calculating the fusion noise together according to the RGB pixel information and the luminance information is referred to as a noise fusion process.
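The relation above can be sketched directly; the function name and the example count values are illustrative assumptions, with the fusion noise summed over the time nodes within the period in which the initial ambient light was collected:

```python
def target_ambient_light(initial_light, fusion_noise_per_node):
    """Target ambient light = initial ambient light minus the
    fusion noise accumulated at each time node of the collection
    period."""
    return initial_light - sum(fusion_noise_per_node)

# If the sensor integrated 500 counts while the target image and
# backlight contributed 120 + 80 counts of fusion noise, the real
# external ambient light corresponds to 300 counts.
```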
When the display screen is in a display state, the RGB pixel information of the image displayed in the target area may change, and the brightness information of the displayed image may also change. The fusion noise changes whenever either the RGB pixel information of the image displayed in the target area or the luminance information of the displayed image changes. Therefore, the fusion noise needs to be recalculated from the changed information (RGB pixel information or luminance information) each time such a change occurs. If the image of the target area is unchanged for a long time, the fusion noise is calculated only when the brightness of the display screen changes. Therefore, in order to reduce the frequency of calculating the fusion noise, the target region may be a region in which the image displayed on the display screen changes infrequently, for example, the status bar area at the top of the front of the electronic device. The projection of the ambient light sensor on the display screen may be located toward the right in the status bar area of the display screen. Of course, the ambient light sensor may instead be located toward the left or in the middle of the status bar area; the embodiment of the present application does not limit the specific position of the ambient light sensor.
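The change-triggered recomputation described above can be sketched as a small cache; the class and attribute names are our own, and the compute function is a stand-in for the actual noise fusion:

```python
class FusionNoiseCache:
    """Recompute fusion noise only when the target-area image or
    the backlight level actually changes, per the text above."""

    def __init__(self, compute):
        self._compute = compute   # (image, brightness) -> noise
        self._key = None
        self._noise = None

    def get(self, image, brightness):
        key = (image, brightness)
        if key != self._key:      # image refresh or backlight change
            self._noise = self._compute(image, brightness)
            self._key = key
        return self._noise        # otherwise reuse the cached value
```

With a static status-bar image, `get` only re-invokes the compute function on a backlight change, matching the reduced-frequency behavior the text describes.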
A technical architecture corresponding to the method for obtaining the target ambient light through the initial ambient light and the content displayed on the display screen according to the embodiment of the present application will be described below with reference to fig. 7.
As shown in fig. 7, the processor in the electronic device is a multi-core processor, which at least includes: an AP (application processor) and an SCP (sensor co-processor). The AP processor is the application processor in the electronic device; the operating system, user interface, and application programs all run on the AP processor. The SCP processor is a co-processor that may assist the AP processor in handling events related to images, sensors (e.g., the ambient light sensor), and the like.
Only the AP processor and SCP processor are shown in fig. 7. In practical applications, the multi-core processor may also include other processors. For example, when the electronic device is a mobile phone, the multi-core processor may further include a Baseband (BP) processor that runs mobile phone radio frequency communication control software and is responsible for sending and receiving data.
The AP processor in fig. 7 only shows the content related to the embodiment of the present application, and the implementation of the embodiment of the present application needs to rely on: an Application Layer (Application), a Java Framework Layer (Framework Java), a native Framework Layer (Framework native), a Hardware Abstraction Layer (HAL), a kernel Layer (kernel), and a Hardware Layer (Hardware).
The SCP processor in fig. 7 may be understood as a sensor hub (sensor hub) that can control the sensors and process data related to the sensors. The implementation of the embodiment of the present application needs to rely on: a co-application layer (Hub APK), a co-framework layer (Hub FWK), a co-driver layer (Hub DRV), and a co-hardware layer (Hub hardware).
Various applications exist in the application layer of the AP processor, and application a and application B are shown in fig. 7. Taking application a as an example, after the user starts application a, the display screen will display the interface of application a. Specifically, the application a sends the display parameters (for example, the memory address, the color, and the like of the interface to be displayed) of the interface to be displayed to the display engine service.
The display engine service in the AP processor sends the received display parameters of the interface to be displayed to the SurfaceFlinger in the native framework layer (Framework native) of the AP processor.
The SurfaceFlinger in the native framework layer (Framework native) of the AP processor is responsible for controlling interface (surface) fusion. As an example, it calculates the overlap region of at least two overlapping interfaces. The interface here may be an interface presented by the status bar, the system bar, the application itself (the interface to be displayed by application A), the wallpaper, the background, etc. Therefore, the SurfaceFlinger can obtain not only the display parameters of the interface to be displayed by application A, but also the display parameters of the other interfaces.
The hardware abstraction layer of the AP processor is provided with an HWC (Hardware Composer HAL). The HWC is the module that synthesizes and displays interfaces in the system and provides hardware support for the SurfaceFlinger service. In step A1, the SurfaceFlinger sends the display parameters (e.g., memory address, color, etc.) of each interface to the HWC through interfaces (e.g., setLayerBuffer, setLayerColor, etc.) for interface fusion.
Generally, in image synthesis (for example, when an electronic device displays an image, it is necessary to synthesize a status bar, a system bar, an application itself, and a wallpaper background), the HWC obtains a synthesized image according to display parameters of each interface through hardware (for example, a hardware synthesizer) underlying the HWC. The HWC in the hardware abstraction layer of the AP processor sends the underlying hardware-synthesized image to the OLED driver, see step A2.
The OLED driver in the kernel layer of the AP processor passes the synthesized image to the display subsystem (DSS) in the hardware layer of the AP processor, see step A3. The display subsystem (DSS) may perform secondary processing (e.g., HDR10 processing for enhancing image quality) on the synthesized image and send the processed image for display. In practical applications, the secondary processing may also be skipped. Taking the case without secondary processing as an example, the display subsystem of the AP processor hardware layer sends the synthesized image to the OLED screen for display.
Taking the start of application A as an example, the synthesized image displayed by the OLED screen is the interface synthesized from the interface to be displayed by application A and the interface corresponding to the status bar.
The OLED screen can complete image refreshing and displaying once according to the mode.
In the embodiment of the present application, before the secondarily processed image (or the synthesized image) is sent for display, the display subsystem (DSS) may store the whole frame image (or an image larger than the target area within the whole frame image, or the image corresponding to the target area within the whole frame image) in a memory in the kernel layer of the AP processor. Since this process writes back image frame data concurrently with display, this memory may be recorded as the Concurrent Write-Back (CWB) memory, see step A4.
In the embodiment of the present application, taking the example that the display subsystem stores the whole frame image in the CWB memory of the AP processor, after the display subsystem successfully stores the whole frame image in the CWB memory, it may send a signal indicating successful storage to the HWC. The whole frame image corresponding to the image stored in the CWB memory by the display subsystem may be recorded as the image to be refreshed (which may also be understood as the image after the current refresh).
The AP processor may also be configured to allow the HWC to access the CWB memory. After receiving the signal from the display subsystem indicating that the storage succeeded, the HWC may obtain the target image from the CWB memory, see step A5.
It should be noted that, regardless of whether the image of the whole frame image or the image of the partial region in the whole frame image is stored in the CWB memory, the HWC can obtain the target image from the CWB memory. The process of the HWC obtaining the target image from the CWB memory may be denoted as HWC matting from the CWB memory.
For convenience of description, the image stored by the display subsystem in the CWB memory may also be referred to as the region image. As described above, the region image may be the whole frame image, may be the target image, or may be an image whose range is between that of the target image and that of the whole frame image. The range of the target image can be understood as the size delimited by its length and width, and likewise the range of the whole frame image is the size delimited by its length and width.
As an example, the size of the whole frame image is X1 (pixels) × Y1 (pixels), the size of the target image is X2 (pixels) × Y2 (pixels), and the size of the region image is X3 (pixels) × Y3 (pixels). X3 satisfies X1 ≥ X3 ≥ X2, and Y3 satisfies Y1 ≥ Y3 ≥ Y2.
Of course, when X3= X1 and Y3= Y1, the area image is the entire frame image. When X3= X2 and Y3= Y2, the area image is the target image.
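The size constraints above can be sketched as a small check. The helper below is purely illustrative; the function name and the example dimensions are assumptions, not values from the embodiment:

```python
def region_within_bounds(region, target, frame):
    """Check X1 >= X3 >= X2 and Y1 >= Y3 >= Y2 for (width, height) tuples.

    Illustrative helper only: the region image stored in the CWB memory
    must lie between the target image and the whole frame image in size.
    """
    (x3, y3), (x2, y2), (x1, y1) = region, target, frame
    return x1 >= x3 >= x2 and y1 >= y3 >= y2

# Hypothetical sizes: whole frame 1080x2400, target (status-bar patch) 200x80.
print(region_within_bounds((400, 120), (200, 80), (1080, 2400)))  # True
print(region_within_bounds((150, 60), (200, 80), (1080, 2400)))   # False
```

When the region equals the frame or the target, the check still passes, matching the two boundary cases noted in the text.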
Continuing to take application a as an example, when application a has a brightness adjustment requirement due to switching of the interface, application a sends the brightness to be adjusted to the display engine service.
The display engine service in the AP processor sends the brightness to be adjusted to the kernel node in the kernel layer of the AP processor, so that the related hardware adjusts the brightness of the OLED screen according to the brightness to be adjusted stored in the kernel node.
In this manner, the OLED screen completes one brightness adjustment.
In the embodiment of the present application, the HWC may be further configured to obtain the brightness to be adjusted from the kernel node, and the brightness to be adjusted may also be recorded as the brightness after the current adjustment, which is specifically referred to in step A5'.
In a specific implementation, the HWC may monitor, based on a uevent mechanism, whether the data stored in the kernel node changes, and after detecting a change, obtain the currently stored data from the kernel node, that is, the brightness value to be adjusted (since the brightness value to be adjusted is used to adjust the brightness of the display screen, it may also be recorded as the brightness value of the display screen). After obtaining the target image or the brightness to be adjusted, the HWC may send it to the noise algorithm library in the hardware abstraction layer of the AP processor, see step A6. Each time the noise algorithm library obtains a target image, it calculates the fusion noise at the refresh time of that target image; each time it obtains a brightness, it calculates the fusion noise at that brightness adjustment time. The noise algorithm library stores the calculated fusion noise in its noise memory.
In practical applications, after the HWC obtains the target image, the HWC may store it and send the storage address of the target image to the noise algorithm library, and the noise algorithm library may buffer the most recent frame of the target image by recording its address. After the HWC obtains the brightness to be adjusted, the HWC may send it to the noise algorithm library, which may buffer the most recent brightness. For convenience of description, the subsequent embodiments of the present application are described in terms of the HWC sending the target image to the noise algorithm library; in practical applications, the HWC may store the target image after obtaining it and send the storage address of the target image to the noise algorithm library.
As an example, after receiving the storage address of the first frame of the target image, the noise algorithm library caches it. Each time a new storage address of the target image is received, it becomes the latest cached storage address of the target image. Correspondingly, the noise algorithm library buffers the first brightness after receiving it, and each time a new brightness is received, it becomes the latest buffered brightness. In the embodiment of the present application, the noise algorithm library caches the obtained target image and brightness value in the data repository. The target image and the brightness value stored in the data repository may be recorded as screen data; that is, the screen data stored in the data repository includes the target image and the brightness value.
In addition, in order to describe the transfer relationship of parameters such as the target image and the brightness to be adjusted, the embodiment of the present application takes the example of the HWC sending these parameters to the noise algorithm library. In practice, the relationship between the HWC and the noise algorithm library is that the HWC calls the noise algorithm library: when the HWC calls the noise algorithm library, it passes parameters such as the target image (the storage address of the target image) and the brightness to be adjusted as arguments of the calculation models in the noise algorithm library. Other parameters are not exemplified here.
Because brightness adjustment and image refreshing are two completely independent processes, the image may be refreshed at a certain time while the brightness remains unchanged; in that case, the fusion noise at that time is calculated from the target image corresponding to the refreshed image and the current brightness (the brightness value most recently stored in the noise algorithm library before the time represented by the timestamp of the target image). For convenience of description, the fusion noise at an image refresh time calculated due to an image refresh may be recorded as the image noise at that image refresh time. Similarly, if the image is not refreshed at a certain time but the brightness is adjusted, the fusion noise at that time is calculated from the adjusted brightness and the current target image (the target image most recently stored in the noise algorithm library before the time indicated by the timestamp of the brightness value). For convenience of description, the fusion noise at a brightness adjustment time calculated due to a brightness adjustment may be recorded as the backlight noise at that brightness adjustment time.
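The pairing rule described above can be sketched as follows. This is a hedged illustration, not the patent's implementation: `compute_fusion_noise` is a hypothetical stand-in for the first algorithm model, and the class and method names are assumptions.

```python
class NoiseLibrary:
    """Sketch: an image refresh is paired with the most recent brightness,
    and a brightness change is paired with the most recent target image."""

    def __init__(self):
        self.latest_image = None
        self.latest_brightness = None
        self.noise_memory = []  # list of (timestamp, kind, noise)

    def compute_fusion_noise(self, image, brightness):
        # Placeholder model only: noise grows with brightness and the
        # mean pixel level of the target image (values in 0..255).
        return brightness * (sum(image) / len(image)) / 255.0

    def on_image_refresh(self, ts, image):
        self.latest_image = image
        if self.latest_brightness is not None:
            n = self.compute_fusion_noise(image, self.latest_brightness)
            self.noise_memory.append((ts, "image", n))  # image noise

    def on_brightness_change(self, ts, brightness):
        self.latest_brightness = brightness
        if self.latest_image is not None:
            n = self.compute_fusion_noise(self.latest_image, brightness)
            self.noise_memory.append((ts, "backlight", n))  # backlight noise
```

In this sketch each stored record keeps its timestamp, matching the text's requirement that image noise carries the target image's timestamp and backlight noise carries the brightness's timestamp.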
The target image and the brightness that the HWC sends to the noise algorithm library both carry timestamps; correspondingly, the image noise and the backlight noise calculated by the noise algorithm library also carry timestamps. The timestamp of the image noise is the same as the timestamp of the target image, and the timestamp of the backlight noise is the same as the timestamp of the brightness to be adjusted. Strictly speaking, the timestamp of the image noise should be the image refresh time. In practical applications, another time node close to the image refresh time may be used instead, for example, the start time (or the end time, or any time between the two) at which the HWC performs matting to obtain the target image from the CWB memory. Strictly speaking, the timestamp of the backlight noise should be the backlight adjustment time. In practical applications, another time node close to the backlight adjustment time may also be used, for example, the start time (or the end time, or any time between the two) at which the HWC obtains the brightness to be adjusted from the kernel node. The timestamps of the image noise and the backlight noise facilitate subsequently denoising, over the corresponding time span, the initial ambient light collected by the ambient light sensor to obtain the target ambient light. When the noise algorithm library stores the image noise in the noise memory, it also stores the timestamp of the image noise; when it stores the backlight noise, it also stores the timestamp of the backlight noise.
An ambient light sensor (ALS) in the co-hardware layer of the SCP processor collects initial ambient light at a certain acquisition period after start-up (typically, the ambient light sensor is started after the electronic device is powered on). The ambient light sensor of the SCP processor transmits the initial ambient light information to the ambient light sensor driver (ALS DRV) in the co-driver layer (Hub DRV) of the SCP processor, see step E2.
In the co-driver layer (Hub DRV) of the SCP processor, the ambient light sensor driver (ALS DRV) preprocesses the initial ambient light information to obtain the raw values on the four RGBC channels. The co-driver layer of the SCP processor transmits the raw values on the four RGBC channels to the ambient light sensor application in the co-application layer of the SCP processor, see step E3.
The ambient light sensor application in the co-application layer of the SCP processor sends the raw values on the four RGBC channels and other relevant data (e.g., the start time and end time of each collection of initial ambient light by the ambient light sensor) to the HWC of the AP processor via a first inter-core communication (communication between the ambient light sensor application of the SCP processor and the HWC of the AP processor), see step E4.
After the HWC in the AP processor obtains the initial ambient light data reported by the SCP processor, the HWC in the AP processor may send the initial ambient light data to the noise algorithm library. See step A6.
As described above, the noise algorithm library calculates the image noise at each image refresh time and the backlight noise at each brightness adjustment time, and stores them in the noise memory of the noise algorithm library. After obtaining the acquisition start time and acquisition end time of the initial ambient light, the noise algorithm library may obtain, from the image noise and backlight noise stored in the noise memory, the integral noise between the acquisition start time and the acquisition end time. The noise algorithm library then deducts this integral noise from the initial ambient light to obtain the target ambient light.
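The deduction step can be written out as a minimal sketch, assuming piecewise-constant noise: the noise value set at each refresh or adjustment timestamp holds until the next event, and noise is zero before the first recorded event. The function names and units are illustrative, not the patent's second and third algorithm models.

```python
def integral_noise(events, start, end):
    """events: (timestamp, noise_level) pairs sorted by timestamp.

    Integrates the piecewise-constant noise level over [start, end].
    An event at or before `start` sets the level in effect at `start`.
    """
    total, prev_t, level = 0.0, start, 0.0
    for ts, n in events:
        if ts >= end:
            break
        if ts > start:
            total += level * (ts - prev_t)  # area under the previous level
            prev_t = ts
        level = n  # this level holds from ts onward
    total += level * (end - prev_t)
    return total

def target_ambient_light(initial, events, start, end):
    # Target ambient light = initial ambient light minus the integral
    # noise accumulated over the acquisition window.
    return initial - integral_noise(events, start, end)
```

For example, with a single noise event of level 2.0 at t = 5 inside a window [0, 10], the integral noise is 10.0 and an initial reading of 100.0 yields a target of 90.0.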
As can be understood from the above description, the noise algorithm library includes a plurality of calculation models, for example: a first algorithm model for calculating the fusion noise from the target image and the brightness; a second algorithm model for obtaining, from the fusion noise at each time, the integral noise between the acquisition start time and the acquisition end time of the initial ambient light; and a third algorithm model for obtaining the target ambient light from the initial ambient light and the integral noise. In practical applications, the noise algorithm library may further include other calculation models. For example, if, in the process of obtaining the target ambient light based on the target image, the brightness, and the initial ambient light, the raw values on the four channels of the initial ambient light are filtered, there is a model for that filtering; the embodiment of the present application does not enumerate them all.
The inputs to the noise algorithm library include: the target image and the brightness obtained by the HWC at various times, and the initial-ambient-light-related data obtained by the HWC from the SCP processor. The output of the noise algorithm library is the raw value of the target ambient light, which may be recorded as a second value. In the embodiment of the present application, the process of the HWC sending the target image, the brightness, and the initial ambient light to the noise algorithm library is denoted as step A6.
The noise algorithm library also needs to return the target ambient light to the HWC after obtaining it, which is denoted as step A7. In practical applications, the output of the noise algorithm library is the raw values on the four channels of the target ambient light.
The HWC in the AP processor sends the raw values on the four channels of the target ambient light returned by the noise algorithm library to the ambient light sensor application in the cooperative application layer of the SCP processor through first inter-core communication, see step A8.
After the ambient light sensor application in the co-application layer of the SCP processor obtains the raw values on the four channels of the target ambient light, it stores them in the ambient light memory of the co-driver layer, see step E5.
The co-driver layer of the SCP processor is provided with a calculation module that obtains the raw values on the four channels of the target ambient light from the memory, see step E6. When each integration ends, the ambient light sensor generates an integration interrupt signal and sends it to the ambient light sensor driver; the ambient light sensor driver then calls the calculation module, triggering it to obtain the raw values on the four channels of the target ambient light from the memory.
The ambient light sensor driver triggers the calculation module to obtain the raw value of the target ambient light after the current integration ends, so what is obtained at that moment is the raw value of the target ambient light of the previous integration period.
Taking the embodiment shown in fig. 8 as an example, after the integration ends at time t1, the ambient light sensor obtains the initial ambient light from time t0 to time t1. The SCP processor sends the initial ambient light from time t0 to time t1 to the AP processor, and the AP processor calculates the raw value of the target ambient light from time t0 to time t1. The AP processor sends the raw value of the target ambient light from time t0 to time t1 to the SCP processor, and the SCP processor stores it into the memory of the SCP processor.
After the integration ends at time t3, the ambient light sensor obtains the initial ambient light from time t2 to time t3, and the SCP processor sends it to the AP processor. Each time an integration of the ambient light sensor ends, an integration interrupt signal is generated; the ambient light sensor sends it to the ambient light sensor driver, the driver calls the calculation module, and the calculation module is triggered to obtain from the memory the currently stored raw value of the target ambient light from time t0 to time t1. Since this happens after time t3, the calculation module calculates, after time t3, the lux value of the target ambient light from the obtained raw value of the target ambient light from time t0 to time t1. That is, the lux value of the target ambient light calculated by the SCP processor in period T2 is the lux value of the real ambient light in period T1.
As mentioned above, the ambient light sensor in the SCP processor generates an integration interrupt signal after the integration ends (time t3) and gives it to the ambient light sensor driver, and after time t3 the initial ambient light of period T2 is sent to the AP processor; after the AP processor calculates the target ambient light, it sends the target ambient light to the SCP processor, and the SCP processor stores the target ambient light of period T2 in the memory. If the SCP processor were to calculate the lux value using the raw value of the target ambient light of period T2, then from the moment the ambient light sensor driver receives the integration interrupt signal, it would have to wait until the AP processor had delivered the target ambient light to the memory of the SCP processor before the driver could call the calculation module to obtain the raw value of the target ambient light of period T2 from the memory. This waiting time at least includes: the time for the SCP processor to send the initial ambient light to the AP processor, the time for the AP processor to calculate the target ambient light based on the initial ambient light and other related data, and the time for the AP processor to send the target ambient light to the memory in the SCP processor; this waiting time is relatively long and not fixed. Therefore, the ambient light sensor driver in the SCP processor may be configured so that, after receiving the integration interrupt signal of the second acquisition period, it calls the calculation module to fetch the raw value of the target ambient light of the previous period from the memory and calculates the lux value from that raw value.
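The one-cycle delay described above can be sketched as follows: on the interrupt that ends acquisition cycle N, the driver computes lux from the target ambient light of cycle N−1, the newest value guaranteed to already be in the memory. The class, the single-slot memory, and the `raw_to_lux` conversion (a single coefficient on the clear channel) are illustrative assumptions, not the patent's formula.

```python
from collections import deque

class AmbientLightDriver:
    """Sketch of the SCP-side driver behavior around the integration interrupt."""

    def __init__(self):
        # Holds only the most recent target-ambient-light raw value (RGBC),
        # written asynchronously by the AP -> SCP path.
        self.memory = deque(maxlen=1)

    @staticmethod
    def raw_to_lux(raw_rgbc):
        r, g, b, c = raw_rgbc
        return 0.25 * c  # illustrative coefficient only

    def store_target(self, raw_rgbc):
        self.memory.append(raw_rgbc)

    def on_integration_interrupt(self):
        # Called at the end of each integration; uses the previous period's
        # value rather than waiting for the current period's round trip.
        if not self.memory:
            return None  # first cycle: previous period's value not yet stored
        return self.raw_to_lux(self.memory[0])
```

The `None` return for the very first cycle mirrors the fact that no previous-period target ambient light exists before the AP processor has completed one round trip.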
The lux value of the target ambient light may be recorded as a third value; the third value and the second value are, respectively, the lux value and the raw value of the same target ambient light.
The calculation module in the co-driver layer of the SCP processor obtains the lux value of the target ambient light from the raw values on its four channels. The calculation module in the SCP processor then sends the calculated lux value of the target ambient light to the ambient light sensor application in the co-application layer through the interface of the co-framework layer, see steps E7 and E8.
The ambient light sensor application in the co-application layer of the SCP processor transmits the lux value of the target ambient light to the light service in the native framework layer of the AP processor through a second inter-core communication (communication of the light service from the SCP processor to the AP processor), see step E9.
The light service may send the lux value of the target ambient light to the display engine service. The display engine service may send the lux value of the target ambient light to upper layers so that applications in the application layer can determine whether to adjust the brightness. The display engine service may also send the lux value of the target ambient light to the kernel node, so that related hardware adjusts the brightness of the display screen according to the lux value stored in the kernel node.
After describing the technical architecture on which the method of obtaining the target ambient light depends, the process of obtaining the target ambient light from the target image, the brightness, and the initial ambient light collected by the ambient light sensor will be described from the perspective of the collection period of the ambient light sensor.
As can be understood from the above examples, the target image and the brightness to be adjusted are both obtained by the HWC, so there is a sequential order in which the HWC obtains them. After the HWC obtains the target image or the brightness to be adjusted, it sends it to the noise algorithm library, so the sending also has a sequential order; correspondingly, the times at which the noise algorithm library receives the target image and the brightness to be adjusted also have a sequential order. However, even so, the timestamps of the target image and the brightness to be adjusted may be the same, because the HWC may obtain both within the same unit of time measurement. As an example, within the same millisecond (the 5th millisecond), the HWC first obtains the brightness to be adjusted and then obtains the target image. Although there is a sequential order in the HWC's execution, the timestamps of the target image and the brightness to be adjusted are both the 5th millisecond.
Referring to fig. 8, the ambient light sensor collects ambient light with period T: from t0 to t2 (acquisition period T1), from t2 to t4 (acquisition period T2), and from t4 to t6 (acquisition period T3) are each one acquisition period. Within acquisition period T1, the real collection time of the ambient light sensor is from t0 to t1; from t1 to t2, the ambient light sensor may be in a sleep state. The embodiment of the present application is described taking the case where the acquisition period of the ambient light is fixed (i.e., the values of T1, T2, and T3 are the same) and the duration of the integration period is fixed.
As an example, 350 ms (t2 − t0) may be taken as one acquisition period. If the actual collection time of the ambient light sensor in one acquisition period is 50 ms (t1 − t0), then the ambient light sensor is in the sleep state for 300 ms (t2 − t1) of each acquisition period. The values 350 ms, 50 ms, and 300 ms above are only examples and are not limiting.
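The timing in the example above can be written out as simple arithmetic; the figures are the text's illustrative values, not fixed requirements. The duty cycle (integration time over acquisition period) is what the slow sampling mode reduces by lengthening the period while keeping the integration duration the same.

```python
ACQ_PERIOD_MS = 350    # one acquisition period, t2 - t0 (example value)
INTEGRATION_MS = 50    # actual collection time, t1 - t0 (example value)

sleep_ms = ACQ_PERIOD_MS - INTEGRATION_MS      # time asleep per period
duty_cycle = INTEGRATION_MS / ACQ_PERIOD_MS    # fraction of time integrating

print(sleep_ms)              # 300
print(round(duty_cycle, 3))  # 0.143
```

Lengthening `ACQ_PERIOD_MS` with `INTEGRATION_MS` unchanged lowers the duty cycle, which is why the slow sampling mode consumes less power than the normal mode.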
For ease of description, the time period in which the ambient light sensor actually collects (e.g., t0 to t1) may be recorded as the integration period, and the time period in which the ambient light sensor does not collect (e.g., t1 to t2) as the non-integration period.
The image displayed on the display screen of the electronic device is refreshed at a certain frequency. Taking 60 Hz as an example, the display screen of the electronic device is refreshed 60 times per second, i.e., the image is refreshed every 16.7 ms. When the display screen of the electronic device displays images, image refreshes occur during the acquisition period of the ambient light sensor. When the image displayed on the display screen is refreshed, the AP processor performs steps A1 to A6 (sending the target image) in the technical architecture shown in fig. 7. The HWC in the AP processor controls CWB write-back continuously from time t0, i.e., as long as image refreshes occur, the write-back is repeated.
Note that the embodiment takes a refresh rate of 60 Hz as an example; in practice, the refresh rate may be 120 Hz or another refresh rate. In the embodiment of the present application, steps A1 to A6 (sending the target image) are repeated every time a frame is refreshed; in practical applications, steps A1 to A6 (sending the target image) may also be repeated every other frame (or every two frames, etc.).
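The frame-interval figures above follow directly from the refresh rate; the small helper below is only illustrative arithmetic:

```python
def frame_interval_ms(refresh_hz):
    """Interval between frame refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

print(round(frame_interval_ms(60), 1))    # 16.7
print(round(frame_interval_ms(120), 1))   # 8.3
```

Running steps A1 to A6 every other frame halves the matting rate relative to running them every frame, at the cost of a slightly staler target image.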
The brightness adjustment has no fixed periodicity, so a brightness adjustment may also occur during the acquisition period of the ambient light sensor. When the brightness is adjusted, the HWC also performs steps A5' to A6 (sending the brightness to be adjusted) in the technical architecture shown in fig. 7.
After each integration of the ambient light sensor ends (i.e., after t1, after t3, after t5, and so on), the SCP processor reports the initial ambient light data collected in the current integration (e.g., the raw values on the four channels of the initial ambient light and the integration start time and integration end time of the current integration) to the HWC of the AP processor. The HWC of the AP processor sends the initial-ambient-light-related data to the noise algorithm library, and the target ambient light is calculated by the noise algorithm library.
Referring to fig. 9, taking one acquisition period as an example, times t01 (the same as time t0), t03, t04, and t11 are all image refresh times, and times t02 and t12 are brightness adjustment times. Thus, the AP processor can calculate in real time the image noise at time t01, the backlight noise at time t02, the image noise at time t03, the image noise at time t04, the image noise at time t11, and the backlight noise at time t12. At the end of the current integration (time t1), the noise memory of the AP processor stores: the image noise at time t01, the backlight noise at time t02, the image noise at time t03, and the image noise at time t04.
At the end of the current integration (time t1), the ambient light sensor obtains the initial ambient light of the current integration and the current integration time period. The SCP processor reports the data of the initial ambient light to the AP processor, and the noise algorithm library in the AP processor obtains, from the noise memory according to the start time and end time of the current integration period, the image noise at time t01, the backlight noise at time t02, the image noise at time t03, and the image noise at time t04. The noise algorithm library then calculates the target ambient light from the initial ambient light collected in the integration period and the image noise and backlight noise affecting that integration period.
During the non-integration period (t1 to t2), since the HWC always controls CWB write-back, the HWC also performs matting on the image refreshed at time t11 to obtain a target image, and the noise algorithm library also calculates the image noise at time t11. The brightness changes at time t12 of the non-integration period, and the noise algorithm library also calculates the backlight noise at time t12. However, when calculating the target ambient light, the required fusion noise is the fusion noise that interferes with the initial ambient light obtained in the current integration period; therefore, the target ambient light of the current integration period can be obtained without the image noise at time t11 and the backlight noise at time t12. In practical applications, after the noise algorithm library calculates the image noise at time t11 and the backlight noise at time t12, it still needs to store them in the noise memory.
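The selection step described above can be sketched as a simple timestamp filter. This is an illustrative fragment, not the patent's second algorithm model: only noise records whose timestamps fall inside the integration window are used when denoising that window, while records from the non-integration period (such as those at t11 and t12) are stored but not selected.

```python
def select_window_noise(noise_memory, start, end):
    """noise_memory: iterable of (timestamp, kind, noise) records.

    Returns only the records whose timestamps fall within [start, end],
    i.e. the noise events that affect the current integration period.
    """
    return [rec for rec in noise_memory if start <= rec[0] <= end]

# Illustrative records; the event at timestamp 7 falls in the sleep period.
mem = [(1, "image", 0.3), (2, "backlight", 0.1),
       (3, "image", 0.2), (7, "image", 0.4)]
print(select_window_noise(mem, 0, 5))
```

The record at timestamp 7 stays in the memory, so it remains available if a later integration window covers it.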
The above example describes the process of acquiring the target ambient light from the perspective of the technical architecture based on fig. 7 and from the perspective of the acquisition period of the ambient light sensor based on fig. 9, respectively. The timing diagram for acquiring the target ambient light provided by the embodiment shown in fig. 10 will be described below with reference to the technical architecture shown in fig. 7 and one acquisition cycle of the ambient light sensor shown in fig. 9.
As can be understood from the above description, the process in which image refresh triggers the AP processor to calculate image noise, the process in which brightness adjustment triggers the AP processor to calculate backlight noise, and the process in which the SCP processor controls the underlying hardware ambient light sensor to collect the initial ambient light are performed independently, with no fixed chronological order among them. The noise algorithm library of the AP processor processes the target image, the brightness, and the initial ambient light obtained in these three independent processes to obtain the target ambient light.
The same step numbers in the embodiment shown in fig. 10 and in the technical architecture shown in fig. 7 indicate the same steps. To avoid repetition, the contents detailed in the embodiment shown in fig. 7 are only briefly described in the embodiment shown in fig. 10.
In connection with FIG. 9, the image is refreshed starting at t0. At the same time, the ambient light sensor enters an integration period and begins collecting initial ambient light.
Accordingly, in FIG. 10, step E1: the ambient light sensor in the co-processor hardware layer of the SCP processor enters an integration period and begins collecting initial ambient light at t0 (t01).
Step A1: the image is refreshed at t0 (t01), and the SurfaceFlinger in the native framework layer of the AP processor sends the display parameters of each interface to the HWC in the hardware abstraction layer of the AP processor. The HWC may send the received display parameters of each layer interface to the hardware underlying the HWC, which synthesizes the image from the layer interfaces according to their display parameters. The hardware underlying the HWC returns the synthesized image to the HWC.
In step A2, the HWC in the hardware abstraction layer of the AP processor sends the synthesized image to the OLED driver in the kernel layer of the AP processor.
Step A3: the OLED driver in the kernel layer of the AP processor sends the synthesized image to the display subsystem in the hardware layer of the AP processor.
Step A4: the display subsystem in the hardware layer of the AP processor stores the pre-display image in the CWB memory in the kernel layer of the AP processor.
In the embodiment of the present application, the HWC waits for a successful store signal from the display subsystem after sending the synthesized image to the OLED driver.
Before sending the image for display, the display subsystem sends the HWC a signal that the image has been successfully stored in the CWB memory. After receiving this signal, the HWC performs a matting operation on the pre-display image stored in the CWB memory in the kernel layer to obtain the target image.
Step A5: the HWC in the hardware abstraction layer of the AP processor performs matting on the pre-display image stored in the CWB memory in the kernel layer to obtain the target image.
Step A6: after obtaining the target image, the HWC in the hardware abstraction layer of the AP processor sends it to the noise algorithm library in the same layer; after receiving the target image, the noise algorithm library calculates the image noise at t01 from the target image and the cached current brightness information. Throughout the execution of steps A1 to A6, the ambient light sensor in the co-processor hardware layer of the SCP processor remains in the integration process within one acquisition period.
In conjunction with FIG. 9, at t02 the ambient light sensor is still in the integration period, collecting the initial ambient light. At t02, the brightness of the display screen changes, triggering the execution of step B1 in fig. 10.
In FIG. 10, step B1 (step A5' in the architecture shown in FIG. 7): the HWC of the hardware abstraction layer of the AP processor obtains the brightness information at t02 from a kernel node in the kernel layer of the AP processor.
Step B2 (step A6): the HWC of the hardware abstraction layer of the AP processor sends the brightness information at t02 to the noise algorithm library, which calculates the backlight noise at t02 from the brightness information at t02 and the cached currently displayed target image.
During the execution of steps B1 to B2, the ambient light sensor in the co-processor hardware layer of the SCP processor remains in the integration process within one acquisition period.
After step B2, the noise memory of the noise algorithm library stores the image noise at t01 and the backlight noise at t02.
In conjunction with FIG. 9, at t03 the ambient light sensor is still in the integration period, collecting the initial ambient light. At t03, the image is refreshed once.
In fig. 10, since the image is refreshed, the steps C1 to C6 are continuously performed, and the steps C1 to C6 can refer to the descriptions in A1 to A6, and are not described again here.
During the execution of steps C1 to C6, the ambient light sensor in the co-processor hardware layer of the SCP processor is still in the integration process within one acquisition period.
After step C6, the noise memory of the noise algorithm library stores the image noise at t01, the backlight noise at t02, and the image noise at t03.
Referring to FIG. 9, at t04 the ambient light sensor is still in the integration period, collecting the initial ambient light. At t04, the image is refreshed once.
In fig. 10, since the image is refreshed, the steps D1 to D6 are continuously performed, and the steps D1 to D6 can refer to the descriptions in A1 to A6, which are not repeated herein.
During the execution of steps D1 to D6, the ambient light sensor in the co-processor hardware layer of the SCP processor is still in the integration process within one acquisition period.
After step D6, the noise memory of the noise algorithm library stores the image noise at t01, the backlight noise at t02, and the image noise at t03 and t04.
In conjunction with FIG. 9, at t1 the current integration of the ambient light sensor ends. At the end of the integration (t1), the ambient light sensor obtains the initial ambient light; in fig. 10, the SCP processor executes steps E2, E3 and E4, sending the data related to the initial ambient light (the raw values of the four RGBC channels, the integration start time, and the integration end time) to the HWC of the hardware abstraction layer of the AP processor.
In conjunction with FIG. 9, during the non-integration period the image may also be refreshed (e.g., the image refresh at t11) and the brightness may also change (e.g., the brightness change at t12). Therefore, during the non-integration period, steps F1 to F6 still exist in fig. 10 (steps F1 to F5 are omitted in fig. 10; see steps A1 to A5), so that the image noise at t11 is stored in the noise memory of the noise algorithm library. Steps G1 to G2 also still exist (step G1 is omitted in fig. 10; see step B1), so that the backlight noise at t12 is stored in the noise memory of the noise algorithm library.
In step A6', the HWC in the hardware abstraction layer of the AP processor sends the initial ambient light data to the noise algorithm library. And the noise algorithm library calculates and obtains the target ambient light according to the data of the initial ambient light and the image noise and the backlight noise which interfere with the initial ambient light.
As can be understood from fig. 10, the integration start time and end time of the ambient light sensor are controlled by the sensor's own clock; the AP processor's calculation of image noise is driven by image refresh; and its calculation of backlight noise is driven by backlight adjustment. Therefore, execution of step A1 (or steps C1, D1, F1) is triggered by an image refresh, and execution of step B1 (or step G1) is triggered by a brightness adjustment. The integration start and end times follow the preset acquisition period and integration duration exactly, so execution of step E2 is triggered by the event that the ambient light sensor's integration has ended.
From the perspective of triggering events, these three processes are completely independent. However, the results they produce (image noise, backlight noise, and initial ambient light) are correlated by the denoising process after the integration period of the ambient light sensor ends. The initial ambient light fused in the denoising process is the initial ambient light collected by the ambient light sensor in the current collection period, and the image noise and backlight noise removed in the denoising process are those that interfere with the initial ambient light collected in the current collection period.
By analyzing the structure of under-screen ambient light, the embodiment of the application finds that the factors disturbing the ambient light collected by the ambient light sensor include the display content of the display area directly above the photosensitive area of the ambient light sensor and of the display area directly above a certain region around that photosensitive area. The display content comprises two parts: the RGB pixel information and the brightness information of the displayed image. Therefore, the noise calculation library in the embodiment of the present application fuses the RGB pixel information of the target image with the brightness information to obtain the fusion noise, and then obtains from the fusion noise the integral noise over the integration period of the initial ambient light. The target ambient light is obtained by removing, from the initial ambient light obtained during the integration period of the ambient light sensor, the integral noise that interferes with it. Because the interfering part is removed, accurate target ambient light can be obtained, and the method has strong universality.
In addition, since the AP processor of the electronic device can obtain the target image and the brightness information, the AP processor accordingly computes the image noise and the backlight noise, while the SCP processor obtains the initial ambient light. The SCP processor therefore sends the initial ambient light to the AP processor, where the initial ambient light and the fusion noise are processed to obtain the target ambient light. This avoids the AP processor having to send the target image (or image noise) and brightness information (or backlight noise) to the SCP processor frequently, which would make inter-core communication too frequent and consume more power.
Furthermore, the DSS in the AP processor may store the pre-display image (the image to be displayed in the current refresh of the display screen) in the CWB memory. The HWC in the AP processor extracts the target image from the pre-display image stored in the CWB memory in order to calculate the fusion noise; the fusion noise obtained this way is accurate and the method has low power consumption.
It should be noted that, when the display screen is displaying an image, the brightness of the display screen needs to be adjusted according to the target ambient light; when the display screen is not displaying any image, no such adjustment is needed. Therefore, the AP processor also needs to monitor screen-on and screen-off events of the display screen. When the screen is on, the method for detecting the target ambient light provided by the embodiment of the application is executed. When the screen is off, the AP processor may skip steps A4 to A6. Similarly, when the screen is off, the SCP processor may control the ambient light sensor to stop collecting the initial ambient light, and the SCP processor may skip steps E2 to E5.
To provide a clearer understanding of the execution inside the AP processor, a timing diagram between the modules inside the AP processor is described below, taking as an example the acquisition of the image noise at t01 and the backlight noise at t02 in the embodiment shown in fig. 10.
In the embodiment shown in fig. 11, when refreshing the image, the respective modules in the AP processor perform the following steps:
step 1100, after the display engine service obtains the display parameters of the interface to be displayed from the application in the application layer, the display engine service sends the display parameters of the interface to be displayed to the surface flicker.
In step 1101, after the SurfaceFlinger obtains the display parameters of the interface to be displayed of application A from the display engine service, it sends the display parameters (e.g., memory address, color, etc.) of each interface (the interface to be displayed of application A, the status bar interface, etc.) to the HWC through interfaces such as setLayerBuffer and setLayerColor.
Step 1102: after the HWC receives the display parameters of each interface, it obtains the synthesized image through the hardware underlying the HWC according to the display parameters of the interfaces to be displayed.
Step 1103: after the HWC obtains the image synthesized by the underlying hardware, it sends the synthesized image to the OLED driver.
And 1104, after receiving the synthesized image sent by the HWC, the OLED driver sends the synthesized image to the display subsystem.
Step 1105: after the display subsystem receives the synthesized image, it performs secondary processing on the synthesized image to obtain the pre-display image.
At step 1106, the display subsystem stores the pre-display image in the CWB memory.
It should be noted that, since the OLED screen needs to refresh the image, the display subsystem needs to send the pre-display image to the display screen for display.
In the embodiment of the application, the step in which the display subsystem sends the pre-display image to the display screen and the step in which the display subsystem stores the pre-display image in the CWB memory are two independent steps with no strict order between them.
In step 1107, after the display subsystem successfully stores the pre-display image in the CWB memory, it may send a signal to the HWC that the storage was successful.
In step 1108, after receiving the signal that the storage is successful, the HWC performs matting on the pre-display image stored in the CWB memory to obtain the target image; the time at which the HWC starts to obtain the target image serves as the timestamp of the target image.
In step 1109, after acquiring the target image and its timestamp, the HWC sends them to the noise algorithm library.
Step 1110: the noise algorithm library calculates the image noise at the refresh time corresponding to the target image (the image noise at t01). The timestamp of the image noise is the timestamp of the target image from which it was obtained. The noise algorithm library stores the image noise and its timestamp.
During brightness adjustment, each submodule in the AP processor executes the following steps:
Step 1111: after the display engine service obtains the brightness to be adjusted from application A in the application layer, the display engine service sends the brightness to be adjusted to the kernel node.
In step 1112, after monitoring that the data in the kernel node has changed, the HWC acquires the brightness to be adjusted from the kernel node. The time at which the HWC retrieves the brightness to be adjusted from the kernel node is the timestamp of the brightness to be adjusted.
In practical applications, the HWC always listens to the kernel node for data changes.
Step 1113: the HWC sends the brightness to be adjusted and its timestamp to the noise algorithm library.
Step 1114: the noise algorithm library calculates the backlight noise at the adjustment time of the brightness to be adjusted (the backlight noise at t02). The timestamp of the backlight noise is the timestamp of the corresponding brightness to be adjusted. The noise algorithm library stores the backlight noise and its timestamp.
After the end of an integration period, the SCP processor sends the initial ambient light collected during the integration period to the HWC in the AP processor.
In step 1115, the HWC of the AP processor receives the initial ambient light sent by the SCP processor, together with the integration start time and integration end time of the initial ambient light.
In step 1116, after receiving the initial ambient light and its integration start and end times from the SCP processor, the HWC sends the initial ambient light and its integration start and end times to the noise algorithm library.
In step 1117, the noise algorithm library calculates the integral noise from the image noise and its timestamps, the backlight noise and its timestamps, and the integration start and end times of the initial ambient light. The noise algorithm library then calculates the target ambient light from the integral noise and the initial ambient light.
The embodiment of the application mainly describes a sequential logic diagram among modules when the AP processor obtains target ambient light.
The process of obtaining the target ambient light from the target image, the brightness, and the initial ambient light calculation by the noise algorithm library in the embodiment shown in fig. 7 will be described below.
Step one: when the noise calculation library obtains a target image, it calculates the image noise at the refresh time of the target image from the target image and the display-screen brightness at that refresh time; when the noise calculation library obtains a brightness value, it calculates the backlight noise at the brightness adjustment time from that brightness value and the target image displayed at that time.
Although image noise and backlight noise bear different names, both are calculated from one frame of target image and one brightness value.
First, a weighted sum operation is performed on the RGB value of each pixel with the weighting coefficient of each pixel to obtain the weighted RGB value of the target image. The weighting coefficient of each pixel is determined by the distance between the pixel's coordinates and the reference coordinates of the target image; the coordinates of the center point of the photosensitive area of the ambient light sensor may be used as the reference coordinates of the target image.
Step two: the noise calculation library obtains the fusion noise from the weighted RGB value and the brightness of the target image. The fusion noise may be obtained by a table lookup (the table maps the weighted RGB value of the target image and the brightness to a fusion noise) or by a preset functional relationship (the independent variables are the weighted RGB value of the target image and the brightness, and the dependent variable is the fusion noise). The fusion noise obtained here is a raw value on the four channels.
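As an illustration of the two steps above, the weighted RGB value and the fusion noise might be computed as in the following Python sketch. The distance-based weighting coefficient and the functional mapping from (weighted RGB, brightness) to four-channel noise are hypothetical placeholders; a real device would use calibrated coefficients or a calibrated lookup table.

```python
import math

def weighted_rgb(pixels, ref_xy):
    """Weighted sum of per-pixel RGB values: pixels farther from the
    reference point (the center of the sensor's photosensitive area)
    contribute less. The 1/(1+d) weighting is purely illustrative."""
    total = [0.0, 0.0, 0.0]
    norm = 0.0
    for (x, y), (r, g, b) in pixels.items():
        d = math.hypot(x - ref_xy[0], y - ref_xy[1])
        w = 1.0 / (1.0 + d)          # hypothetical distance-based coefficient
        total[0] += w * r
        total[1] += w * g
        total[2] += w * b
        norm += w
    return [c / norm for c in total]

def fusion_noise(weighted, brightness):
    """Hypothetical functional mapping (weighted RGB, brightness) ->
    noise raw values on the four RGBC channels; a real device would use
    a calibrated lookup table instead."""
    r, g, b = weighted
    scale = brightness / 255.0
    c = (r + g + b) / 3.0            # clear channel stand-in
    return [r * scale, g * scale, b * scale, c * scale]
```

The same `fusion_noise` value serves as both "image noise" (when triggered by an image refresh) and "backlight noise" (when triggered by a brightness change), matching the observation above that both are computed from one frame of target image and one brightness value.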
Step three: the noise calculation library calculates the integral noise within the integration period of the initial ambient light from the fusion noise at each time.
It should be noted that image noise is not generated by the image refresh process itself. Within the integration period, during the time before an image refresh, the interference with the initial ambient light is the image noise corresponding to the image before the refresh; during the time after the refresh, it is the image noise corresponding to the refreshed image.
Similarly, backlight noise is not generated by the brightness adjustment process itself. Within the integration period, during the time before a brightness adjustment, the interference with the initial ambient light is the backlight noise corresponding to the brightness before the adjustment; during the time after the adjustment, it is the backlight noise corresponding to the adjusted brightness.
As described above, the noise memory stores the image noise and the backlight noise at each time point calculated by the noise algorithm library. The noise stored in the noise memory is collectively referred to as fusion noise.
Step A1: through the noise algorithm library, the AP processor takes fusion noise from the exit position of the noise memory and updates the exit position of the noise memory (or the fusion noise at the exit position);
Step B1: if the timestamp of the currently taken fusion noise is at or before the first time, the AP processor continues executing step A1 through the noise algorithm library until the timestamp of the currently taken fusion noise is after the first time;
Step B2: if the timestamp of the currently taken fusion noise is after the first time, the AP processor executes the following steps through the noise algorithm library:
Step C1: if the timestamp of the currently taken fusion noise is after the first time for the first time and before the second time, the integral noise between the first time and the time corresponding to that timestamp is calculated from the previously taken fusion noise, and execution continues from step A1;
Step C2: if the timestamp of the currently taken fusion noise is after the first time for the first time and at or after the second time, the integral noise between the first time and the second time is calculated from the previously taken fusion noise, and execution continues with step D1;
Step C3: if the timestamp of the currently taken fusion noise is not after the first time for the first time and is before the second time, the integral noise between the time corresponding to the timestamp of the previously taken fusion noise and the time corresponding to the current timestamp is calculated from the previously taken fusion noise, and execution continues from step A1;
Step C4: if the timestamp of the currently taken fusion noise is not after the first time for the first time and is at or after the second time, the integral noise between the time corresponding to the timestamp of the previously taken fusion noise and the second time is calculated from the previously taken fusion noise, and execution continues with step D1;
Step D1: the target ambient light is obtained from the integral noise between the first time and the second time and the initial ambient light.
The first time is the starting time of one integration time period, and the second time is the ending time of the same integration time period.
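Assuming that each fusion-noise entry is a scalar standing in for the RGBC four-channel raw values, and that entries leave the FIFO in timestamp order, the stepping logic of steps A1 to D1 above might be sketched as follows (variable names are illustrative):

```python
def integral_noise(fifo, t_first, t_second):
    """Sketch of steps A1-D1. `fifo` is a list of (timestamp, fusion_noise)
    entries in storage order; t_first/t_second are the start and end of
    the integration period."""
    total = 0.0
    seg_start = t_first
    prev_noise = None                 # fusion noise in effect at seg_start
    while fifo:
        ts, noise = fifo.pop(0)       # step A1: take from the exit position
        if ts <= t_first:             # step B1: at or before the first time
            prev_noise = noise
            continue
        if prev_noise is not None:    # steps C1-C4: close the open sub-period
            seg_end = min(ts, t_second)
            total += (seg_end - seg_start) / (t_second - t_first) * prev_noise
        if ts >= t_second:            # steps C2/C4: window fully covered
            return total              # step D1 then removes this from the light
        seg_start = ts
        prev_noise = noise
    # FIFO exhausted before t_second: the last noise persists to the window end
    if prev_noise is not None:
        total += (t_second - seg_start) / (t_second - t_first) * prev_noise
    return total
```

Each sub-period contributes its fusion noise weighted by the fraction of the integration window it occupies, matching the per-sub-period formulas given further below.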
The noise memory may be a FIFO (First In First Out) memory. A FIFO memory is a first-in first-out dual-port buffer: one of its two ports is the input port of the memory and the other is the output port. In this memory structure, the data that enters the memory first is shifted out first; accordingly, the order of the shifted-out data matches the order of the fed-in data. The exit position of the FIFO memory is the memory address corresponding to its output port.
When the FIFO memory shifts out a datum, the process is as follows: the fusion noise stored at the exit position (the first position) is removed from the exit position, then the data in the second position is moved to the exit position, the data in the third position is moved to the second position, and so on in sequence.
Of course, in practical applications, after the fusion noise stored in the first position (A1) is removed from the exit position (the first position, A1), the exit position of the memory may instead be updated to the second position (A2). After the fusion noise stored at the current exit position (A2) is removed, the exit position of the memory is updated to the third position (A3), and so on in sequence.
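The two FIFO behaviors described above (shifting the remaining data toward the exit versus advancing the exit pointer) are equivalent from the consumer's point of view. A minimal Python model of the noise memory:

```python
from collections import deque

# Minimal model of the noise memory as a first-in first-out buffer:
# entries leave from the exit position in the order they were stored.
noise_memory = deque()
noise_memory.append(("t01", "image noise"))      # stored first
noise_memory.append(("t02", "backlight noise"))  # stored second
first_out = noise_memory.popleft()               # taken from the exit position
```

Whether the underlying implementation physically shifts entries or only advances the exit pointer, `popleft` models the observable behavior: removal order always matches insertion order.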
The calculation process for obtaining the target ambient light described above may refer to the embodiments shown in fig. 12 to fig. 14.
Referring to fig. 12, fig. 12 illustrates the process by which the noise calculation library in the AP processor calculates the integral noise from the image noise and the backlight noise, according to an embodiment of the present application. The times in this process correspond to the descriptions of the same times in the embodiments shown in fig. 9 and fig. 10: the image is refreshed at t01, yielding the image noise at t01; the brightness is adjusted at t02, yielding the backlight noise at t02; the image is refreshed at t03, yielding the image noise at t03; and the image is refreshed at t04, yielding the image noise at t04.
From t01 to t02, the displayed image is the image refreshed at t01, and the display brightness is the brightness at t01 (the brightness value most recently stored in the noise algorithm library before t01). The image noise at t01 is the noise of the image refreshed at t01 displayed at the brightness of t01. Thus, the initial ambient light includes image noise with duration "t02 − t01" and timestamp t01.
From t02 to t03, the display brightness is the brightness adjusted at t02, and the displayed image is the image refreshed at t01. The backlight noise at t02 is the noise of the image refreshed at t01 displayed at the brightness adjusted at t02. Thus, the initial ambient light includes backlight noise with duration "t03 − t02" and timestamp t02.
From t03 to t04, the displayed image is the image refreshed at t03, and the display brightness is the brightness adjusted at t02. The image noise at t03 is the noise of the image refreshed at t03 displayed at the brightness adjusted at t02. Thus, the initial ambient light includes image noise with duration "t04 − t03" and timestamp t03.
From t04 to t1, the displayed image is the image refreshed at t04, and the display brightness is the brightness adjusted at t02. The image noise at t04 is the noise of the image refreshed at t04 displayed at the brightness adjusted at t02. Thus, the initial ambient light includes image noise with duration "t1 − t04" and timestamp t04.
Based on the above understanding, when calculating the integral noise, the AP processor proceeds as follows:
the image noise at t01 interferes with the initial ambient light from t01 to t02;
the backlight noise at t02 interferes with the initial ambient light from t02 to t03;
the image noise at t03 interferes with the initial ambient light from t03 to t04;
the image noise at t04 interferes with the initial ambient light from t04 to t1.
Thus, the integral noise from t01 to t02, from t02 to t03, from t03 to t04, and from t04 to t1 can each be calculated separately.
The integral noise from t01 to t02 is: (t02 − t01)/(t1 − t0) × N_t01.
The integral noise from t02 to t03 is: (t03 − t02)/(t1 − t0) × N_t02.
The integral noise from t03 to t04 is: (t04 − t03)/(t1 − t0) × N_t03.
The integral noise from t04 to t1 is: (t1 − t04)/(t1 − t0) × N_t04.
Here N_t01 denotes the fusion noise at t01, N_t02 the fusion noise at t02, N_t03 the fusion noise at t03, and N_t04 the fusion noise at t04.
The integral noise of the sub-periods within the integration period (t01 to t02, t02 to t03, t03 to t04, and t04 to t1) together constitutes the integral noise of the whole integration period.
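With illustrative numbers (a 100 ms integration from t0 = 0 to t1 = 100, change times t01 = 0, t02 = 30, t03 = 60, t04 = 80, and scalar fusion-noise values standing in for the four-channel raw values), the four formulas combine as:

```python
# Integration window t0..t1; each sub-period contributes its fusion noise
# weighted by the fraction of the window it occupies. All values illustrative.
t0, t1 = 0.0, 100.0
# (sub-period start, sub-period end, fusion noise effective in it)
segments = [
    (0.0, 30.0, 8.0),     # N_t01, from t01 to t02
    (30.0, 60.0, 12.0),   # N_t02, from t02 to t03
    (60.0, 80.0, 6.0),    # N_t03, from t03 to t04
    (80.0, 100.0, 10.0),  # N_t04, from t04 to t1
]
integral = sum((end - start) / (t1 - t0) * n for start, end, n in segments)
# 0.3*8 + 0.3*12 + 0.2*6 + 0.2*10 = 9.2
```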
In the above example, the start time of the integration period happens to be an image refresh time, i.e., the image noise at the start time of the integration period can be obtained directly.
In practical applications, it is possible that the start time of the integration period is not the time of image refresh nor the time of backlight adjustment. In this case, it is necessary to acquire the fusion noise corresponding to the change time (image refresh time or backlight adjustment time) that is the latest before the start of the current integration period.
Referring to fig. 13, for an integration period (t0 to t1) obtained by the noise calculation library in the AP processor provided in an embodiment of the present application, t01 is not the start time of the current integration period but one of the image refresh times within it. The latest change time (image refresh time or brightness adjustment time) before the start of the current integration period is t-1, which is an image refresh time.
Referring to fig. 14, if the latest change time before the start of the current integration period is an image refresh time, the image noise corresponding to that image refresh time interferes with the initial ambient light from t0 to t01.
Of course, if the latest change time is the brightness adjustment time, the backlight noise corresponding to the brightness adjustment time will be t 0 To t 01 The initial ambient light at the moment causes interference.
In the embodiment shown in fig. 14, the integration noise corresponding to each sub-period in the integration period is:
For the sub-period from t0 to t01, the integration noise is: (t01 - t0)/(t1 - t0) × N_t-1.
For the sub-period from t01 to t02, the integration noise is: (t02 - t01)/(t1 - t0) × N_t01.
For the sub-period from t02 to t03, the integration noise is: (t03 - t02)/(t1 - t0) × N_t02.
For the sub-period from t03 to t04, the integration noise is: (t04 - t03)/(t1 - t0) × N_t03.
For the sub-period from t04 to t1, the integration noise is: (t1 - t04)/(t1 - t0) × N_t04.
Here N_t-1 denotes the fusion noise at time t-1, N_t01 the fusion noise at time t01, N_t02 the fusion noise at time t02, N_t03 the fusion noise at time t03, and N_t04 the fusion noise at time t04.
As can be understood from the above example, the obtained integration noise is likewise expressed as raw values on the four channels.
The timestamps in the above examples are all different. In practical applications, however, the HWC may perform both the acquisition of the target image and the acquisition of the brightness to be adjusted within one time measurement unit (e.g., within 1 ms). In that case, the timestamps of the target image and of the brightness to be adjusted are the same.
If a target image and a brightness value have the same timestamp and the noise algorithm library receives the target image first, the noise algorithm library calculates the image noise from the target image and the latest brightness value before it; when later calculating the backlight noise corresponding to that brightness value, it calculates the backlight noise from the brightness value and the target image with the same timestamp.
Conversely, if the noise algorithm library receives the brightness value first, it calculates the backlight noise from the brightness value and the latest target image before it; when later calculating the image noise corresponding to that target image, it calculates the image noise from the target image and the brightness value with the same timestamp.
Suppose the noise algorithm library receives the target image first; it then calculates the image noise and stores it to the noise memory first. The fusion noise stored in the noise memory is kept in time order: before storing, it is judged whether the timestamp of the fusion noise to be stored is later than that of the last stored fusion noise. If it is later, the fusion noise is stored; if it is earlier or the same, the noise to be stored is discarded. Therefore, the backlight noise calculated later (with the same timestamp) is discarded.
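The time-order check can be sketched as follows (an illustrative sketch; `NoiseMemory` and its methods are hypothetical names, not the patent's actual noise-memory interface):

```python
class NoiseMemory:
    """Stores fusion noise entries in strict timestamp order: an entry
    whose timestamp is not later than that of the last stored entry is
    discarded."""

    def __init__(self):
        self.entries = []  # list of (timestamp, noise) pairs, in time order

    def store(self, timestamp, noise):
        # Discard if the timestamp is earlier than or equal to the
        # timestamp of the last stored fusion noise.
        if self.entries and timestamp <= self.entries[-1][0]:
            return False
        self.entries.append((timestamp, noise))
        return True
```

For example, if an image noise and a backlight noise share a timestamp and the image noise arrives first, storing the image noise succeeds and the later backlight noise with the same timestamp is discarded.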
In practice, the timestamp of the target image may be the time at which the HWC starts to fetch the target image from the CWB write-back memory, and the timestamp of the brightness value may be the time at which the HWC starts to retrieve the brightness value from the kernel node. The HWC may switch to obtaining the brightness value while it is still in the process of obtaining the target image; in that case the HWC starts capturing the target image first and then captures the brightness value, so the timestamp of the brightness value is later than that of the target image. The HWC may then obtain the brightness value and send it to the noise algorithm library, which calculates and stores the backlight noise, and only afterwards obtain the target image and send it to the noise algorithm library, which calculates the image noise. As a result, the timestamp of the image noise currently to be stored is earlier than the timestamp of the last stored backlight noise.
Step four: the noise algorithm library removes the integration noise of the whole integration period from the initial ambient light to obtain the target ambient light.
In the embodiment of the present application, the initial ambient light sent by the SCP processor to the HWC of the AP processor is initial ambient light data in the form of RGBC four-channel raw values. The HWC sends the initial ambient light data, also in the form of RGBC four-channel raw values, to the noise algorithm library. The raw values of the integration noise on the four channels are obtained in step three. Therefore, in this step, the raw values of the integration noise can be removed, channel by channel, from the raw values of the initial ambient light to obtain the raw values of the target ambient light on the four channels.
After calculating the raw values on the four channels of the target ambient light, the noise algorithm library can send them to the SCP processor, and the SCP processor calculates the lux value of the target ambient light from the raw values on the four channels.
As an example, the lux value may be obtained as a weighted sum of the raw values of the respective channels, each multiplied by the coefficient of its channel (the coefficients may be provided by the vendor of the ambient light sensor).
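A per-channel sketch of step four and of the lux conversion (the channel coefficients and values below are made up for illustration; real coefficients come from the sensor vendor):

```python
def target_ambient_light(initial_raw, noise_raw):
    """Step four: remove the integration noise from the initial ambient
    light, channel by channel, over the RGBC four channels."""
    return {ch: initial_raw[ch] - noise_raw[ch] for ch in "RGBC"}

def lux_value(raw, coeffs):
    """Weighted sum: each channel's raw value times that channel's
    coefficient (coefficients provided by the sensor vendor)."""
    return sum(coeffs[ch] * raw[ch] for ch in "RGBC")
```

For example, with initial raw values {R: 100, G: 200, B: 50, C: 300} and integration-noise raw values {R: 10, G: 20, B: 5, C: 30}, the target ambient light is {R: 90, G: 180, B: 45, C: 270}, from which the lux value follows by the weighted sum.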
As previously described, the ambient light sensor may detect the initial ambient light in the environment of the electronic device with a fixed acquisition period and a fixed integration duration. However, in some scenarios, the fixed acquisition period and fixed integration duration approach often results in too high power consumption or poor user experience.
In some scenarios, the brightness of the environment in which the electronic device is located varies less. In order to reduce power consumption, the integration duration of the ambient light sensor may be kept constant, and the sleep duration of the ambient light sensor is prolonged, which is equivalent to prolonging the acquisition period (or, the integration duration of the ambient light sensor is kept constant, and the acquisition period of the ambient light sensor is prolonged, which is equivalent to prolonging the sleep duration). Thus, a normal sampling mode (denoted as a first sampling mode) and a slow sampling mode (denoted as a second sampling mode) may be set for the ambient light sensor. The ambient light sensor is mainly in a normal sampling mode, and the sampling mode of the ambient light sensor can be switched to a slow sampling mode under the condition that the brightness change of the environment where the electronic equipment is located is small.
As an example, in the normal sampling mode, the acquisition period may be 350ms, the integration duration may be 50ms, and the sleep duration may be 300ms. The acquisition period of the slow sampling mode is longer than that of the normal sampling mode, for example, the longest acquisition period that can be supported by the ambient light sensor can be set; of course, any time between the acquisition period in the normal sampling mode and the longest acquisition period supported may be set. In the embodiment of the present application, the acquisition period of the slow sampling mode may be set to be 711ms, the integration duration is the same as the integration duration of the normal sampling mode, and is also 50ms, and the sleep duration is 661ms.
In another scenario, the brightness of the environment where the electronic device is located changes greatly. To quickly adjust the brightness of the display screen according to the ambient light data collected by the ambient light sensor and improve the user's visual experience, it is desirable to shorten the acquisition period of the ambient light sensor. The ambient light sensor still mainly uses the normal sampling mode; when the brightness change of the environment is large, the sampling mode can be switched to the fast sampling mode (denoted as a third sampling mode).
As an example, the acquisition period of the fast sampling mode may be 100ms, the integration duration may be 50ms, and the sleep duration is 50ms.
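The three sampling modes described above, with the example parameter values from this description, can be summarized as follows (in every mode the acquisition period equals the integration duration plus the sleep duration):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SamplingMode:
    name: str
    period_ms: int       # acquisition period
    integration_ms: int  # integration duration (the same in all modes)

    @property
    def sleep_ms(self):
        # sleep duration = acquisition period - integration duration
        return self.period_ms - self.integration_ms

NORMAL = SamplingMode("normal", 350, 50)  # first sampling mode
SLOW = SamplingMode("slow", 711, 50)      # second sampling mode
FAST = SamplingMode("fast", 100, 50)      # third sampling mode
```

This recovers the sleep durations given in the text: 300 ms for the normal mode, 661 ms for the slow mode, and 50 ms for the fast mode.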
In addition, it should be noted that the implementation of the embodiments of the present application is not necessarily limited to the electronic device of the under-screen ambient light sensor.
If the method is applied to an electronic device whose ambient light sensor is not under the screen, setting the slow sampling mode is equivalent to prolonging the sleep duration of the ambient light sensor, which reduces power consumption.
If the method is applied to an electronic device with an under-screen ambient light sensor, setting the slow sampling mode increases the proportion of time the ambient light sensor spends sleeping, which in turn reduces the frequency at which the SCP processor reports ambient light data and the frequency at which the AP processor processes it, also reducing power consumption. In addition, the number of times noise is removed from the initial ambient light collected by the ambient light sensor is reduced, so power consumption can be reduced further.
Of course, if the method is applied to the electronic device with an under-screen ambient light sensor, since the noise interfering with the initial ambient light is the noise corresponding to the brightness of the display screen and the image displayed in the target area during the integration period, the method may be set to acquire the target image only in the integration period (or in a time range including the integration period), and not in other periods. If the sleep duration is prolonged, the total duration over which the target image needs to be acquired is correspondingly reduced, and power consumption is also greatly reduced.
The switching of the sampling mode is not necessarily limited to electronic devices with an under-screen ambient light sensor. Thus, in subsequent embodiments, the data collected by the ambient light sensor is denoted as ambient light data. If a subsequent embodiment is applied to an electronic device with an under-screen ambient light sensor, the ambient light data in that embodiment is the initial ambient light collected by the ambient light sensor.
When the ambient light sensor collects ambient light data, it is in effect simulating the human eye's perception of ambient light intensity. In order to make the ambient light data reported by the ambient light sensor closer to human perception, the ambient light sensor needs to multiply the collected raw data by a certain coefficient (the gain) to obtain ambient light data that reflects, with higher precision, the intensity of light received by the human eye. However, the appropriate gain value differs across brightness ranges. Thus, events of adjusting the gain occur.
As an example, in a very bright environment, the raw data collected by the ambient light sensor is large; in order to keep the ambient light data that the sensor derives from this raw data and reports to the ambient light sensor driver within a proper range (e.g., the range [a, b]), a small gain value is applied to the current raw data. In a very dark environment, the raw data collected by the ambient light sensor is small; to keep the derived ambient light data within the proper range, a large gain value is applied to the current raw data.
In one implementation, the ambient light sensor acquires ambient light data (a raw-format value) at a certain gain value and reports the raw value to the ambient light sensor driver. If the driver judges that the received raw value exceeds the preset range, the currently adopted gain value is inappropriate, and the driver can adjust the gain value of the ambient light sensor. The ambient light sensor then uses the adjusted gain value to obtain the raw value collected in the next acquisition period.
In addition, the ambient light sensor driver will not send the currently acquired raw value outside the preset range to the AP processor, but will discard the raw value.
In another implementation, the ambient light sensor obtains the ambient light data (a raw-format value) with a certain gain value and itself determines whether the obtained raw value exceeds the preset range; if so, it sets a specific bit in a preset register to a specific value. The ambient light sensor driver determines whether to adjust the gain value of the ambient light sensor by reading that bit in the register.
It should be noted that the focus of the embodiments of the present application is not how to adjust the gain of the ambient light sensor. Rather, the above examples illustrate that gain adjustment events occur on the SCP processor side, and that during gain adjustment, since the ambient light data is out of range, the out-of-range ambient light data is not reported to the AP processor.
In practical applications, an inappropriate gain value usually means that the brightness of the environment where the electronic device is located has changed greatly. To improve user experience, the AP processor therefore needs to acquire the ambient light data quickly so as to quickly adjust the display screen of the electronic device. Hence, when the gain is being adjusted, the acquisition period of the ambient light sensor is shorter than the acquisition period in the normal sampling mode. For convenience of description, this sampling mode may be referred to as the gain adjustment mode (also referred to as a fourth sampling mode). In the gain adjustment mode, the acquisition period may be 60ms, the integration duration may be 50ms, and the sleep duration may be 10ms.
It should be noted that the numbers used above to represent the acquisition period, integration duration, and sleep duration of the example sampling modes are merely examples and are not limiting.
Generally, after the electronic device is powered on, the ambient light sensor may be controlled to operate in a normal sampling mode. And then switching to a slow sampling mode or a fast sampling mode according to the change of the environment in which the ambient light sensor is positioned.
Of course, under the condition that the ambient light sensor works in the slow sampling mode or the fast sampling mode, the ambient light sensor can be switched back to the normal sampling mode according to the change of the environment where the ambient light sensor is located.
When the ambient light data collected by the ambient light sensor falls outside the range [a, b], the sensor switches to the gain adjustment mode. After the gain adjustment is complete, the ambient light sensor may switch back to the normal sampling mode.
Other switching between several sampling modes may specifically refer to the description of the subsequent embodiments.
In order to more clearly illustrate the switching conditions between the sampling modes, the following embodiments are described.
In a specific implementation, the ambient light sensor is used to detect information related to the light intensity of the environment where the electronic device is located, and therefore, the ambient light data collected by the ambient light sensor is used as a basis for switching between modes.
When the method is applied to an electronic device with a non-screen ambient light sensor, ambient light data collected by the ambient light sensor can be understood as data of the surrounding environment where the electronic device is located. Thus, ambient light data collected by the ambient light sensor may be used as a basis for switching between modes.
When applied to an electronic device with an under-screen ambient light sensor, the ambient light data collected by the ambient light sensor includes two parts: noise and real ambient light data. If the collected ambient light data is stable within a certain range, both the noise and the real ambient light data are stable. Therefore, the ambient light data collected by the ambient light sensor can also be used as a basis for switching between modes.
Of course, in practical applications there may be a case where both the external real ambient light and the noise change greatly while the ambient light data formed by the two changes little. In general, when judging whether the ambient light data is stable, stability is determined from the changes of a plurality of consecutive ambient light data. Even if such a situation occurs (the external real ambient light changes greatly and the current noise just happens to compensate for the change), a single ambient light data value may appear stable, but the subsequent consecutive ambient light data will usually show larger changes (it is unlikely that the noise will just compensate for the change several times in a row). Therefore, in practice, if the changes of a plurality of consecutive ambient light data are judged to be stable, this situation can be ignored.
As described above, in the embodiments of the present application, noise may be subtracted from the ambient light data (initial ambient light) collected by the ambient light sensor to obtain external real ambient light (target ambient light). Therefore, in the embodiment of the present application, the target ambient light data in the above embodiment may also be used as a basis for switching between modes. The embodiment of the present application does not limit this.
Of course, in practical applications, the collected ambient light data and the target ambient light data may both be used as conditions for judging stability; the embodiment of the present application does not limit this.
As described above, the condition for switching to the gain adjustment mode in the above sampling mode is related to the value itself of the ambient light data; the switching between the slow sampling mode, the normal sampling mode and the fast sampling mode of the above-described sampling modes is related to a change in the ambient light data.
Therefore, two variation thresholds C1 (noted as a first threshold) and C2 (noted as a second threshold) may be set for the switching conditions between the slow sampling mode, the normal sampling mode, and the fast sampling mode, with C1 being smaller than C2. Wherein, the change threshold C1 is a critical threshold for switching between the slow sampling mode and the normal sampling mode, and C2 is a critical threshold between the normal sampling mode and the fast sampling mode. The change threshold is a threshold corresponding to a change in the ambient light data collected by the ambient light sensor.
Of course, in practical applications, more or less sampling modes than the above embodiments may be set. The embodiment of the present application does not limit this.
In addition, a number threshold N1 (denoted as a first number), N2 (denoted as a second number), N3, and N4 may be set as the number conditions of the ambient light data at the time of switching, respectively; of course, in practical applications, time thresholds (T1, T2, T3, and T4) corresponding to the number threshold may also be set, and specific reference may be made to the description of the following embodiments.
For convenience of description, the embodiment of the present application takes the change in ambient light data to be non-negative, i.e., each change is obtained by subtracting the smaller of two ambient light data values from the larger.
Referring to fig. 15, after the electronic device is powered on, the ambient light sensor in the electronic device is normally in the normal sampling mode. When the ambient light sensor is in the normal sampling mode, if the changes of N1 consecutive ambient light data (or of the ambient light data within time T1) are stable within the range [0, C1), the ambient light sensor can be controlled to switch to the slow sampling mode; if an ambient light data value is received such that the change in ambient light data falls within the range [C2, +∞), the ambient light sensor can be controlled to switch to the fast sampling mode.
When the ambient light sensor in the electronic device is in the slow sampling mode, if the changes in ambient light data remain stable within the range [0, C1), the slow sampling mode is kept; if an ambient light data value is received such that the change falls within the range [C1, C2), the ambient light sensor can be controlled to switch to the normal sampling mode; and if the change falls within the range [C2, +∞), the sensor can be controlled to switch to the fast sampling mode.
When the ambient light sensor in the electronic device is in the fast sampling mode, if the changes of N2 consecutive ambient light data (or of the ambient light data within time T2) are stable within the range [0, C2), the ambient light sensor can be controlled to switch to the normal sampling mode. To avoid excessive power consumption, the sensor may also switch to the normal sampling mode after the number of collected ambient light data exceeds N3 (N3 greater than N2), or after time T3 has elapsed since switching to the fast sampling mode, even if the change in ambient light data has not stabilized within the range [0, C2). The values of N1 and N2 may be equal.
No matter which sampling mode (normal, slow, or fast) the ambient light sensor is working in, as long as an event requiring gain adjustment occurs (the ambient light data falls outside the preset range), the ambient light sensor can be controlled to switch to the gain adjustment mode. If, by adjusting the gain value, the obtained raw value falls within the range [a, b], or the gain has been adjusted a preset number of times (for example, N4), or time T4 has elapsed since switching to the gain adjustment mode, the gain adjustment mode is exited and the sensor switches to the normal sampling mode. The values of N3 and N4 may be equal.
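The switching rules of fig. 15 can be sketched as a small state function (an illustrative simplification: the gain adjustment mode and the time-based variants T1 to T3 are omitted, and the thresholds C1, C2 and counts N1, N2 are parameters supplied by the caller):

```python
def next_mode(mode, changes, C1, C2, N1, N2):
    """Sampling mode after observing the latest change in ambient light
    data. `changes` holds the non-negative changes between successive
    ambient light data, most recent last; C1 < C2."""
    latest = changes[-1]
    if latest >= C2:
        return "fast"  # any change in [C2, +inf) -> fast sampling mode
    if mode == "normal":
        # N1 consecutive changes stable in [0, C1) -> slow sampling mode
        if len(changes) >= N1 and all(c < C1 for c in changes[-N1:]):
            return "slow"
        return "normal"
    if mode == "slow":
        # a change in [C1, C2) -> back to the normal sampling mode
        return "normal" if latest >= C1 else "slow"
    # fast mode: N2 consecutive changes stable in [0, C2) -> normal mode
    if len(changes) >= N2 and all(c < C2 for c in changes[-N2:]):
        return "normal"
    return "fast"
```

For example, with C1 = 10 and C2 = 50, three consecutive small changes move the normal mode to the slow mode, while a single change of 60 moves any mode to the fast mode.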
As mentioned above, the ambient light sensor driver in the SCP processor needs to control the ambient light sensor to collect the ambient light, and therefore, the ambient light sensor driver configures the collection period or sleep duration (the integration duration is fixed) of the ambient light sensor. The process for determining the switching condition between the modes may be executed in the SCP processor (referred to as a first processor), or may be executed in the AP processor (referred to as a second processor).
Referring to fig. 16, taking execution in the AP processor as an example: in the i-th acquisition period, the ambient light sensor obtains ambient light data (a raw value) with the currently set gain value and reports it to the ambient light sensor driver; the driver sends the ambient light data to the calculation module, which converts it from raw format to lux format. The calculation module sends the ambient light data (lux value) to the ambient light sensor application, which sends it to the HWC module in the AP processor through inter-core communication.
Referring to fig. 16, the HWC module in the AP processor sends the ambient light data (lux value) to the noise algorithm library, which determines, based on the received ambient light data and the current sampling mode, whether to switch to another sampling mode. If the current sampling mode is to be kept, the AP processor simply waits for the ambient light data sent in the next ((i+1)-th) acquisition period.
Referring to fig. 16, in the (i+1)-th acquisition period, the ambient light sensor obtains ambient light data (a raw value) with the currently set gain value and reports it to the ambient light sensor driver; the driver sends the ambient light data to the calculation module, which converts it from raw format to lux format. The calculation module sends the ambient light data (lux value) to the ambient light sensor application, which sends it to the HWC module in the AP processor through inter-core communication.
Referring to fig. 16, the HWC module in the AP processor sends the ambient light data (lux value) to the noise algorithm library, which determines, based on the received ambient light data and the current sampling mode, whether to switch to another sampling mode. If the noise algorithm library judges that a switch is required, it returns the target sampling mode to the HWC module. The HWC module sends the target sampling mode, through inter-core communication, to the ambient light sensor application in the SCP processor, which passes it to the ambient light sensor driver. The driver stores the relevant parameters of the target sampling mode (for example, the sleep duration or the acquisition period) in a register of the ambient light sensor itself, and the ambient light sensor collects ambient light using the parameters stored in the register.
The ambient light sensor then obtains the ambient light data collected in the (i+2)-th acquisition period according to the switched sampling mode, and the data continues to be transmitted to the AP processor in the manner described above.
Of course, the SCP processor may transmit the information related to the acquisition period or the sleep duration together when transmitting the ambient light data (for example, the acquisition period or the sleep duration may be acquired from a register). The AP processor can determine the current sampling mode of the ambient light sensor according to the received acquisition period or the sleep duration and other related information.
It should be noted that before the SCP processor sends the ambient light data to the AP processor, the ambient light data may be preprocessed (e.g., normalized gain processing) and then the preprocessed ambient light data may be sent to the AP processor. Certainly, in practical applications, the SCP processor may also send the ambient light data and the corresponding gain value when obtaining the ambient light data to the AP processor, and the AP processor preprocesses the ambient light data based on the gain value corresponding to the ambient light data, so as to obtain the preprocessed ambient light data.
In addition, in the embodiment shown in fig. 16, within the (i+1)-th acquisition period, the time from when the ambient light sensor obtains the ambient light data at the end of integration to when the ambient light sensor driver stores, in the register, the sleep duration corresponding to the target sampling mode is usually on the order of ms (generally a few ms). That is, the ambient light sensor can obtain the information about the target sampling mode during the sleep period of the (i+1)-th acquisition period.
The ambient light sensor driver may indicate that the sleep duration of the (i+1)-th acquisition period is that of the sampling mode before switching, with the (i+2)-th acquisition period using the sleep duration of the switched sampling mode; alternatively, it may indicate that the sleep duration of the (i+1)-th acquisition period is already that of the switched sampling mode.
Of course, in order to enable the ambient light sensor to switch quickly to another sampling mode and obtain ambient light data in that mode, the sleep duration of the (i+1)-th acquisition period may also be indicated as the shorter of the sleep durations corresponding to the sampling modes before and after switching. In the (i+2)-th acquisition period, the sensor then sleeps for the sleep duration of the switched mode.
As an example: if switching from the normal sampling mode to the slow sampling mode, the current acquisition period uses the sleep duration of the normal sampling mode.
If switching from the slow sampling mode to the normal sampling mode, the current acquisition period uses the sleep duration of the normal sampling mode.
If switching from the slow sampling mode or the normal sampling mode to the fast sampling mode, the current acquisition period uses the sleep duration of the fast sampling mode.
If switching from the fast sampling mode to the normal sampling mode, the current acquisition period uses the sleep duration of the fast sampling mode.
If switching to the gain adjustment mode, the current acquisition period uses the sleep duration of the gain adjustment mode.
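The per-case rules above all reduce to one rule stated earlier: the acquisition period in which the switch occurs uses the shorter of the two sleep durations. A minimal sketch, using the example sleep durations from this description (normal 300 ms, slow 661 ms, fast 50 ms, gain adjustment 10 ms):

```python
def current_cycle_sleep_ms(before_ms, after_ms):
    """Sleep duration used in the acquisition period in which the
    switch happens: the shorter of the sleep durations of the modes
    before and after switching, so the sensor reaches the new mode's
    data quickly. From the next period on, the switched mode's own
    sleep duration applies."""
    return min(before_ms, after_ms)
```

Checking each case: normal-to-slow and slow-to-normal both give the normal mode's 300 ms, any switch to fast gives 50 ms, fast-to-normal gives 50 ms, and any switch to the gain adjustment mode gives 10 ms, matching the list above.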
Of course, in practical applications, on the SCP processor side, in the case where it is determined that the sampling mode is to be switched (including switching to the gain adjustment mode), how the ambient light sensor switches to another sampling mode may be determined according to logic internal to the ambient light sensor driver that controls the ambient light sensor. The embodiment of the present application does not limit this.
As can be appreciated from the above example, switching to the fast, slow, or normal sampling mode is triggered by the noise algorithm library in the AP processor based on the received ambient light data, while switching to the gain adjustment mode is triggered by the ambient light sensor driver in the SCP processor.
Referring to fig. 17, in practical applications, a plurality of gain values and a range [ a, b ] of raw values collected by the ambient light sensor may be set in advance.
In the j-th acquisition period, the ambient light sensor collects ambient light data with the current gain value Z3 and reports it to the ambient light sensor driver. The driver judges that the ambient light data (obtained with gain value Z3) is smaller than a, i.e., out of range, so it adjusts the gain value of the ambient light sensor to Z4 (Z4 greater than Z3).
In the j +1 th acquisition period, the ambient light sensor reads ambient light data by using a gain value Z4 and reports the ambient light data to the ambient light sensor driver; the ambient light sensor drives and judges that the ambient light data read by the gain value Z4 is still smaller than a and exceeds the range; the ambient light sensor drive continues to adjust the gain value of the ambient light sensor to Z5 (Z5 is greater than Z4).
In the j +2 th acquisition period, the ambient light sensor reads ambient light data by using a gain value Z5 and reports the ambient light data to the ambient light sensor driver; and the ambient light sensor driver judges that the ambient light data read by the gain value Z5 is in the range of [ a, b ], the gain adjustment is completed, and the ambient light sensor driver can send the ambient light data to the AP processor.
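The gain-adjustment loop described above (raise the gain while the raw value stays below a, lower it while the raw value exceeds b, and stop once the value lands in [a, b]) can be sketched as follows. The gain table, the raw range, and the `read_raw` callback are illustrative assumptions, not the driver's actual interface.

```python
# Hypothetical sketch of the SCP-side gain-adjustment loop; the gain
# values (standing in for Z1..Z5), the raw range [a, b], and read_raw()
# are illustrative assumptions.

GAINS = [1, 2, 4, 8, 16]      # ordered candidate gain values (Z1..Z5)
RAW_RANGE = (1000, 60000)     # the preset raw-value range [a, b]

def adjust_gain(read_raw, gain_index):
    """Run one acquisition period in gain-adjustment mode.

    read_raw(gain) returns the raw value sampled with the given gain.
    Returns (new_gain_index, raw_value, in_range).
    """
    a, b = RAW_RANGE
    raw = read_raw(GAINS[gain_index])
    if raw < a and gain_index + 1 < len(GAINS):
        return gain_index + 1, raw, False   # too dark: raise gain (e.g. Z3 -> Z4)
    if raw > b and gain_index > 0:
        return gain_index - 1, raw, False   # saturated: lower gain
    return gain_index, raw, True            # within [a, b]: adjustment done
```

Only when `in_range` is true would the driver forward the sample to the AP processor, matching the behavior described for the (j+2)-th period.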
As can be appreciated from fig. 17, the SCP processor can also switch to the gain adjustment mode without the involvement of the AP processor. While the SCP processor adjusts the gain, it does not send ambient light data beyond the [a, b] range to the AP processor. The SCP processor does not send the collected ambient light data to the AP processor until it switches from the gain adjustment mode back to the normal sampling mode. As previously described, it is at this point necessary to return to the normal sampling mode: the SCP processor controls the ambient light sensor to work in the normal sampling mode by modifying the acquisition period or the sleep duration stored in the register. The SCP processor then reports the ambient light data together with information such as the acquisition period or the sleep duration stored in the register to the AP processor. That is, when the ambient light data of the (j+2)-th acquisition period is reported, the acquisition period or the sleep duration stored in the register may be reported together. In practical applications, the SCP processor may report the acquisition period or the sleep duration together each time it reports the ambient light data (not shown in fig. 16).
In practical application, the integration durations of several modes are the same, the difference is that the sleep durations are different, and any one of the acquisition period or the sleep duration can be reported. The AP processor may derive a sampling pattern in which the ambient light sensor is currently located based on either the acquisition period or the sleep duration.
Referring to fig. 18, a technical architecture diagram is provided for an embodiment of the present application.
If the sampling mode of the ambient light sensor is set to be intelligently adjustable, the following is added to the technical architecture shown in fig. 7.
Referring to fig. 16, after the ambient light sensor obtains the initial ambient light, it reports the initial ambient light in the raw format to the ambient light sensor driver. The ambient light sensor driver may invoke the computing module to obtain the initial ambient light in the lux format based on the initial ambient light in the raw format.
In the technical architecture shown in fig. 7, the ambient light sensor driver reports the initial ambient light in the raw format in step E3; therefore, in the technical architecture shown in fig. 18, the data reported in step E3 and step E4 may include not only the initial ambient light in the raw format but also the initial ambient light in the lux format.
As shown in connection with fig. 16, after the AP processor receives the initial ambient light, it may in some cases need to return to the SCP processor a sleep duration to cause the ambient light sensor to switch sampling modes.
As shown in fig. 7, after receiving the initial ambient light, the AP processor may obtain the target ambient light, which needs to be returned to the SCP processor.
It should be noted that the AP processor sends the target ambient light to the SCP processor and sends the sleep duration to the SCP processor as different inter-core communication information.
Taking the sleep duration returned by the AP processor to the SCP processor as an example, the HWC in the AP processor sends the sleep duration to the ambient light sensor application, the ambient light sensor application sends the sleep duration to the ambient light sensor driver, and the ambient light sensor driver writes the sleep duration into a register in the ambient light sensor. The ambient light sensor collects ambient light for a sleep duration stored in a register.
After describing each sampling mode and the conditions for switching between different sampling modes, the following describes, by way of example, the process by which the AP processor determines that the ambient light data change is stable (the changes of N1 consecutive ambient light data are stable in the range of [0, C1)) when switching from the normal sampling mode to the slow sampling mode.
As an example, the ambient light data collected by the ambient light sensor may be stored in a pre-set data storage structure. The data storage structure may be a ring storage structure. In case the data queue stored in the ring-shaped memory structure is empty, the first data stored in the ring-shaped memory structure is the first ambient light data as described in the subsequent embodiments of the application.
Referring to fig. 19, the first ambient light data D1 is considered to be stable data and is stored in a data storage structure (FIFO); the storage structure used for storing the ambient light data is denoted as the first storage space.
For the second ambient light data D2, if the absolute value of the difference from the first ambient light data D1 is smaller than C1, the second ambient light data D2 is stable and continues to be stored in the data storage structure in order. At this point 2 stable ambient light data were obtained.
If the absolute value of the difference from the first ambient light data D1 is greater than or equal to C1, the second ambient light data D2 is unstable, so that the ambient light change is greater than or equal to C1, and the FIFO is cleared to wait for the first ambient light data to arrive. Of course, in practical applications, the currently received ambient light data D2 may also be stored as the first ambient light data in the FIFO.
For the third ambient light data D3, if the absolute value of the difference between the third ambient light data D3 and the first ambient light data D1 is smaller than C1, and the absolute value of the difference between the third ambient light data D3 and the second ambient light data D2 is smaller than C1, the third ambient light data D3 is stable, and then the third ambient light data D3 is stored in the data storage structure sequentially. At this point 3 stable ambient light data were obtained.
If the third ambient light data D3 does not satisfy either of the above conditions, the third ambient light data D3 is unstable, indicating that the ambient light change is greater than or equal to C1; the FIFO is emptied to wait for the first ambient light data to arrive. Of course, in practical applications, the currently received ambient light data D3 may also be stored as the first ambient light data in the FIFO.
For the fourth ambient light data D4, if the absolute value of the difference between the fourth ambient light data D4 and the first ambient light data D1 is smaller than C1, and the absolute value of the difference between the fourth ambient light data D4 and the third ambient light data D3 is smaller than C1, the fourth ambient light data D4 is stable and is stored in the data storage structure in sequence. At this point 4 stable ambient light data are obtained.
If the fourth ambient light data D4 does not satisfy either of the above conditions, the fourth ambient light data D4 is unstable, indicating that the ambient light change is greater than or equal to C1; the FIFO is emptied to wait for the first ambient light data to arrive. Of course, in practical applications, the currently received ambient light data D4 may also be stored as the first ambient light data in the FIFO.
……
For the ith ambient light data Di, if the absolute value of the difference between the ith ambient light data Di and the 1st ambient light data D1 is smaller than C1, and the absolute value of the difference between the ith ambient light data Di and the (i-1)-th ambient light data D(i-1) is smaller than C1, the ith ambient light data Di is stable and is stored in the data storage structure in sequence. At this point i stable ambient light data are obtained.
If either condition is not met, the ith ambient light data Di is unstable, indicating that the ambient light change is greater than or equal to C1; the FIFO is emptied to wait for the first ambient light data to arrive. Of course, in practical applications, the currently received ambient light data Di may also be stored as the first ambient light data in the FIFO.
When the variation of the N1 pieces of continuously received ambient light data is less than C1, it indicates that the environment where the electronic device is located is relatively stable, and the current normal sampling mode can be switched to the slow sampling mode.
As can be understood from the above example, each time the absolute value of the difference between the received ambient light data and the first ambient light data is less than C1, and the absolute value of the difference between the received ambient light data and the previous ambient light data is less than C1, the currently received ambient light data is stable, and the currently received ambient light data is sequentially stored in the data storage structure.
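The stability check just described (each incoming sample must differ by less than C1 from both the first queued sample and the immediately preceding one, otherwise the FIFO is cleared and restarted) can be sketched as follows; the function names and the plain-list FIFO are illustrative assumptions.

```python
# Minimal sketch of the first stability check: compare each new sample
# against the head and the tail of the FIFO. Illustrative names only.

def push_sample(fifo, value, c1):
    """Append value if it is judged stable; otherwise restart the queue
    with it. Returns True if the sample was judged stable."""
    if not fifo:
        fifo.append(value)        # first sample after clearing: always kept
        return True
    if abs(value - fifo[0]) < c1 and abs(value - fifo[-1]) < c1:
        fifo.append(value)
        return True
    fifo.clear()                  # change >= C1: empty the FIFO ...
    fifo.append(value)            # ... optionally restart with the new sample
    return False

def stable_enough(fifo, n1):
    """N1 consecutive stable samples: the slow-mode switch condition holds."""
    return len(fifo) >= n1
```

Once `stable_enough` holds, the AP processor would empty the FIFO and signal the SCP processor to switch to the slow sampling mode.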
In addition, if the current sampling mode is not the normal sampling mode but the fast sampling mode, the threshold in the above example is not C1 but C2, and the rest of the determination flow is the same. The stability determination process when the AP processor switches from the fast sampling mode to the normal sampling mode (the changes of N2 consecutive ambient light data are stable in the range of [0, C2)) will not be described in detail in the embodiment of the present application.
In addition, it should be noted that, since it is also possible to switch from the normal sampling mode to the fast sampling mode, when the absolute value of any difference is greater than or equal to C1, it is also necessary to determine whether that absolute value is greater than or equal to C2, and to switch to the fast sampling mode if it is greater than or equal to C2.
As previously mentioned, the ambient light data stored in the data storage structure forms a data queue. The first stored ambient light data in the data queue corresponds to the head of the queue, and the last stored ambient light data corresponds to the tail of the queue.
If the number of ambient light data that can be stored by the data storage structure is greater than or equal to N1, it is determined that the condition for switching from the normal sampling mode to the slow sampling mode is satisfied when N1 ambient light data exist in the data queue stored in the data storage structure.
In the above embodiment, the first ambient light data is used as a reference, and each newly arriving ambient light data is compared with the first ambient light data and also with the previous ambient light data. When N1 pieces of ambient light data are all limited between the upper limit (D1+C1) and the lower limit (D1-C1) (the difference between the upper limit and the lower limit is 2C1) in the embodiment shown in fig. 20, and the change between two consecutive pieces of ambient light data is smaller than C1, it is determined that the change of the N1 pieces of ambient light data is stable within the range of [0, C1).
However, the change of the environment may be gradual, that is, the ambient light data becomes larger or smaller in sequence. Although the absolute value of the difference between two consecutive ambient light data is less than C1, the absolute value of the difference between two ambient light data several samples apart may be greater than or equal to C1. Reference may be made to data D5 and data D8 in fig. 20: the absolute value of the difference between data D5 and data D8 is obviously greater than C1, i.e., the variation between data D5 and data D8 is not stable in the range of [0, C1).
In order to solve the above problem, the embodiment of the present application further provides another more accurate stability judgment process.
Referring to fig. 21, for the first ambient light data D1, which is considered to be stable data, it is stored in the data storage structure.
For the second ambient light data D2, the larger value of the first ambient light data D1 and the second ambient light data D2 is taken as the maximum value, and the smaller value of the first ambient light data D1 and the second ambient light data D2 is taken as the minimum value. If the difference between the maximum and minimum values is less than C1, the second ambient light data is considered stable and stored in the data storage structure.
Otherwise, the second ambient light data D2 is unstable, indicating that the ambient light variation is greater than or equal to C1; the FIFO is emptied to wait for the first ambient light data to arrive. Of course, in practical applications, the currently received ambient light data D2 may also be stored as the first ambient light data in the FIFO.
For the third ambient light data D3, the largest one of the third ambient light data D3, the currently stored maximum value, and the minimum value is updated to the maximum value, and the smallest one of the third ambient light data D3, the currently stored maximum value, and the minimum value is updated to the minimum value. At this time, if the difference between the updated maximum value and the updated minimum value is smaller than C1, the third ambient light data is considered stable and is stored in the data storage structure.
Otherwise, the third ambient light data D3 is unstable, so that the ambient light variation is greater than or equal to C1, and the FIFO is cleared to wait for the first ambient light data to arrive. Of course, in practical applications, the currently received ambient light data D3 may also be stored as the first ambient light data in the FIFO.
For the fourth ambient light data D4, the largest one of the fourth ambient light data D4, the currently stored maximum value, and the minimum value is updated to the maximum value, and the smallest one of the fourth ambient light data D4, the currently stored maximum value, and the minimum value is updated to the minimum value. At this time, if the difference between the updated maximum value and the updated minimum value is smaller than C1, it is determined that the fourth ambient light data is stable and stored in the data storage structure.
Otherwise, the fourth ambient light data D4 is unstable, so that the ambient light variation is greater than or equal to C1, and the FIFO is cleared to wait for the first ambient light data to arrive. Of course, in practical applications, the currently received ambient light data D4 may also be stored as the first ambient light data in the FIFO.
……
For the ith ambient light data Di, the largest one of the ith ambient light data Di, the currently stored maximum value, and the minimum value is updated to the maximum value, and the smallest one of the ith ambient light data Di, the currently stored maximum value, and the minimum value is updated to the minimum value. At this time, if the difference between the updated maximum value and the updated minimum value is smaller than C1, the ith ambient light data Di is considered to be stable and is stored in the data storage structure.
Otherwise, the ith ambient light data Di is unstable, so that the ambient light variation is greater than or equal to C1, and the FIFO is emptied to wait for the first ambient light data to arrive. Of course, in practical applications, the currently received ambient light data Di may also be stored as the first ambient light data in the FIFO.
As can be understood from the above example, if the difference between the updated maximum value and the updated minimum value is greater than or equal to C1, it is considered that the currently received ambient light data is unstable, so that the ambient light change is greater than or equal to C1, the ambient light data stored in the data storage structure may be cleared, and the first ambient light data may be waited for. Or storing the currently received ambient light data as the first ambient light data in the FIFO.
Since the maximum value and the minimum value are updated each time ambient light data is received, the updated maximum value and minimum value are the maximum value and minimum value of all ambient light data from the first ambient light data (the first one after the FIFO is emptied) to the currently received one. Therefore, if the absolute value of the difference between the maximum value and the minimum value is smaller than C1, the absolute value of the difference between any two of all the ambient light data received so far, starting from the first ambient light data, is smaller than C1. Accordingly, with the stability determination method shown in fig. 21, when N1 ambient light data are all limited between the upper limit and the lower limit in the embodiment shown in fig. 22 (the difference between the upper limit and the lower limit is C1), it is determined that the change of the N1 ambient light data is stable within the range of [0, C1), thereby solving the problem in the embodiment shown in fig. 20.
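The refined check of fig. 21 can be sketched by tracking a running maximum and minimum since the last FIFO clearing; because (max - min) < C1 bounds the difference between any pair of queued samples, the gradual-drift case of fig. 20 is caught. The class and method names are illustrative assumptions.

```python
# Sketch of the running max/min stability check of fig. 21.
# Illustrative names; not the patent's actual implementation.

class StabilityWindow:
    def __init__(self, c1):
        self.c1 = c1
        self.fifo = []
        self.lo = self.hi = None    # running minimum and maximum

    def push(self, value):
        """Return True if value keeps the window stable; otherwise clear
        the queue and restart with value as the new first sample."""
        if not self.fifo:
            self.fifo = [value]
            self.lo = self.hi = value
            return True
        lo, hi = min(self.lo, value), max(self.hi, value)
        if hi - lo < self.c1:       # bounds ANY pairwise difference
            self.lo, self.hi = lo, hi
            self.fifo.append(value)
            return True
        self.fifo = [value]         # change >= C1: restart the queue
        self.lo = self.hi = value
        return False
```

With C1 = 5, the samples 10, 8, 6, 9 all stay inside a window narrower than C1, but a subsequent 14 widens the window to 8 and restarts the queue, even though it may sit close to its immediate predecessor.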
This embodiment may also be applied to the stability determination process when the AP processor switches from the fast sampling mode to the normal sampling mode (N2 consecutive ambient light data changes are stable in the range of [0, C2). The embodiment of the present application will not be described in detail.
In practical applications, the user's visual perception of light intensity differs in the ambient light of different scenes. As an example, in the dark, a user can very easily perceive the faint light of fireflies; in bright daylight, the same user may hardly notice a lit street lamp. Therefore, the ambient light data needs to be divided into a plurality of levels, with a different threshold C1 set for stability determination at each level.
By way of example, if the ambient light data is in the [0,5] range, which may be recorded as range level 1, then C1 is 2.
If the ambient light data is within the range of (5, 20], which can be recorded as range level 2, C1 is 3.
If the ambient light data is within the (20, 100] range, which may be recorded as range level 3, C1 is 6.
If the ambient light data is within the (100, +∞) range, which may be recorded as range level 4, C1 is 5% (or 6%, 7%, 8%, etc.) of the first ambient light data (the first ambient light data after the FIFO is emptied).
When C1 corresponding to different range levels is obtained, a specific numerical value of C1 may be obtained based on the range level in which the lux value of the currently received ambient light data is located.
Referring to fig. 23, when the ambient light data is received, it is first determined whether the currently received ambient light data is within range level 1, and if so, it is determined that the currently received ambient light data belongs to the range of [0,5], and the corresponding threshold is 2.
If not, it is further judged whether the currently received ambient light data is within range level 2; if so, it is determined that the currently received ambient light data belongs to the (5, 20] range, and the corresponding threshold is 3.
If not, it is further judged whether the currently received ambient light data is within range level 3; if so, it is determined that the currently received ambient light data belongs to the (20, 100] range, and the corresponding threshold is 6.
If not, determining that the currently received ambient light data belongs to the range level 4, and the corresponding threshold is 5% of the first ambient light data.
As can be understood from the above procedure, when there are k range levels, there are correspondingly k range sets composed of numerical elements. Since no two of the k range sets share the same numerical element, the k range sets may be sorted based on the size of their numerical elements. By way of example, the k range sets may be ordered from small to large, with the numerical elements in the first range set all smaller than those in the second range set, the numerical elements in the second range set all smaller than those in the third range set, ..., and the numerical elements in the (k-1)-th range set all smaller than those in the k-th range set. Of course, the k range sets may also be ordered from large to small.
When one piece of ambient light data is received, the determination is made sequentially according to the above logic on the basis of the sorted range sets until the corresponding threshold is obtained. In the best case, the corresponding threshold is obtained at the first judgment; in the worst case, the corresponding threshold is obtained only after k-1 judgments.
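The sequential judgment above, using the four example ranges and thresholds of this embodiment, can be sketched as follows (between one and k-1 comparisons per lookup). The function name is illustrative, and the 5% fourth-level rule is taken from the example.

```python
# Sequential range-level lookup with the example levels:
# [0, 5] -> 2, (5, 20] -> 3, (20, 100] -> 6, (100, +inf) -> 5% of the
# first queued sample. Illustrative sketch, not the patent's code.

def threshold_sequential(lux, first_sample):
    """Walk the sorted range levels until one contains lux; return C1."""
    if lux <= 5:                  # range level 1
        return 2
    if lux <= 20:                 # range level 2
        return 3
    if lux <= 100:                # range level 3
        return 6
    return 0.05 * first_sample    # range level 4: 5% of first sample
```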
In practical applications, taking the normal sampling mode as an example, ambient light data is obtained every 350 ms, and the threshold then needs to be determined each time according to the threshold determining manner provided in the above embodiments, which results in too much power consumption. Therefore, the embodiments of the present application may adopt the following manner to reduce power consumption.
Take the four ranges in the above embodiment as an example: the [0, 5] range, the (5, 20] range, the (20, 100] range, and the (100, +∞) range. First, the 3 critical values between the four ranges are determined: 5 (E1), 20 (E2), and 100 (E3).
Referring to fig. 24, when the ambient light data happens to equal a critical value at any judgment, it can be directly determined which level the received ambient light data belongs to, and the subsequent judgment process need not continue; therefore, the case of equality is omitted in the embodiment shown in fig. 24, and only the cases of greater than and less than are described.
After the ambient light data is received, it may be compared with the critical value E2 (20).
If it is less than E2 (20), it is further compared with the smaller critical value E1 (5); if less than E1 (5), it falls within the [0, 5] range, i.e., range level 1, and the threshold C1 is obtained as 2; if greater than E1 (5), it falls within the (5, 20] range, i.e., range level 2, and the threshold C1 is obtained as 3.
If it is greater than E2 (20), it is further compared with the larger critical value E3 (100); if less than E3 (100), it falls within the (20, 100] range, i.e., range level 3, and the threshold C1 is obtained as 6; if greater than E3 (100), it falls within the (100, +∞) range, i.e., range level 4, and the threshold C1 is obtained as 5% of the first ambient light data.
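The two-comparison scheme of fig. 24 can be sketched as follows: the middle critical value E2 is tested first, then E1 or E3, so every lookup costs exactly two comparisons instead of up to three sequential ones. Boundary values, which fig. 24 omits for brevity, are folded into the lower level here to match the range definitions; the function name is illustrative.

```python
# Two-comparison level lookup per fig. 24; illustrative sketch.

E1, E2, E3 = 5, 20, 100    # critical values between the four range levels

def threshold_tree(lux, first_sample):
    """Return C1 for lux using exactly two comparisons."""
    if lux <= E2:
        return 2 if lux <= E1 else 3                  # level 1 or level 2
    return 6 if lux <= E3 else 0.05 * first_sample    # level 3 or level 4
```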
In addition, in the embodiment of the present application, the 4 range levels divided as described above may be respectively referred to as a first luminance level, a second luminance level, a third luminance level, and a fourth luminance level. A critical value between the first brightness level and the second brightness level is a first critical value; a critical value between the second brightness level and the third brightness level is a second critical value, a critical value between the third brightness level and the fourth brightness level is a third critical value, the first critical value is smaller than the second critical value, and the second critical value is smaller than the third critical value.
In practical applications, each of the first, second, third and fourth luminance levels may further include 1 or more sub-luminance levels.
By way of example, the first brightness level may be further subdivided into a plurality of sub-brightness levels (e.g., subdividing the [0, 5] range into a [0, 2] sub-brightness level and a (2, 5] sub-brightness level). In the case where it is determined that the currently received ambient light data belongs to the first brightness level, it may further be determined, based on the above-described judgment manner, which sub-brightness level of the first brightness level the currently received ambient light data belongs to.
It can also be appreciated from the above example that if there are k range levels in total (counting the most finely subdivided levels), then k-1 critical values need to be determined. The k-1 critical values are sorted from large to small or from small to large.
Take the order from small to large as an example. The p-th critical value among the k-1 critical values is selected as the reference for the first judgment; if the ambient light data is less than the p-th critical value, one of the critical values smaller than the p-th critical value (from the 1st critical value to the (p-1)-th critical value) is selected as the reference for the second judgment, and so on.
In this way, each time a critical value is selected as the judgment reference, the critical value closest to the middle of the currently selectable critical values is chosen as far as possible. For example, when the number of selectable critical values is 2m+1, the m-th critical value is chosen as far as possible; when the number of selectable critical values is 2m, the (m-1)-th or the m-th critical value is chosen as far as possible, so as to improve efficiency.
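The median-first selection described above generalizes to a binary search over the k-1 sorted critical values, reaching the right level in O(log k) comparisons; Python's `bisect` module implements this strategy directly. The arrays below reuse the running example and are illustrative assumptions.

```python
# Generalized level lookup via binary search over the critical values.
# CRITICALS and LEVEL_THRESHOLDS reuse the example embodiment's numbers;
# illustrative sketch only.

import bisect

CRITICALS = [5, 20, 100]               # the k-1 sorted critical values
LEVEL_THRESHOLDS = [2, 3, 6, None]     # C1 per level; None -> the 5% rule

def threshold_bisect(lux, first_sample):
    # bisect_left keeps boundary values (5, 20, 100) in the lower level,
    # matching the [0,5], (5,20], (20,100], (100,+inf) range definitions.
    level = bisect.bisect_left(CRITICALS, lux)
    c1 = LEVEL_THRESHOLDS[level]
    return c1 if c1 is not None else 0.05 * first_sample
```

With only four levels the saving over two fixed comparisons is negligible, but the same code covers any number of sub-brightness levels.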
In the embodiment of the present application, the threshold C2 is a switching condition related to the fast sampling mode. As mentioned above, when the corresponding variation of the currently received ambient light data is too large (larger than C2), the fast sampling mode is switched to. Therefore, the threshold C2 is generally a relatively large value. In this case, it is not necessary to set the range level associated with the luminance value, and it is not necessary to set different threshold values C2 for different range levels. That is, the threshold C2 in the embodiment of the present application may be a fixed value.
In practical applications, when the electronic device is in a screen-off state, for example, a black-screen state or an always-on display (AOD) state, the ambient light sensor still needs to collect ambient light data according to the acquisition period.
Referring to fig. 25, a switching relationship diagram of a sampling mode of the ambient light sensor in the screen-out state according to the embodiment of the present application is provided.
In the screen-off state, the ambient light sensor may mainly operate in the slow sampling mode, for example, after the electronic device is switched from the screen-on state to the screen-off state, the ambient light sensor operates in the slow sampling mode.
Of course, if an event occurs that requires an adjustment of the gain value, the ambient light sensor may be controlled to switch to the gain adjustment mode. And if, by adjusting the gain value, the obtained ambient light data falls within the range [a, b], or the gain has been adjusted a preset number of times (for example, N4 times), or the time since switching to the gain adjustment mode reaches T4, the gain adjustment mode is exited and the slow sampling mode is resumed.
In order to make the above example more clearly understood, the switching process between the modes may be described by the embodiment shown in fig. 26.
In the switching process, after the electronic equipment is started, the ambient light sensor on the electronic equipment works in a normal sampling mode. Then, the normal sampling mode is switched to a slow sampling mode through the step S1; then, the slow sampling mode is switched into the fast sampling mode through the step S2; then, the fast sampling mode is switched to a normal sampling mode through the step S3; and then switching from the normal sampling mode to the fast sampling mode.
Of course, in the slow sampling mode, if the condition for switching to the normal sampling mode is satisfied, the slow sampling mode may be switched to the normal sampling mode through step S2'.
In the slow sampling mode, the fast sampling mode, and the normal sampling mode, whenever overflow of the ambient light data acquired on the SCP processor side occurs, the mode is switched to the gain adjustment mode. In the gain adjustment mode, once the gain adjustment is completed, the normal sampling mode is resumed.
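The switching relationships just described (the screen-on case of fig. 26) can be sketched as a transition table; the event names and the dictionary encoding are illustrative assumptions, not the patent's implementation.

```python
# Illustrative transition table for the sampling modes of fig. 26
# (screen-on case; in the screen-off case of fig. 25 gain adjustment
# would return to "slow" instead). Event names are assumed.

TRANSITIONS = {
    ("normal", "stable_n1"):    "slow",         # S1: N1 stable samples
    ("slow",   "change_ge_c2"): "fast",         # S2: large change
    ("slow",   "change_ge_c1"): "normal",       # S2': moderate change
    ("fast",   "stable_n2"):    "normal",       # S3: N2 stable samples
    ("normal", "change_ge_c2"): "fast",
    ("normal", "overflow"):     "gain_adjust",  # overflow in any mode
    ("slow",   "overflow"):     "gain_adjust",
    ("fast",   "overflow"):     "gain_adjust",
    ("gain_adjust", "gain_done"): "normal",     # adjustment complete
}

def next_mode(mode, event):
    """Return the next sampling mode; unrecognized events keep the mode."""
    return TRANSITIONS.get((mode, event), mode)
```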
In a particular implementation, the ambient light sensor operates in a normal sampling mode. N1 and N2 are both 10, and C2 is a fixed value.
The method comprises the steps that an SCP processor acquires ambient light data H1 acquired by an ambient light sensor on electronic equipment in an acquisition period corresponding to a normal sampling mode; the ambient light data H1 is within the range [ a, b ] and does not overflow.
The SCP processor sends ambient light data H1 to the AP processor.
The AP processor stores the ambient light data H1 as the first ambient light data in the FIFO.
The SCP processor acquires environmental light data H2 acquired by an environmental light sensor on the electronic equipment in an acquisition period corresponding to a normal sampling mode; the ambient light data H2 is within the range [ a, b ] and does not overflow.
The SCP processor sends ambient light data H2 to the AP processor.
After receiving the ambient light data H2, the AP processor obtains a corresponding threshold C1 based on the level range in which H2 is located.
The AP processor derives a maximum value and a minimum value based on the ambient light data H1 and the ambient light data H2.
The absolute value of the difference between the maximum and minimum values is less than C1, and the AP processor stores the ambient light data H2 as second ambient light data in the FIFO.
……
The SCP processor acquires ambient light data H10 (recorded as a first value) acquired by an ambient light sensor on the electronic equipment in an acquisition period corresponding to a normal sampling mode; the ambient light data H10 is within [ a, b ] and does not overflow.
The SCP processor sends ambient light data H10 to the AP processor.
After receiving the ambient light data H10, the AP processor obtains a corresponding threshold C1 based on the level range in which H10 is located.
The AP processor updates the maximum and minimum values based on the ambient light data H10 and the currently stored maximum and minimum values. Note that the maximum value and the minimum value obtained at this time are the maximum value and the minimum value in the first ambient light data composed of H10 and the ambient light data stored in the FIFO.
The absolute value of the difference between the updated maximum and minimum values is less than C1 and the AP processor stores the ambient light data H10 as the 10 th ambient light data in the FIFO.
At this time, 10 continuous ambient light data are stored, and the variation is within the range of [0, C1 ], the AP processor empties the FIFO, and sends information (which may be a unique identifier corresponding to the sampling mode to be switched, or a sleep duration or an acquisition period of the sampling mode to be switched, etc.) indicating that the sampling mode is switched to the slow sampling mode to the SCP processor, where the information is recorded as the first information.
As can be understood from the above examples, in the normal sampling mode, the conditions under which the currently received ambient light data is stored include: it is the first ambient light data received after the FIFO is emptied; or, after the currently received ambient light data updates the maximum value and the minimum value, the absolute value of the difference between the maximum value and the minimum value is smaller than C1. For convenience of description, the condition for storing the ambient light data is referred to as the first storage condition. The first storage condition is the storage condition in the normal sampling mode.
It should be noted that the stability determination condition may differ between sampling modes. For example, the stability determination condition in the normal sampling mode is that the variation of N1 ambient light data stays within [0, C1), while in the fast sampling mode it is that the variation of N2 ambient light data stays within [0, C2). Therefore, when the AP processor determines that the sampling mode needs to be switched, it empties the FIFO.
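The normal-mode bookkeeping walked through above (per-level threshold lookup, max/min comparison against the FIFO contents, and the switch decision once N1 stable samples accumulate) can be sketched as follows. This is an illustrative sketch only: the class name, the level boundaries, and the numeric threshold values are assumptions, not values from this application.

```python
# Sketch of the AP-side bookkeeping in the normal sampling mode.
# Names, level boundaries, and threshold values are illustrative
# assumptions; only the structure follows the description above.

N1 = 10  # consecutive stable samples needed before switching to slow mode


def thresholds_for(value):
    """Illustrative per-brightness-level thresholds (C1, C2)."""
    if value < 100:
        return (5, 50)
    if value < 1000:
        return (20, 200)
    return (100, 1000)


class NormalModeChecker:
    def __init__(self):
        self.fifo = []  # data stored under the first storage condition

    def on_sample(self, value):
        """Feed one sample; returns 'slow', 'fast', or None (stay in normal mode)."""
        c1, c2 = thresholds_for(value)
        if not self.fifo:
            # The first sample after the FIFO was emptied always
            # satisfies the first storage condition.
            self.fifo.append(value)
            return None
        window = self.fifo + [value]
        delta = max(window) - min(window)
        if delta >= c2:
            self.fifo.clear()
            return 'fast'        # large change: switch to fast sampling
        if delta >= c1:
            # Moderate change: empty the FIFO and restart the window
            # (storing the triggering sample as the 1st is an assumption).
            self.fifo = [value]
            return None
        self.fifo.append(value)  # variation within [0, C1): store it
        if len(self.fifo) == N1:
            self.fifo.clear()    # FIFO is emptied before the mode switch
            return 'slow'        # N1 stable samples: switch to slow sampling
        return None
```

Feeding ten identical samples reproduces the H10 example above: the first nine are stored silently and the tenth triggers the switch to the slow sampling mode.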
On receiving the information indicating the switch to the slow sampling mode, the SCP processor modifies the relevant information (sleep duration or acquisition period) in the register to instruct the ambient light sensor to collect ambient light data according to the sleep duration or acquisition period stored in the register.
At this time, the ambient light sensor is switched to the slow sampling mode through step S1.
The ambient light sensor collects ambient light data H11 in a slow sampling mode.
The SCP processor acquires ambient light data H11 acquired by an ambient light sensor on the electronic equipment in an acquisition period corresponding to a slow sampling mode; the ambient light data H11 is within the range [ a, b ] without overflow.
The SCP processor sends ambient light data H11 to the AP processor.
The AP processor stores the ambient light data H11 as the 1st ambient light data in the FIFO.
The ambient light sensor collects ambient light data H12 in a slow sampling mode.
The SCP processor acquires ambient light data H12 (recorded as a third value) acquired by an ambient light sensor on the electronic equipment in an acquisition period corresponding to a slow sampling mode; the ambient light data H12 is within the range [ a, b ] without overflow.
The SCP processor sends ambient light data H12 to the AP processor.
The AP processor derives a corresponding threshold C1 based on the rank range in which H12 is located.
The AP processor derives a maximum value and a minimum value based on the ambient light data H12 and H11. The currently received ambient light data and the ambient light data stored in the FIFO constitute the third ambient light data, and the maximum value and the minimum value obtained at this time are the maximum value and the minimum value of the third ambient light data.
If the absolute value of the difference between the maximum value and the minimum value is greater than or equal to C2, the AP processor empties the FIFO and determines to switch the sampling mode to the fast sampling mode.
It should be noted that, if the absolute value of the difference between the maximum value and the minimum value is greater than or equal to C1 and less than C2, the AP processor determines to switch to the normal sampling mode, i.e., executes step S2' in fig. 26. In the embodiment of the present application, switching to the fast sampling mode via step S2 is taken as the example.
The AP processor sends information indicating switching to the fast sampling mode, which is denoted as third information, to the SCP processor.
For ease of distinction, the ambient light data H12 that results in switching from the current slow sampling mode to the normal sampling mode may be recorded as the second value. The information that the AP processor sends to the SCP processor to indicate the switch from the slow sampling mode to the normal sampling mode is recorded as the second information. The currently received ambient light data H12 and the ambient light data stored in the FIFO constitute the second ambient light data.
Similarly, the ambient light data H12 that results in switching from the current slow sampling mode to the fast sampling mode is recorded as the third value. The information that the AP processor sends to the SCP processor to indicate the switch from the slow sampling mode to the fast sampling mode is recorded as the third information. The currently received ambient light data H12 and the ambient light data stored in the FIFO constitute the third ambient light data.
In addition, it should be noted that, in the slow sampling mode, there is no need to determine whether a certain amount of ambient light data remains stable within a certain range. Therefore, in the slow sampling mode, the FIFO need not be emptied before switching to another sampling mode. Moreover, in the slow sampling mode the sampling mode is not switched as long as the variation of the ambient light data stays smaller than C1, so the FIFO holds only data whose variation is below C1 before the slow sampling mode is switched to another sampling mode (the normal sampling mode or the fast sampling mode). It can therefore be understood that, in the slow sampling mode, the received ambient light data is stored following the first storage condition before switching to another sampling mode; or, equivalently, in the slow sampling mode the received ambient light data satisfies the storage condition before switching to another sampling mode.
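The slow-mode exit rules described above (no stability count; the mode is left as soon as the variation reaches C1 or C2) can be sketched as a single decision function. The function name and the thresholds used in the usage note are illustrative assumptions, not identifiers from this application.

```python
def slow_mode_decision(fifo, value, c1, c2):
    """AP-side decision when a sample arrives in the slow sampling mode.

    Returns 'normal', 'fast', or None (stay in slow mode).
    In slow mode no stability count is needed: the mode is left as
    soon as the variation reaches C1 (normal) or C2 (fast).
    """
    if not fifo:
        fifo.append(value)  # 1st sample after entering slow mode
        return None
    window = fifo + [value]
    delta = max(window) - min(window)
    if delta >= c2:
        fifo.clear()
        return 'fast'    # large change: switch to the fast sampling mode
    if delta >= c1:
        fifo.clear()
        return 'normal'  # moderate change: switch to the normal sampling mode
    fifo.append(value)   # variation below C1: stay in slow mode
    return None
```

With illustrative thresholds c1 = 5 and c2 = 50, a drift from 50 to 58 would trigger the switch back to normal mode, while a jump from 50 to 200 would trigger the switch to fast mode.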
On receiving the information indicating the switch to the fast sampling mode, the SCP processor modifies the relevant information (sleep duration or acquisition period) in the register to instruct the ambient light sensor to collect ambient light data according to the sleep duration or acquisition period stored in the register.
At this time, the ambient light sensor switches to the fast sampling mode, via step S2.
The ambient light sensor collects ambient light data H13 in a fast sampling mode.
The SCP processor acquires ambient light data H13 acquired by an ambient light sensor on the electronic equipment in a corresponding acquisition period of a fast sampling mode; the ambient light data H13 is within the range of [ a, b ] without overflow.
The SCP processor sends ambient light data H13 to the AP processor.
The AP processor stores the ambient light data H13 as the 1st ambient light data in the FIFO.
The SCP processor acquires ambient light data H14 acquired by an ambient light sensor on the electronic equipment in a corresponding acquisition period of a fast sampling mode; the ambient light data H14 is within the range [ a, b ] without overflow.
The SCP processor sends ambient light data H14 to the AP processor.
After receiving the ambient light data H14, the AP processor obtains a maximum value and a minimum value based on the ambient light data H13 and the ambient light data H14.
The absolute value of the difference between the maximum and minimum values is less than C2, so the AP processor stores the ambient light data H14 as the 2nd ambient light data in the FIFO.
……
The SCP processor acquires ambient light data H22 (recorded as a fourth value) acquired by an ambient light sensor on the electronic equipment in an acquisition period corresponding to a fast sampling mode; the ambient light data H22 is within the range of [ a, b ] without overflow.
The SCP processor sends ambient light data H22 to the AP processor.
After receiving the ambient light data H22, the AP processor updates the maximum and minimum values based on the ambient light data H22 and the currently stored maximum and minimum values. Note that the maximum value and the minimum value obtained at this time are the maximum value and the minimum value in the fourth ambient light data composed of H22 and the ambient light data stored in the FIFO.
The absolute value of the difference between the updated maximum and minimum values is less than C2, so the AP processor stores the ambient light data H22 as the 10th ambient light data in the FIFO.
At this time, 10 consecutive ambient light data have been stored, all varying within the range [0, C2), so the AP processor sends information indicating a switch to the normal sampling mode to the SCP processor. This information is recorded as the fourth information.
As can be understood from the above example, in the fast sampling mode, the currently received ambient light data satisfies the condition for being stored if: it is the first ambient light data received after the FIFO was emptied; or, after the maximum value and the minimum value are updated with the currently received ambient light data, the absolute value of the difference between the maximum value and the minimum value is smaller than C2. For convenience of description, this condition is referred to as the second storage condition. The second storage condition is the storage condition in the fast sampling mode.
At this time, since the sampling mode is about to be switched to the normal sampling mode, the AP processor empties the FIFO so that stability can be determined afresh in the normal sampling mode.
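Taken together, steps S1 to S4 above form a small state machine over the three sampling modes. The following sketch summarizes the transitions; the function name and the parameter defaults (N1 = N2 = 10) are illustrative assumptions following the example, not part of this application.

```python
# Illustrative summary of the mode-switching rules from the example above.
# delta = |max - min| over the current sample plus the FIFO contents;
# stable_count = number of consecutive samples stored within the threshold.

def next_mode(mode, delta, stable_count, c1, c2, n1=10, n2=10):
    if mode == 'normal':
        if delta >= c2:
            return 'fast'           # large change: step S4 in the example
        if delta < c1 and stable_count == n1:
            return 'slow'           # N1 stable samples: step S1
    elif mode == 'slow':
        if delta >= c2:
            return 'fast'           # step S2
        if delta >= c1:
            return 'normal'         # step S2'
    elif mode == 'fast':
        if delta < c2 and stable_count == n2:
            return 'normal'         # N2 stable samples: step S3
    return mode  # otherwise stay in the current mode
```

Note that in normal mode a moderate variation (C1 ≤ delta < C2) keeps the mode unchanged and only empties the FIFO, matching the emptying condition recited later in claim 17.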
On receiving the information indicating the switch to the normal sampling mode, the SCP processor modifies the relevant information in the register to instruct the ambient light sensor to collect ambient light data according to the sleep duration or acquisition period stored in the register.
At this time, the ambient light sensor switches to the normal sampling mode, via step S3.
The SCP processor acquires ambient light data H23 acquired by an ambient light sensor on the electronic equipment in an acquisition period corresponding to a normal sampling mode; the ambient light data H23 is within the range [a, b] without overflow.
The SCP processor sends ambient light data H23 to the AP processor.
After receiving the ambient light data H23, the AP processor stores it as the 1st ambient light data in the FIFO.
The SCP processor acquires ambient light data H24 (recorded as a fifth value) acquired by an ambient light sensor on the electronic equipment in an acquisition period corresponding to a normal sampling mode; the ambient light data H24 is within the range [ a, b ] without overflow.
The SCP processor sends ambient light data H24 to the AP processor.
After receiving the ambient light data H24, the AP processor obtains a corresponding threshold C1 based on the level range in which H24 is located.
The AP processor obtains a maximum value and a minimum value based on the ambient light data H24 and the ambient light data H23. The currently received ambient light data H24 and the ambient light data stored in the FIFO are denoted as fifth ambient light data.
The absolute value of the difference between the maximum value and the minimum value is greater than or equal to C2, so the AP processor empties the FIFO and sends information indicating a switch to the fast sampling mode to the SCP processor. This information is recorded as the fifth information.
On receiving the information indicating the switch to the fast sampling mode, the SCP processor modifies the relevant information in the register to instruct the ambient light sensor to collect ambient light data according to the sleep duration or acquisition period stored in the register.
At this time, the ambient light sensor switches to the fast sampling mode, via step S4.
In the above process, if any ambient light data collected by the ambient light sensor overflows (that is, any of H1 to H24 is outside the range [a, b]), the SCP processor controls the ambient light sensor to operate in the gain adjustment mode based on the embodiment shown in fig. 17 until switching back to the normal sampling mode.
In the embodiment of the present application, the ambient light data that overflows the range [a, b] (denoted as the first range) is recorded as a sixth value, and the ambient light sensor acquires the sixth value based on a first gain value. The gain value adjusted from the first gain value is recorded as a second gain value, and the ambient light data collected by the ambient light sensor at the second gain value is recorded as a seventh value. If the seventh value does not overflow, the ambient light sensor adjusts its sleep duration to the sleep duration of the normal sampling mode, and the SCP processor sends the seventh value, together with seventh information indicating a switch to the normal sampling mode, to the AP processor.
In practical application, in the screen-off state, the slow sampling mode may be switched to the gain adjustment mode, and then the gain adjustment mode switched back to the slow sampling mode; reference may be made to the description of the embodiments above. In the screen-off state, overflowing ambient light data collected by the ambient light sensor in the slow sampling mode is recorded as an eighth value; the ambient light sensor acquires the eighth value based on a third gain value. In the embodiment of the present application, the gain value adjusted from the third gain value is recorded as a fourth gain value, and the ambient light data acquired by the ambient light sensor at the fourth gain value is recorded as a ninth value. If the ninth value does not overflow, the ambient light sensor adjusts its sleep duration to the sleep duration of the slow sampling mode, and the SCP processor sends the ninth value to the AP processor.
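The overflow handling described in the two paragraphs above can be sketched as follows on the SCP side, covering both the screen-on case (resume into the normal sampling mode) and the screen-off case (resume into the slow sampling mode). The range bounds, the gain ladder, and all names are illustrative assumptions, not values from this application.

```python
# Illustrative sketch of the SCP-side overflow / gain-adjustment handling.
# A and B bound the valid range [a, b]; the gain ladder and all names
# are assumptions made for the sake of the example.

A, B = 10, 60000        # example valid range [a, b] for raw samples
GAINS = [1, 4, 16, 64]  # example gain ladder, brightest to darkest


def handle_sample(raw, gain_index, resume_mode):
    """Returns (mode, new_gain_index) for one raw sample.

    resume_mode is the sampling mode to return to once a non-overflowing
    sample is obtained ('normal' when the screen is on, 'slow' when off).
    """
    if A <= raw <= B:
        # No overflow: keep the gain and stay in (or return to) resume_mode.
        return (resume_mode, gain_index)
    # Overflow: step the gain toward a usable range and sample again
    # quickly in the gain-adjustment (fourth) sampling mode.
    if raw > B and gain_index > 0:
        return ('gain_adjust', gain_index - 1)  # too bright: lower gain
    if raw < A and gain_index < len(GAINS) - 1:
        return ('gain_adjust', gain_index + 1)  # too dark: raise gain
    return (resume_mode, gain_index)  # no further gain step available
```

For example, a saturated reading while screen-on drops the gain one step and enters the gain-adjustment mode, while an in-range reading at the new gain returns to the normal sampling mode, mirroring the sixth/seventh-value flow above.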
In the above embodiment, the SCP processor obtains the ambient light data and reports it to the AP processor, and the AP processor obtains a data characteristic value of the ambient light data to determine whether to switch the sampling mode.
In practical application, after the SCP processor obtains the ambient light data, the SCP processor itself may also obtain the data characteristic value of the ambient light data to determine whether to switch the sampling mode. That is, the stability determination algorithm running in the noise algorithm library of the AP processor may instead be loaded on the SCP processor.
The detailed process of acquiring the ambient light data by the SCP processor and determining whether to switch the sampling mode by the SCP processor according to the data characteristic value of the ambient light data will not be described in detail in the embodiments of the present application.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the steps in the foregoing method embodiments may be implemented.
Embodiments of the present application further provide a computer program product, which when run on an electronic device, enables the electronic device to implement the steps in the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above may be implemented by a computer program instructing related hardware; the computer program may be stored in a computer-readable storage medium and, when executed by a processor, may implement the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to an electronic device, a recording medium, a computer memory, a read-only memory (ROM), a random-access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash drive, a removable hard drive, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, the computer-readable medium may not be an electrical carrier signal or a telecommunications signal.
The embodiments of the present application further provide a chip system, where the chip system includes a processor, the processor is coupled to a memory, and the processor executes a computer program stored in the memory to implement the steps of any of the method embodiments of the present application. The chip system may be a single chip or a chip module composed of a plurality of chips.
In the above embodiments, the description of each embodiment has its own emphasis, and reference may be made to the related description of other embodiments for parts that are not described or recited in any embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.
Claims (28)
1. A method for detecting ambient light, applied to an electronic device, the electronic device comprising: a first processor and a second processor, the method comprising:
the first processor acquires a first value acquired by an ambient light sensor on the electronic equipment in a first sampling mode;
the first processor sending the first value to the second processor;
the second processor receiving the first value;
the second processor sending first information to the first processor based on the first value, the first information corresponding to a second sampling mode;
in response to receiving the first information, the first processor instructs the ambient light sensor to collect ambient light based on the second sampling mode, the second sampling mode having an acquisition period that is greater than the acquisition period of the first sampling mode.
2. The method of claim 1, wherein the second processor sending first information to the first processor based on the first value comprises:
the second processor determines a maximum value and a minimum value in first ambient light data, where the first ambient light data includes the first value and the ambient light data stored in a first storage space, and the ambient light data stored in the first storage space is the ambient light data that satisfies a first storage condition and was stored after the second processor last emptied the first storage space;
if the absolute value of the difference between the maximum value and the minimum value in the first ambient light data is smaller than a first threshold, the second processor stores the first value in the first storage space;
after storing the first value in the first storage space, if the ambient light data stored in the first storage space is a first amount of ambient light data, the second processor sends the first information to the first processor.
3. The method of claim 1 or 2, wherein after the first processor instructs the ambient light sensor to collect ambient light based on the second sampling mode, the method further comprises:
the first processor acquires a second value acquired by the ambient light sensor in the second sampling mode;
the first processor sending the second value to the second processor;
the second processor receiving the second value;
the second processor sending second information to the first processor based on the second value, the second information corresponding to the first sampling mode;
in response to receiving the second information, the first processor instructs the ambient light sensor to collect ambient light based on the first sampling mode.
4. The method of claim 3, wherein the second processor sending second information to the first processor based on the second value comprises:
the second processor determines a maximum value and a minimum value in received second ambient light data, where the second ambient light data is the ambient light data collected after the ambient light sensor is switched to the second sampling mode, and the second ambient light data includes the second value;
and if the absolute value of the difference between the maximum value and the minimum value in the second ambient light data is greater than or equal to a first threshold value and the absolute value of the difference between the maximum value and the minimum value in the second ambient light data is less than a second threshold value, the second processor sends the second information to the first processor.
5. The method of claim 1 or 2, wherein after the first processor instructs the ambient light sensor to collect ambient light based on the second sampling mode, the method further comprises:
the first processor acquires a third value acquired by the ambient light sensor in the second sampling mode;
the first processor sending the third value to the second processor;
the second processor receiving the third value;
the second processor sending third information to the first processor based on the third value, the third information corresponding to a third sampling mode;
in response to receiving the third information, the first processor instructs the ambient light sensor to collect ambient light based on the third sampling mode.
6. The method of claim 5, wherein the second processor sending third information to the first processor based on the third value comprises:
the second processor determines a maximum value and a minimum value in received third ambient light data, where the third ambient light data is the ambient light data collected after the ambient light sensor is switched to the second sampling mode, and the third ambient light data includes the third value;
and if the absolute value of the difference value between the maximum value and the minimum value in the third ambient light data is greater than or equal to a second threshold, the second processor sends the third information to the first processor.
7. The method of claim 5 or 6, wherein after the first processor instructs the ambient light sensor to collect ambient light based on the third sampling mode, the method further comprises:
the first processor acquires a fourth value acquired by the ambient light sensor in the third sampling mode;
the first processor sending the fourth value to the second processor;
the second processor receiving the fourth value;
the second processor sending fourth information to the first processor based on the fourth value, the fourth information corresponding to the first sampling mode;
in response to receiving the fourth information, the first processor instructs the ambient light sensor to collect ambient light based on the first sampling mode.
8. The method of claim 7, wherein the second processor sending fourth information to the first processor based on the fourth value comprises:
the second processor determines a maximum value and a minimum value in fourth ambient light data, where the fourth ambient light data includes the fourth value and the ambient light data stored in a first storage space, and the ambient light data stored in the first storage space is the ambient light data that satisfies a second storage condition and was stored after the second processor last emptied the first storage space;
if the absolute value of the difference between the maximum value and the minimum value in the fourth ambient light data is smaller than a second threshold, the second processor stores the fourth value in the first storage space;
after storing the fourth value in the first storage space, if the ambient light data stored in the first storage space is a second amount of ambient light data, the second processor sends the fourth information to the first processor.
9. The method of claim 3, 4, 7, or 8, wherein after the first processor instructs the ambient light sensor to collect ambient light based on the first sampling mode, the method further comprises:
the first processor obtains a fifth value acquired by the ambient light sensor in the first sampling mode;
the first processor sending the fifth value to the second processor;
the second processor receiving the fifth value;
the second processor sending fifth information to the first processor based on the fifth value, the fifth information corresponding to a third sampling mode;
in response to receiving the fifth information, the first processor instructs the ambient light sensor to collect ambient light based on the third sampling mode.
10. The method of claim 9, wherein the second processor sending fifth information to the first processor based on the fifth value comprises:
the second processor determines a maximum value and a minimum value in fifth ambient light data, where the fifth ambient light data includes the fifth value and the ambient light data stored in a first storage space, and the ambient light data stored in the first storage space is the ambient light data that satisfies a first storage condition and was stored after the second processor last emptied the first storage space;
and if the absolute value of the difference value between the maximum value and the minimum value in the fifth ambient light data is greater than or equal to a second threshold, the second processor sends fifth information to the first processor.
11. The method of any one of claims 1 to 10, further comprising:
the first processor acquires a sixth value acquired by the ambient light sensor in any sampling mode, wherein the sixth value is ambient light data acquired by the ambient light sensor in a first gain value, and any sampling mode comprises the first sampling mode, the second sampling mode and a third sampling mode;
if the sixth value is not within the first range, the first processor adjusts the gain value of the ambient light sensor to a second gain value;
the first processor instructing the ambient light sensor to collect ambient light based on a fourth sampling mode, the fourth sampling mode having an acquisition period that is less than the acquisition period of the first sampling mode; the first processor acquires a seventh value collected by the ambient light sensor in the fourth sampling mode at the second gain value;
if the seventh value is within the first range, the first processor instructs the ambient light sensor to collect ambient light in the first sampling mode, and the first processor sends the seventh value and seventh information to the second processor;
the second processor receiving the seventh value and the seventh information, the seventh information indicating that the ambient light sensor switched to the first sampling mode;
the second processor empties the first storage space based on the seventh information;
after emptying the first storage space, the second processor stores the seventh value in the first storage space.
12. The method of claim 2, wherein the second processor stores the first value in the first storage space if an absolute value of a difference between a maximum value and a minimum value in the first ambient light data is less than a first threshold, comprising:
the second processor obtains the brightness level of the first value based on the first value;
and if the absolute value of the difference value between the maximum value and the minimum value in the first ambient light data is smaller than a first threshold corresponding to the brightness level where the first value is located, the second processor stores the first value in the first storage space.
13. The method of claim 12, wherein the brightness level comprises: a first brightness level, a second brightness level, a third brightness level, and a fourth brightness level; a critical value between the first brightness level and the second brightness level is a first critical value; a critical value between the second brightness level and the third brightness level is a second critical value, a critical value between the third brightness level and the fourth brightness level is a third critical value, the first critical value is smaller than the second critical value, and the second critical value is smaller than the third critical value;
the second processor obtains the brightness level of the first value based on the first value; the method comprises the following steps:
judging the relation between the first value and the second critical value;
if the first value is equal to the second critical value, the brightness grade of the first value is the brightness grade of the second critical value;
if the first value is smaller than the second critical value, judging the relation between the first value and the first critical value;
if the first value is smaller than the first critical value, the brightness grade where the first value is located is obtained as the first brightness grade; if the first value is larger than the first critical value, the brightness level where the first value is located is obtained as the second brightness level; if the first value is equal to the first critical value, the brightness grade of the first value is the brightness grade of the first critical value;
if the first value is larger than the second critical value, judging the relation between the first value and the third critical value;
if the first value is smaller than the third critical value, the brightness grade where the first value is located is obtained and is the third brightness grade; if the first value is larger than the third critical value, the brightness level where the first value is located is obtained as the fourth brightness level; and if the first value is equal to the third critical value, obtaining that the brightness level of the first value is the brightness level of the third critical value.
14. The method of any one of claims 1 to 13, wherein a display screen of the electronic device is in a bright screen state.
15. The method of any of claims 1 to 14, further comprising:
in response to a display screen of the electronic device switching to a screen-off state, the first processor instructs the ambient light sensor to collect ambient light in a second sampling mode.
16. The method of claim 15, wherein when the display screen of the electronic device is in a screen-off state, the method further comprises:
the first processor acquires an eighth value acquired by the ambient light sensor in the second sampling mode, wherein the eighth value is ambient light data acquired by the ambient light sensor at a third gain value;
if the eighth value is not within the first range, the first processor adjusts the gain value of the ambient light sensor to a fourth gain value;
the first processor instructing the ambient light sensor to collect ambient light based on a fourth sampling mode, the fourth sampling mode having an acquisition period that is less than the acquisition period of the second sampling mode; the first processor obtains a ninth value collected by the ambient light sensor in the fourth sampling mode at the fourth gain value;
if the ninth value is within the first range, the first processor instructs the ambient light sensor to collect ambient light in a second sampling mode, and the first processor sends the ninth value to the second processor.
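The screen-off flow of claim 16 can be condensed into one sampling round: read in the slow mode, and if the raw count falls outside the usable range, re-gain and re-read in a faster mode before dropping back to the slow mode. The sketch below is illustrative only; the range bounds, mode names, and injected sensor primitives are assumptions, not the patent's implementation.

```python
# Minimal sketch of the claim-16 screen-off flow. FIRST_RANGE, the mode
# names, and the injected callables are assumptions, not the patent's API.

FIRST_RANGE = (10, 60000)  # illustrative usable raw-count window

def in_range(value):
    return FIRST_RANGE[0] <= value <= FIRST_RANGE[1]

def screen_off_sample(read, set_gain, set_mode, pick_gain):
    """One screen-off sampling round; returns a valid reading or None."""
    value = read()                      # "eighth value" at the third gain
    if in_range(value):
        return value
    set_gain(pick_gain(value))          # switch to the "fourth gain value"
    set_mode("fast_adjust")             # period shorter than the slow mode
    value = read()                      # "ninth value" at the new gain
    if in_range(value):
        set_mode("slow")                # resume the slow (second) mode
        return value                    # this value is sent to processor 2
    return None                         # try again on the next cycle
```

Injecting `read`/`set_gain`/`set_mode` as callables keeps the round testable without real sensor hardware.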
17. The method of claim 2, wherein the ambient light data meeting the first storage condition comprises:
after the second processor empties the first storage space, the first piece of ambient light data received is ambient light data satisfying the first storage condition;
when the absolute value of the difference between the maximum value and the minimum value among the currently received ambient light data and the ambient light data currently stored in the first storage space is smaller than the first threshold, the currently received ambient light data is ambient light data satisfying the first storage condition;
the condition for the second processor to empty the first storage space comprises:
in the normal sampling mode, if an absolute value of a difference between a maximum value and a minimum value among the ambient light data currently received by the second processor and the ambient light data currently stored in the first storage space is not less than the first threshold and is less than a second threshold, the condition for emptying the first storage space is satisfied.
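The buffering rules in claim 17 amount to a three-way decision per sample. A sketch under assumed threshold values (the patent discloses no concrete numbers, and behaviour above the second threshold is outside this claim; the abstract suggests a fast sampling mode handles large changes):

```python
# Sketch of the claim-17 buffering rules. Threshold values are illustrative.

FIRST_THRESHOLD = 50    # stability window (illustrative units)
SECOND_THRESHOLD = 500  # large-change cutoff (illustrative)

def classify(sample, stored):
    """Decide whether a new sample is stored, empties the buffer, or neither."""
    if not stored:
        return "store"            # first sample after emptying always stores
    spread = max(stored + [sample]) - min(stored + [sample])
    if spread < FIRST_THRESHOLD:
        return "store"            # readings still stable: keep accumulating
    if spread < SECOND_THRESHOLD:
        return "empty"            # moderate drift: empty the storage space
    return "other"                # large change, handled outside this claim
```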
18. The method of claim 1, wherein the collection period comprises an integration duration and a sleep duration; the integration duration of the second sampling mode is the same as the integration duration of the first sampling mode.
19. The method of claim 1, wherein the electronic device further comprises an ambient light sensor driver, an ambient light sensor, an HWC module, and a noise algorithm library, the method comprising:
the first processor obtains a first value collected by the ambient light sensor in a first sampling mode through the ambient light sensor driver;
the first processor sends the first value to the HWC module through the ambient light sensor driver;
the second processor receives the first value through the HWC module, and the second processor sends the first value to the noise algorithm library through the HWC module;
the second processor sends first information to the HWC module through the noise algorithm library based on the first value;
the second processor sends the first information to the ambient light sensor driver through the HWC module, the first information corresponding to a second sampling mode;
in response to receiving the first information, the first processor instructs, through the ambient light sensor driver, the ambient light sensor to collect ambient light based on the second sampling mode, wherein a collection period of the second sampling mode is greater than a collection period of the first sampling mode.
20. A method for detecting ambient light, applied to an electronic device including a second processor, the method comprising:
the second processor receives a first value, wherein the first value is ambient light data collected by an ambient light sensor of the electronic equipment in a first sampling mode;
the second processor determines a maximum value and a minimum value in first ambient light data, wherein the first ambient light data includes the first value and ambient light data stored in a first storage space, and the ambient light data stored in the first storage space is ambient light data that satisfies a storage condition and has been stored since the second processor last emptied the first storage space;
if the absolute value of the difference between the maximum value and the minimum value in the first ambient light data is smaller than a first threshold, the second processor stores the first value in the first storage space;
after the first value is stored in the first storage space, if the ambient light data stored in the first storage space is a first amount of ambient light data, the second processor sends first information to a first processor of the electronic device, where the first information is used to instruct the first processor to control the ambient light sensor to collect the ambient light based on a second sampling mode, and a collection period of the second sampling mode is greater than a collection period of the first sampling mode.
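The per-sample logic of claim 20 can be condensed into a single function: store the reading if it keeps the buffer's spread below the threshold, and signal the mode switch once the buffer holds the first amount of readings. The constants below are placeholders; the patent does not fix the first threshold or the first amount.

```python
# Sketch of claim 20's per-sample logic. FIRST_THRESHOLD and FIRST_AMOUNT
# are assumed values, not disclosed by the patent.

FIRST_THRESHOLD = 50  # stability threshold (illustrative)
FIRST_AMOUNT = 5      # stored readings needed before slowing the sampling

def on_sample(value, storage):
    """Store a stable reading; signal a mode switch when the buffer fills."""
    candidates = storage + [value]
    if max(candidates) - min(candidates) < FIRST_THRESHOLD:
        storage.append(value)               # reading is stable: keep it
        if len(storage) == FIRST_AMOUNT:
            return "switch_to_second_mode"  # the "first information"
    return None
```

In the claim this signal travels from the second processor to the first processor, which then lengthens the sensor's collection period.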
21. The method of claim 20, wherein the method further comprises:
the second processor receives a seventh value and seventh information, the seventh information indicating that the ambient light sensor has switched from a gain adjustment mode to the first sampling mode, wherein the gain adjustment mode is the mode in which the first processor adjusts the gain value of the ambient light sensor when the ambient light data collected by the ambient light sensor is not within a first range;
the second processor empties the first storage space based on the seventh information;
after emptying the first storage space, the second processor stores the seventh value in the first storage space.
22. The method of claim 20 or 21, wherein the second processor stores the first value in the first storage space if the absolute value of the difference between the maximum value and the minimum value in the first ambient light data is less than a first threshold, comprising:
the second processor obtains the brightness level of the first value based on the first value;
if the absolute value of the difference between the maximum value and the minimum value in the first ambient light data is smaller than the first threshold corresponding to the brightness level of the first value, the second processor stores the first value in the first storage space.
23. The method of claim 22, wherein the brightness level comprises: a first brightness level, a second brightness level, a third brightness level, and a fourth brightness level; a critical value between the first brightness level and the second brightness level is a first critical value; a critical value between the second brightness level and the third brightness level is a second critical value, a critical value between the third brightness level and the fourth brightness level is a third critical value, the first critical value is smaller than the second critical value, and the second critical value is smaller than the third critical value;
the second processor obtains the brightness level of the first value based on the first value, comprising:
determining the relationship between the first value and the second critical value;
if the first value is equal to the second critical value, the brightness level of the first value is the brightness level of the second critical value;
if the first value is smaller than the second critical value, determining the relationship between the first value and the first critical value;
if the first value is smaller than the first critical value, the brightness level of the first value is obtained as the first brightness level; if the first value is larger than the first critical value, the brightness level of the first value is obtained as the second brightness level; if the first value is equal to the first critical value, the brightness level of the first value is the brightness level of the first critical value;
if the first value is larger than the second critical value, determining the relationship between the first value and the third critical value;
if the first value is smaller than the third critical value, the brightness level of the first value is obtained as the third brightness level; if the first value is larger than the third critical value, the brightness level of the first value is obtained as the fourth brightness level; and if the first value is equal to the third critical value, the brightness level of the first value is the brightness level of the third critical value.
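The comparison order in claim 23 (middle threshold first, then one side) is a two-step binary search over the three critical values. A sketch with assumed lux thresholds; the patent leaves the level of a value exactly on a threshold as "the brightness level of the critical value", modeled here as its own label:

```python
# Sketch of the claim-23 level lookup. C1/C2/C3 are illustrative values for
# the first/second/third critical values; the patent discloses no numbers.

C1, C2, C3 = 100, 1000, 10000  # assumed critical values

def brightness_level(value):
    if value == C2:
        return "level_of_C2"
    if value < C2:                 # left half: compare with C1
        if value == C1:
            return "level_of_C1"
        return "level_1" if value < C1 else "level_2"
    if value == C3:                # right half: compare with C3
        return "level_of_C3"
    return "level_3" if value < C3 else "level_4"
```

Comparing against the middle threshold first means every lookup costs at most two comparisons plus the equality checks.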
24. The method of any of claims 20 to 23, wherein the ambient light data meeting the storage condition comprises:
after the second processor empties the first storage space, the first piece of ambient light data received is ambient light data satisfying the storage condition;
when the absolute value of the difference between the maximum value and the minimum value among the currently received ambient light data and the ambient light data currently stored in the first storage space is smaller than the first threshold, the currently received ambient light data is ambient light data satisfying the storage condition;
the condition for emptying the first storage space comprises:
if the absolute value of the difference between the maximum value and the minimum value among the ambient light data currently received by the second processor and the ambient light data currently stored in the first storage space is not less than the first threshold and is less than a second threshold, the condition for emptying the first storage space is satisfied.
25. A method for detecting ambient light, applied to an electronic device including a first processor, the method comprising:
the first processor acquires a first value acquired by an ambient light sensor on the electronic equipment in a first sampling mode;
the first processor determines a maximum value and a minimum value in first ambient light data, wherein the first ambient light data includes the first value and ambient light data stored in a first storage space, and the ambient light data stored in the first storage space is ambient light data that satisfies a storage condition and has been stored since the first processor last emptied the first storage space;
if the absolute value of the difference between the maximum value and the minimum value in the first ambient light data is smaller than a first threshold, the first processor stores the first value in the first storage space;
after storing the first value in the first storage space, if the ambient light data stored in the first storage space is a first amount of ambient light data, the first processor instructs the ambient light sensor to collect the ambient light based on a second sampling mode, wherein a collection period of the second sampling mode is greater than a collection period of the first sampling mode.
26. An electronic device, characterized in that the electronic device comprises a first processor and a second processor configured to run a computer program stored in a memory, so as to cause the electronic device to perform the method according to any one of claims 1 to 19.
27. A chip system comprising a processor coupled to a memory, the processor executing a computer program stored in the memory to implement the method of any of claims 20 to 24 or the method of claim 25.
28. A computer-readable storage medium, in which a computer program is stored which, when run on a processor, implements the method of any one of claims 20 to 24 and/or the method of claim 25.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110905134.8A CN115931115A (en) | 2021-08-06 | 2021-08-06 | Detection method of ambient light, electronic equipment, chip system and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115931115A true CN115931115A (en) | 2023-04-07 |
Family
ID=86551054
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110905134.8A Pending CN115931115A (en) | 2021-08-06 | 2021-08-06 | Detection method of ambient light, electronic equipment, chip system and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115931115A (en) |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113804290B (en) | Ambient light detection method, electronic device and chip system | |
CN109793498B (en) | Skin detection method and electronic equipment | |
CN110956939B (en) | Method for adjusting screen brightness and electronic equipment | |
CN113630572A (en) | Frame rate switching method and related device | |
CN113810601B (en) | Terminal image processing method and device and terminal equipment | |
WO2022100685A1 (en) | Drawing command processing method and related device therefor | |
WO2022007862A1 (en) | Image processing method, system, electronic device and computer readable storage medium | |
CN115794287A (en) | Display method, electronic equipment and computer storage medium | |
CN114999421B (en) | Screen brightness adjusting method and electronic equipment | |
CN114579076A (en) | Data processing method and related device | |
CN113810603B (en) | Point light source image detection method and electronic equipment | |
CN113744750B (en) | Audio processing method and electronic equipment | |
CN113448482A (en) | Sliding response control method and device of touch screen and electronic equipment | |
CN115565208A (en) | Display method and electronic equipment | |
CN114089932A (en) | Multi-screen display method and device, terminal equipment and storage medium | |
CN113810589A (en) | Electronic device, video shooting method and medium thereof | |
CN113572948A (en) | Video processing method and video processing device | |
CN110473562B (en) | Audio data processing method, device and system | |
WO2024156206A9 (en) | Display method and electronic device | |
CN113852755A (en) | Photographing method, photographing apparatus, computer-readable storage medium, and program product | |
CN113805983B (en) | Method for adjusting window refresh rate and electronic equipment | |
CN115931115A (en) | Detection method of ambient light, electronic equipment, chip system and storage medium | |
CN114461093A (en) | Detection method of ambient light, electronic equipment, chip system and storage medium | |
CN113837990B (en) | Noise monitoring method, electronic equipment, chip system and storage medium | |
CN113820008B (en) | Ambient light detection method, electronic device and chip system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||