WO2023016008A1 - Parameter adjustment method, display control method, electronic device and medium - Google Patents
Parameter adjustment method, display control method, electronic device and medium
- Publication number
- WO2023016008A1 (PCT/CN2022/092653)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image, ambient light, brightness, light brightness, display screen
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/76—Circuitry for compensating brightness variation in the scene by influencing the image signals
Definitions
- the present application relates to the technical field of image processing, and in particular to a parameter adjustment method, a display control method, an electronic device, and a medium.
- the electronic device may use a camera to capture a face image of the user and determine, from the face image, whether the user has been gazing at the display screen for a long time.
- the quality of the image captured by the camera limits the accuracy with which the electronic device determines whether the user's eyes are gazing at the display. For this reason, after the electronic device obtains the image captured by the camera, if it judges that the image is underexposed or overexposed, it adjusts the exposure parameters of the camera so that the camera can capture images of normal brightness with reasonable exposure parameters. However, adjusting the exposure parameters of the camera takes a long time, which makes detecting whether the human eyes in the image are gazing at the display screen both time-consuming and inaccurate.
- the present application provides a parameter adjustment method, a display control method, an electronic device and a medium, with the purpose of reducing the time the electronic device spends adjusting the exposure parameters of the camera, thereby reducing the time needed to detect human eyes gazing at the display screen in the image and improving the detection accuracy.
- the present application provides a method for adjusting parameters, which is applied to an electronic device.
- the electronic device includes a front-facing camera and a display screen, and the front-facing camera is configured to run in response to a first instruction and capture images with initial exposure parameters.
- the initial exposure parameters of the front camera match the brightness of the ambient light;
- the parameter adjustment method provided by the application includes: when the display screen of the electronic device displays data, acquiring the image captured by the front camera; using the image data of the image to calculate the image brightness; and adjusting the initial exposure parameters of the front camera based on the difference between the image brightness and the standard brightness to obtain an exposure parameter adjustment value, which is configured to the front camera; the front camera then receives a second instruction and, in response to the second instruction, runs and captures images with the exposure parameter adjustment value.
- because the front camera is configured to run and capture images with initial exposure parameters that match the ambient light brightness, the image brightness of the images it captures with the initial exposure parameters is already relatively reasonable, neither too bright nor too dark.
- as a result, adjusting the initial exposure parameters based on the difference between the image brightness and the standard brightness requires relatively few iterations; the exposure parameters do not need to be adjusted many times, which reduces the time spent adjusting the exposure parameters of the front camera.
- exposure parameters that meet the image brightness requirement are therefore obtained quickly, so that an image whose brightness meets the requirement can be used as early as possible to judge whether the human eye is gazing at the display screen, which improves the accuracy of that judgment.
- when data is displayed on the display screen, before acquiring the image captured by the front-facing camera (which runs with the initial exposure parameters in response to the first instruction), the method further includes: determining the initial exposure parameters by using the ambient light brightness.
- using the ambient light brightness to determine the initial exposure parameter includes: determining an exposure parameter that matches the ambient light brightness based on a correspondence between the ambient light brightness and the exposure parameter.
- the method for generating the correspondence between ambient light brightness and exposure parameters includes: acquiring multiple sets of sample images, where each set of sample images corresponds to one ambient light brightness and includes multiple sample images, each sample image corresponding to an exposure parameter; and generating the correspondence between ambient light brightness and exposure parameters by using the ambient light brightness corresponding to each set of sample images and the exposure parameters corresponding to those sample images in each set in which the human eye is gazing at the display screen.
- alternatively, the method for generating the correspondence between ambient light brightness and exposure parameters includes: acquiring a plurality of historical exposure parameter adjustment values and the ambient light brightness corresponding to each historical exposure parameter adjustment value, where each historical exposure parameter adjustment value meets the standard brightness requirement; and generating the correspondence between ambient light brightness and exposure parameters by using the historical exposure parameter adjustment values and the ambient light brightness corresponding to each of them.
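One way to picture this correspondence is as a small lookup table keyed by ambient light brightness. The sketch below is purely illustrative: the data structure, the sample values and the nearest-match query are assumptions, not taken from the application. It shows how historical exposure parameter adjustment values could be stored and how an initial exposure parameter matching the current ambient light brightness could be retrieved from them.

```python
class ExposureCorrespondence:
    """Hypothetical correspondence between ambient light brightness (lux) and
    exposure parameters (exposure time in microseconds, analog gain, digital gain)."""

    def __init__(self):
        # (ambient_lux, (exposure_us, analog_gain, digital_gain)) pairs.
        self.entries = []

    def update(self, ambient_lux, exposure_params):
        """Record a historical exposure parameter adjustment value that met the
        standard brightness requirement, together with its ambient light brightness."""
        self.entries.append((ambient_lux, exposure_params))

    def initial_exposure(self, ambient_lux):
        """Return the exposure parameters recorded for the closest ambient light brightness."""
        if not self.entries:
            return None
        return min(self.entries, key=lambda e: abs(e[0] - ambient_lux))[1]


# Example: the brightness reported by the ambient light sensor selects the
# initial exposure parameters configured to the front camera.
lut = ExposureCorrespondence()
lut.update(50.0, (20000, 2.0, 1.0))     # assumed values for a dim room
lut.update(10000.0, (500, 1.0, 1.0))    # assumed values for bright daylight
print(lut.initial_exposure(8000.0))     # -> (500, 1.0, 1.0)
```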
- after the initial exposure parameters of the front camera are adjusted based on the difference between the image brightness of the image and the standard brightness and the exposure parameter adjustment value is obtained, the method further includes: updating the correspondence between ambient light brightness and exposure parameters with the exposure parameter adjustment value.
- the initial exposure parameters include: at least one of exposure duration, analog gain, and digital gain.
- before adjusting the initial exposure parameters of the front camera based on the difference between the image brightness and the standard brightness to obtain the exposure parameter adjustment value, the method further includes: determining the standard brightness by using the ambient light brightness.
- the standard brightness is determined by the ambient light brightness; the exposure parameters of the front camera are adjusted based on this standard brightness, and the front camera shoots with the exposure parameter adjustment value to obtain an image that meets the standard brightness requirement corresponding to the ambient light brightness. This ensures the image quality of the image captured by the front camera and improves the accuracy of judging whether the human eye in the image is gazing at the display screen.
- using ambient light brightness to determine a standard brightness includes: determining a standard brightness that matches the ambient light brightness based on a correspondence between the ambient light brightness and the standard brightness.
- the method for generating the corresponding relationship between ambient light brightness and standard brightness includes: acquiring multiple sets of sample images, one set of sample images corresponds to one ambient light brightness, and one set of sample images includes multiple sample images; A corresponding relationship between the ambient light brightness and the standard brightness is generated by using the ambient light brightness corresponding to the multiple sets of sample images, and the image brightness of the sample image of the human eyes watching the display screen in the multiple sets of sample images.
- the corresponding relationship between the ambient light brightness and the standard brightness includes: multiple ambient light brightness intervals and standard brightness matching each ambient light brightness interval.
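An interval-based correspondence of this kind could be expressed as in the following sketch; the interval boundaries and standard brightness values are illustrative assumptions only, not figures from the application.

```python
def standard_brightness(ambient_lux):
    """Return the standard brightness matched to the ambient light brightness interval.

    The boundaries (10 lux, 20000 lux) and the returned values are assumed
    placeholders; the application only states that each ambient light brightness
    interval has a matching standard brightness.
    """
    if ambient_lux < 10.0:        # low-light interval
        return 70.0
    if ambient_lux > 20000.0:     # strong-light interval
        return 150.0
    return 110.0                  # normal-light interval
```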
- using the image data of the image to calculate the image brightness of the image includes: obtaining the RGB components of each pixel in the image; calculating the mean value of the RGB components over all pixels in the image; and taking the calculated mean as the image brightness of the image.
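A minimal sketch of this brightness measure, assuming an 8-bit RGB frame held as a NumPy array (the array layout is an assumption; the application only specifies averaging the RGB components):

```python
import numpy as np

def image_brightness(rgb_image):
    """Mean of the R, G and B components over all pixels, on a 0-255 scale.

    rgb_image: array of shape (height, width, 3), dtype uint8.
    """
    return float(np.mean(rgb_image))


# Example: the resulting image brightness is compared against the standard
# brightness that matches the current ambient light brightness.
frame = np.full((480, 640, 3), 96, dtype=np.uint8)
print(image_brightness(frame))  # -> 96.0
```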
- when data is displayed on the display screen, after acquiring the image captured by the front camera (which runs with the initial exposure parameters in response to the first instruction), the method further includes: configuring a confidence level for the image according to the comparison result between the image data of the image and a sample feature library, where the confidence level is used to characterize the probability that the human eye in the image is gazing at the display screen; and if the confidence level of the image is not less than a preset threshold value that matches the ambient light brightness, controlling the display screen not to turn off.
- because the preset threshold value matches the ambient light brightness, comparing the confidence level of the image with this preset threshold value ensures that the threshold used in the comparison meets the requirement of the ambient light brightness, which guarantees the accuracy of detecting human eyes gazing at the display screen in the image.
- before controlling the display screen not to turn off, the method further includes: determining that, among the images captured by the front camera within a preset time period, the confidence level of at least one frame of image is not less than the preset threshold value, where the preset time period is determined by the screen-off time set for the display.
- it further includes: if the confidence levels of the images captured by the front-facing camera within a preset period of time are all less than a preset threshold value, controlling the display screen to turn off.
- when data is displayed on the display screen, before acquiring the image captured by the front camera (which runs with the initial exposure parameters in response to the first instruction), the method further includes: determining the preset threshold value by using the ambient light brightness.
- using ambient light brightness to determine the preset threshold value includes: determining a threshold value that matches the ambient light brightness based on a correspondence between the ambient light brightness and the threshold value.
- the correspondence between the ambient light brightness and the threshold value includes: a plurality of ambient light brightness intervals, and a matching threshold value for each ambient light brightness interval.
- the threshold value for matching the strong light interval and the threshold value for matching the low light interval are smaller than the threshold value for matching the normal light interval.
- if the image brightness of the image captured by the camera is too low or too high, the image is underexposed or overexposed, and the confidence level configured for the image will be reduced; for this reason, the threshold value matching the strong light interval and the threshold value matching the low light interval are smaller than the threshold value matching the normal light interval.
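The interval-to-threshold matching described above might be sketched as follows; the interval boundaries and threshold values are assumptions chosen only to illustrate that the strong-light and low-light intervals receive smaller thresholds than the normal-light interval.

```python
def confidence_threshold(ambient_lux):
    """Return the gaze-confidence threshold matched to the ambient light interval.

    Under- or over-exposed frames tend to receive lower confidence, so the
    low-light and strong-light intervals use a smaller threshold than the
    normal-light interval. All numbers are assumed placeholders.
    """
    LOW_LIGHT_MAX = 10.0        # assumed boundary, lux
    STRONG_LIGHT_MIN = 20000.0  # assumed boundary, lux
    if ambient_lux < LOW_LIGHT_MAX or ambient_lux > STRONG_LIGHT_MIN:
        return 0.6              # smaller threshold for low/strong light
    return 0.8                  # threshold for normal light
```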
- before controlling the display screen not to turn off, the method further includes: acquiring a face detection result of the image, and determining that the face detection result indicates that a face is detected.
- the face detection result of the image is additionally obtained, and only after the face detection result is determined to indicate a detected face is the display screen kept on; this avoids misjudging whether the human eyes in the image are gazing at the display screen when only the preset threshold value matched with the ambient light brightness is used.
- the present application provides a display control method, which is applied to an electronic device.
- the electronic device includes a front camera and a display screen.
- the display control method includes: when data is displayed on the display screen, acquiring the image captured by the front camera; configuring a confidence level for the image according to the comparison result between the image data of the image and a sample feature library, where the confidence level is used to represent the probability that the human eyes in the image are gazing at the display screen; and if the confidence level of the image is not less than a preset threshold value, determining that the human eyes in the image are gazing at the display screen, where the preset threshold value matches the ambient light brightness.
- because the preset threshold value matches the ambient light brightness, comparing the confidence level of the image with this preset threshold value ensures that the threshold used in the comparison meets the requirement of the ambient light brightness, which guarantees the accuracy of detecting human eyes gazing at the display screen in the image.
- the method further includes: controlling the display screen not to turn off.
- before controlling the display screen not to turn off, the method further includes: determining that, among the multiple frames of images captured by the front camera within a preset time period, there is at least one frame in which the human eyes are gazing at the display screen, where the preset time period is determined by the screen-off time set for the display.
- the method further includes: if the confidence level of the image is less than a preset threshold value, determining that the human eyes in the image are not looking at the display screen.
- before determining that the human eyes in the image are gazing at the display screen, where the preset threshold value matches the ambient light brightness, the method further includes: using the ambient light brightness to determine the preset threshold value.
- using ambient light brightness to determine the preset threshold value includes: determining a threshold value that matches the ambient light brightness based on a correspondence between the ambient light brightness and the threshold value.
- the correspondence between the ambient light brightness and the threshold value includes: a plurality of ambient light brightness intervals, and a matching threshold value for each ambient light brightness interval.
- the threshold value for matching the strong light interval and the threshold value for matching the low light interval are smaller than the threshold value for matching the normal light interval.
- if the image brightness of the image captured by the camera is too low or too high, the image is underexposed or overexposed, and the confidence level configured for the image will be reduced; for this reason, the threshold value matching the strong light interval and the threshold value matching the low light interval are smaller than the threshold value matching the normal light interval.
- the initial exposure parameters of the front camera match the ambient light brightness.
- because the front camera is configured to run and capture images with initial exposure parameters that match the ambient light brightness, the image brightness of the images it captures with the initial exposure parameters is already relatively reasonable, neither too bright nor too dark.
- as a result, adjusting the initial exposure parameters based on the difference between the image brightness and the standard brightness requires relatively few iterations; the exposure parameters do not need to be adjusted many times, which reduces the time spent adjusting the exposure parameters of the front camera.
- exposure parameters that meet the image brightness requirement are therefore obtained quickly, so that an image whose brightness meets the requirement can be used as early as possible to judge whether the human eye is gazing at the display screen, which improves the accuracy of that judgment.
- when data is displayed on the display screen, before acquiring the image captured by the front camera, the method further includes: determining the initial exposure parameters based on the correspondence between ambient light brightness and exposure parameters.
- the method for generating the correspondence between ambient light brightness and exposure parameters includes: acquiring multiple sets of sample images, where each set of sample images corresponds to one ambient light brightness and includes multiple sample images, each sample image corresponding to an exposure parameter; and generating the correspondence between ambient light brightness and exposure parameters by using the ambient light brightness corresponding to each set of sample images and the exposure parameters corresponding to those sample images in each set in which the human eye is gazing at the display screen.
- alternatively, the method for generating the correspondence between ambient light brightness and exposure parameters includes: acquiring a plurality of historical exposure parameter adjustment values and the ambient light brightness corresponding to each historical exposure parameter adjustment value, where each historical exposure parameter adjustment value meets the standard brightness requirement; and generating the correspondence between ambient light brightness and exposure parameters by using the historical exposure parameter adjustment values and the ambient light brightness corresponding to each of them.
- the initial exposure parameters include: at least one of exposure duration, analog gain, and digital gain.
- before determining that the human eyes in the image are gazing at the display screen, the method further includes: acquiring a face detection result of the image, and determining that the face detection result indicates that a face is detected.
- the face detection result of the image is additionally obtained, and only after the face detection result is determined to indicate a detected face is it determined that the human eyes in the image are gazing at the display screen; this avoids misjudging whether the human eyes in the image are gazing at the display screen when only the preset threshold value matched with the ambient light brightness is used.
- the present application provides an electronic device, including: a display screen, an ambient light detector, a front-facing camera, one or more processors, and a memory storing a program, wherein: the display screen is used to display data; the ambient light detector is used to detect the ambient light to obtain the ambient light brightness; the front camera is used to run and capture images with the initial exposure parameters when the display screen displays data; and when the program in the memory is executed by the one or more processors, the electronic device executes the parameter adjustment method of the first aspect or any possible implementation of the first aspect, or the display control method of the second aspect or any possible implementation of the second aspect.
- the present application provides a readable storage medium on which a computer program is stored, wherein, when the computer program is executed by a processor, the parameter adjustment method of the first aspect or any possible implementation of the first aspect is implemented.
- FIG. 1 is a diagram of an application scenario of an electronic device provided by the present application
- Figure 2a is a schematic structural diagram of the electronic device provided by the present application.
- Figure 2b is a diagram showing the operation process of the logic unit in the electronic device provided by the present application.
- FIG. 3 is a display diagram of multiple frames of images taken by the camera provided by the present application.
- FIG. 4a is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
- FIG. 4b is an example diagram of a software structure of an electronic device provided in an embodiment of the present application.
- FIG. 5 is a schematic structural diagram of an electronic device provided in Embodiment 1 of the present application.
- FIG. 6 is a display diagram of multiple frames of images captured by the camera provided in Embodiment 1 of the present application.
- FIG. 7 is a schematic structural diagram of an electronic device provided in Embodiment 2 of the present application.
- FIG. 8 is a display diagram of multiple frames of images captured by the camera provided in Embodiment 2 of the present application.
- FIG. 9 is a schematic structural diagram of an electronic device provided in Embodiment 3 of the present application.
- FIG. 10 is a display diagram of multiple frames of images captured by the camera provided in Embodiment 3 of the present application.
- FIG. 11 is a display diagram of multiple frames of images captured by the camera provided in Embodiment 3 of the present application.
- FIG. 12 is a flowchart of a display control method provided in Embodiment 4 of the present application.
- words such as “exemplary” or “for example” are used as examples, illustrations, or explanations. Any embodiment or design scheme described as “exemplary” or “for example” in the embodiments of the present application shall not be interpreted as being more preferred or more advantageous than other embodiments or design schemes. Rather, the use of words such as “exemplary” or “for example” is intended to present related concepts in a concrete manner.
- Users can consult webpages, news, articles, etc. through electronic devices, and can also play games and watch videos through electronic devices.
- when users browse webpages, news or articles, play games, or watch videos through electronic devices, they will gaze at the display screen of the electronic device for a long time. If the electronic device detects this, the corresponding events will be executed, such as keeping the display screen on or lowering the ringtone volume.
- Figure 1 shows a scenario where a user browses a web page through an electronic device.
- the following uses this scenario as an example to introduce a solution in which the electronic device detects that the user is staring at the display screen for a long time and executes a corresponding event.
- the lightweight image front end (Image Front End lite, IFE lite) unit is an integrated unit in the image signal processor; the image output by the camera reaches the IFE lite unit, which stores the image output by the camera in the security buffer of the memory.
- the automatic exposure module belongs to a logic unit of the controller, and is obtained by running an automatic exposure (automatic exposure, AE) algorithm by the controller.
- the AO (always on) module is also a logical unit of the controller, which is obtained by running the AO (always on) scheme on the controller.
- the AO solution refers to the intelligent perception solution based on the AO camera (always on camera), which usually includes functions such as human eye gaze recognition, machine owner recognition, and gesture recognition.
- the typical feature is long-term low-power operation.
- the camera driver is also a logical unit of the controller, which is used to configure parameters for the camera and turn the camera on or off.
- the display screen of the electronic device displays the webpage, and the user looks at the display screen of the electronic device to view the webpage.
- the electronic device sends an instruction, and the front camera of the electronic device operates after responding to the instruction, and executes step S1 to capture the face image of the user.
- the simplest image front-end unit performs step S2 to read the face image, and stores the face image in the security buffer of the internal memory based on the security mechanism.
- the AO module executes step S3-1, acquires the image data of the face image stored in the security buffer of the internal memory, and determines whether the user's eyes are watching the display screen by analyzing the image data.
- if the AO module determines that the user's eyes are gazing at the display screen, step S4 is executed to control the display screen of the electronic device not to turn off.
- the image quality of the face image captured by the camera restricts the accuracy of the AO module in determining whether the user's eyes are gazing at the display. In particular, when the image brightness of the face image captured by the camera is too high or too low, the error of the AO module in determining whether the user's eyes are gazing at the display screen is relatively large. For this reason, in Figure 2a, the automatic exposure module obtains the image data of the face image stored in memory, as shown in step S3-2; uses the image data to calculate the image brightness of the face image; compares the calculated image brightness with the standard brightness to obtain a comparison result; and adjusts the exposure parameters of the camera, generally the exposure time and the gain, according to the comparison result, obtaining an exposure time adjustment value and a gain adjustment value.
- the automatic exposure module also executes step S5 and transmits the calculated exposure time adjustment value and gain adjustment value to the AO module; the AO module then sends the exposure time adjustment value and gain adjustment value to the camera driver, as shown in step S6 in Figure 2a; and the camera driver, as shown in step S7 in Figure 2a, configures the camera to run with the exposure time adjustment value and the gain adjustment value.
- the electronic device can send an instruction again, and the camera responds to the instruction of the electronic device, and operates to capture images with the adjustment value of the exposure time and the adjustment value of the gain.
- the following describes how the AO module analyzes the image data, determines whether the user's eyes are watching the display screen, and the automatic exposure module adjusts the exposure parameters of the camera in conjunction with FIG. 2b.
- the image sequence includes multiple frames of images captured by the camera, such as image frames 1, 2, 3, 4...n, where the camera starts to operate with a common exposure time and gain.
- the common exposure time and gain can be pre-set.
- the automatic exposure module sequentially acquires the image data of each frame of the image sequence according to the storage order of the images.
- the automatic exposure module uses the image data of image frame 1 to calculate the image brightness of image frame 1, compares the image brightness of image frame 1 with the standard brightness, and obtains the comparison result .
- if the comparison result reflects that the difference between the image brightness of image frame 1 and the standard brightness is less than a preset value, the automatic exposure module performs no operation, and the camera keeps running with the original exposure time and gain, i.e., the aforementioned common exposure time and gain. If the comparison result reflects that the difference between the image brightness of image frame 1 and the standard brightness is not less than the preset value, the automatic exposure module adjusts the exposure time and gain of the camera according to the comparison result, and obtains the exposure time 1 adjustment value and the gain 1 adjustment value.
- the automatic exposure module transmits the exposure time 1 adjustment value and the gain 1 adjustment value to the camera driver through the AO module.
- the camera driver configures the camera to capture images with the exposure time 1 adjustment value and the gain 1 adjustment value.
- following the above processing method, the automatic exposure module also calculates the exposure time 1 adjustment value and the gain 1 adjustment value from the image data of image frame 2 and image frame 3; the camera driver likewise configures the camera to run and capture images with the exposure time 1 adjustment value and the gain 1 adjustment value.
- image frame 4 is captured by the camera configured with the exposure time 1 adjustment value and the gain 1 adjustment value.
- the automatic exposure module also follows the above processing method and uses the image data of image frame 4 to calculate the exposure time 2 adjustment value and the gain 2 adjustment value; the camera driver configures the camera to run and capture images with the exposure time 2 adjustment value and the gain 2 adjustment value. This is repeated until the difference between the image brightness of the image frame compared by the automatic exposure module and the standard brightness is less than a preset value, such as ±10%, at which point the adjustment stops.
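Schematically, this feedback loop might look like the sketch below; the proportional update rule, the split between exposure time and gain, and the helper callables are assumptions for illustration, since the application does not prescribe a specific update rule.

```python
def auto_exposure_loop(capture_brightness, configure_camera, standard_brightness,
                       exposure_us, gain, tolerance=0.10, max_iters=10):
    """Adjust exposure time and gain until the frame brightness is within
    `tolerance` (e.g. +/-10%) of the standard brightness.

    capture_brightness: callable returning the mean image brightness of a new frame.
    configure_camera: callable(exposure_us, gain) applying the adjustment values.
    """
    for _ in range(max_iters):
        brightness = capture_brightness()
        error = (standard_brightness - brightness) / standard_brightness
        if abs(error) < tolerance:
            break                                   # brightness meets the requirement
        # Proportional correction, split between exposure time and gain
        # (illustrative only).
        exposure_us = max(1, int(exposure_us * (1.0 + error)))
        gain = max(1.0, gain * (1.0 + 0.5 * error))
        configure_camera(exposure_us, gain)
    return exposure_us, gain
```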
- the AO module also sequentially acquires the image data of each frame of the image sequence according to the storage order of the images. For the image data of each frame of image acquired by the AO module, the AO module executes the following process to obtain the judgment result of human eyes watching the display screen in each frame of image. The following takes the image data of the image frame 1 processed by the AO module as an example.
- the AO module compares the image data of image frame 1 with the sample feature library, and configures a confidence level for image frame 1 according to the comparison result between the image data of image frame 1 and the sample feature library; the confidence level is used to characterize the probability that the human eye in image frame 1 is gazing at the display screen. The AO module then judges whether the confidence level of image frame 1 is less than the threshold value: if the confidence level of image frame 1 is not less than the threshold value, it is determined that the human eyes in image frame 1 are gazing at the display screen; if the confidence level of image frame 1 is less than the threshold value, it is determined that the human eyes in image frame 1 are not gazing at the display screen.
- the sample feature library includes feature data of human eyes gazing at images on the display screen.
- the feature data is determined as follows: a large number of sample images are obtained, including sample images in which the human eyes are gazing at the display screen and sample images in which they are not; the image data of each sample image is used for learning, so as to obtain feature data that represents images in which the human eye is gazing at the display screen.
- the sample images where the human eyes look at the display screen and the sample images where the human eyes do not watch the display screen both refer to the face images captured by the front camera of the electronic device.
- if the AO module determines that, among the images captured within a certain period of time, there is at least one frame in which the human eye is gazing at the display screen, it executes the corresponding events, such as controlling the display screen of the electronic device not to turn off and lowering the ringtone volume.
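Reduced to its essentials, that decision could be sketched as below; the helper name, the fixed frame interval and the example numbers are assumptions, and the per-frame confidence itself comes from the AO scheme's recognition model.

```python
def keep_screen_on(frame_confidences, threshold, screen_off_timeout_s,
                   frame_interval_s=1.0):
    """Decide whether to suppress screen-off.

    frame_confidences: gaze confidences configured for the frames captured
        within the preset time period (derived from the screen-off time set
        for the display).
    threshold: the preset threshold value matched to the ambient light brightness.
    Returns True if at least one frame's confidence reaches the threshold,
    i.e. a human eye is judged to be gazing at the display in that frame.
    """
    window = max(1, int(screen_off_timeout_s / frame_interval_s))
    recent = frame_confidences[-window:]
    return any(c >= threshold for c in recent)


# Example: with a 15 s screen-off timeout and one frame per second,
# a single gazing frame keeps the display on.
print(keep_screen_on([0.2, 0.3, 0.85, 0.4], threshold=0.8,
                     screen_off_timeout_s=15))  # -> True
```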
- the above scheme, in which the AO module analyzes the image data to determine whether the user's eyes are gazing at the display screen and the automatic exposure module adjusts the exposure parameters of the camera, has the following problems:
- Problem 1: it takes a long time for the automatic exposure module to adjust the exposure parameters of the camera, which reduces the accuracy with which the AO module detects human eyes gazing at the display screen in the image.
- image 11 is obtained when the camera operates with the common exposure time and gain in a dark environment.
- image 21 is obtained when the camera operates with the common exposure time and gain in a bright environment with strong light.
- the automatic exposure module acquires the image data of the image 11, and uses the image data of the image 11 to adjust the general exposure duration and gain of the camera through the processing flow proposed in FIG. 2b to obtain an exposure duration adjustment value and a gain adjustment value.
- the image captured by the camera with the exposure time adjustment value and the gain adjustment value is image 12 .
- the automatic exposure module obtains the image data of image 12 and further adjusts the exposure time and gain of the camera; this is repeated until the automatic exposure module uses the image data of image 14 to calculate an exposure time and gain that are provided to the camera, and the image brightness of image 15 that the camera then captures meets the requirement.
- the camera captures images 22 to 25 sequentially, and the brightness of the image 25 meets the requirements.
- the automatic exposure module shortens the difference between the image brightness of the image and the standard brightness in relatively small steps, adjusting the exposure time and gain accordingly; therefore, when the image acquired by the automatic exposure module is too bright or too dark, the exposure time and gain must be adjusted repeatedly before the image brightness of the images captured by the camera meets the requirement. This is why the automatic exposure module takes a long time to adjust the exposure parameters of the camera.
- the automatic exposure module repeatedly adjusts the exposure time and gain until the reasonable exposure time and gain provide the camera with an image that meets the requirements.
- the AO module obtains the image data of each frame of image captured by the camera to judge whether the human eyes in the image are gazing at the display. However, when the image brightness of an image does not meet the requirement, the accuracy of the AO module in determining whether the human eye in that image is gazing at the display screen is reduced.
- the electronic device may be in environments with different brightness.
- the automatic exposure adjustment module uses a common standard brightness to judge whether the image brightness of the image meets the requirements, and adjusts the exposure time and gain based on the standard brightness when the requirements are not met.
- a common standard brightness by itself cannot guarantee that the images captured by the camera have reasonable brightness in environments of different brightness, so adjusting the exposure time and gain based on that standard brightness prevents the automatic exposure adjustment module from adjusting the exposure time and gain accurately.
- when the exposure time and gain cannot be adjusted accurately, the image quality of the images the camera captures with the adjusted exposure time and gain is reduced; the AO module then uses images of low quality to determine whether the human eye in the image is gazing at the display screen, and the accuracy is necessarily not high.
- the confidence level that the AO module configures for the image will decrease.
- the confidence level of the configuration of the AO module is less than the threshold value due to insufficient or too high image brightness, which will lead to misjudgment by the AO module and affect the detection accuracy of the AO module.
- the three schemes provided by the embodiments of the present application can be applied to electronic devices such as mobile phones, tablet computers, desktop computers, laptop computers, notebook computers, ultra-mobile personal computers (Ultra-mobile Personal Computer, UMPC), handheld computers, netbooks, personal digital assistants (Personal Digital Assistant, PDA), wearable electronic devices, and smart watches.
- Fig. 4a is a composition example of an electronic device provided by the embodiment of the present application.
- the electronic device 400 may include a processor 410, an external memory interface 420, an internal memory 421, a display screen 430, a camera 440, an antenna 1, an antenna 2, a mobile communication module 450, a wireless communication module 460, and an ambient light sensor 470 etc.
- the structure shown in this embodiment does not constitute a specific limitation on the electronic device.
- the electronic device may include more or fewer components than shown, or combine some components, or separate some components, or arrange different components.
- the illustrated components can be realized in hardware, software or a combination of software and hardware.
- the processor 410 may include one or more processing units, for example: the processor 410 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural network processor (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
- the controller may be the nerve center and command center of the electronic device 400 .
- the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
- Video codecs are used to compress or decompress digital video.
- An electronic device may support one or more video codecs.
- the electronic device can play or record video in multiple encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
- the NPU is a neural-network (NN) computing processor.
- Applications such as intelligent cognition of electronic devices can be realized through NPU, such as: image recognition, face recognition, speech recognition, text understanding, etc.
- a memory may also be provided in the processor 410 for storing instructions and data.
- the memory in processor 410 is a cache memory.
- the memory may hold instructions or data that the processor 410 has just used or recycled. If the processor 410 needs to use the instruction or data again, it can be called directly from the memory. Repeated access is avoided, and the waiting time of the processor 410 is reduced, thus improving the efficiency of the system.
- processor 410 may include one or more interfaces.
- the interface may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous transmitter (universal asynchronous receiver/transmitter, UART) interface, mobile industry processor interface (mobile industry processor interface, MIPI), general-purpose input and output (general-purpose input/output, GPIO) interface, subscriber identity module (subscriber identity module, SIM) interface, and /or universal serial bus (universal serial bus, USB) interface, etc.
- the GPIO interface can be configured by software.
- the GPIO interface can be configured as a control signal or as a data signal.
- the GPIO interface can be used to connect the processor 410 with the display screen 430 , the camera 440 , the wireless communication module 460 and so on.
- the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
- the interface connection relationship between the modules shown in this embodiment is only for schematic illustration, and does not constitute a structural limitation of the electronic device 400 .
- the electronic device 400 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
- the external memory interface 420 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device.
- the external memory card communicates with the processor 410 through the external memory interface 420 to implement a data storage function. Such as saving music, video and other files in the external memory card.
- the internal memory 421 may be used to store computer-executable program codes including instructions.
- the processor 410 executes various functional applications and data processing of the electronic device 400 by executing instructions stored in the internal memory 421 .
- the internal memory 421 may include an area for storing programs and an area for storing data.
- the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
- the storage data area can store data (such as audio data, phone book, etc.) created during the use of the electronic device.
- the internal memory 421 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
- the processor 410 executes various functional applications and data processing of the electronic device by executing instructions stored in the internal memory 421 and/or instructions stored in a memory provided in the processor.
- the electronic device realizes the display function through the GPU, the display screen 430 , and the application processor.
- the GPU is a microprocessor for image processing, and is connected to the display screen 430 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
- Processor 410 may include one or more GPUs that execute program instructions to generate or alter display information.
- the display screen 430 is used to display images, videos and the like.
- the display screen 430 includes a display panel.
- the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active matrix organic light emitting diode or an active matrix organic light emitting diode (active-matrix organic light emitting diode, AMOLED), flexible light-emitting diode (flex light-emitting diode, FLED), Miniled, MicroLed, Micro-oled, quantum dot light emitting diodes (quantum dot light emitting diodes, QLED), etc.
- the electronic device may include 1 or N display screens 430, where N is a positive integer greater than 1.
- a series of graphical user interfaces (GUIs) can be displayed on the display screen 430 of the electronic device; these GUIs form the main screen of the electronic device.
- the size of the display screen 430 of the electronic device is fixed, and only limited controls can be displayed on the display screen 430 of the electronic device.
- a control is a GUI element: a software component contained in an application that controls all the data processed by the application and the interactions with these data. Users can interact with a control through direct manipulation, so as to read or edit the relevant information of the application.
- controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and Widgets.
- the display screen 430 may display virtual buttons (one-key arrangement, start arrangement, scene arrangement).
- the electronic device can realize the shooting function through ISP, camera 440 , video codec, GPU, display screen 430 and application processor.
- the ISP is used to process data fed back by the camera 440 .
- the light is transmitted to the photosensitive element of the camera through the lens, and the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye.
- ISP can also perform algorithm optimization on image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
- the ISP may be located in the camera 440 .
- the camera 440 includes a lens and a photosensitive element (also an image sensor). Camera 440 is used to capture still images or video. The object generates an optical image through the lens and projects it to the photosensitive element.
- the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
- the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
- the ISP outputs the digital image signal to the DSP for processing. DSP converts digital image signals into standard RGB, YUV and other image signals.
- the electronic device may include 1 or N cameras 440, where N is a positive integer greater than 1.
- DSP is used to process digital signals, in addition to processing digital image signals, it can also process other digital signals. For example, when an electronic device selects a frequency point, a digital signal processor is used to perform Fourier transform on the frequency point energy, etc.
- the wireless communication function of the electronic device can be realized by the antenna 1, the antenna 2, the mobile communication module 450, the wireless communication module 460, the modem processor and the baseband processor.
- Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
- Each antenna in an electronic device can be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
- Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
- the antenna may be used in conjunction with a tuning switch.
- the mobile communication module 450 can provide wireless communication solutions including 2G/3G/4G/5G applied to electronic devices.
- the mobile communication module 450 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
- the mobile communication module 450 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
- the mobile communication module 450 can also amplify the signal modulated by the modem processor, convert it into electromagnetic wave and radiate it through the antenna 1 .
- at least part of the functional modules of the mobile communication module 450 may be set in the processor 410 .
- at least part of the functional modules of the mobile communication module 450 and at least part of the modules of the processor 410 may be set in the same device.
- the wireless communication module 460 can provide wireless local area networks (wireless local area networks, WLAN) (such as wireless fidelity (Wireless fidelity, Wi-Fi) network), bluetooth (bluetooth, BT), global navigation satellite system, etc. (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication technology (near field communication, NFC), infrared technology (infrared, IR) and other wireless communication solutions.
- the wireless communication module 460 may be one or more devices integrating at least one communication processing module.
- the wireless communication module 460 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 410 .
- the wireless communication module 460 can also receive the signal to be transmitted from the processor 410 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
- the ambient light sensor 470 is used for sensing ambient light brightness.
- the electronic device can adaptively adjust the brightness of the display screen 430 according to the perceived ambient light brightness.
- the ambient light sensor 470 can also be used to automatically adjust the white balance when taking pictures.
- the ambient light sensor 470 can also cooperate with the proximity light sensor to detect whether the electronic device is in the pocket, so as to prevent accidental touch.
- an operating system runs on top of the above components.
- For example, the Hongmeng (HarmonyOS) system, the iOS operating system, the Android operating system, the Windows operating system, etc.
- An application program can be installed and run on the operating system.
- Fig. 4b is a block diagram of the software structure of the electronic device according to the embodiment of the present application.
- the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
- the Android system is divided into four layers, which are respectively the application program layer, the application program framework layer, the Android runtime (Android runtime) and the system library, and the kernel layer from top to bottom.
- the application layer can consist of a series of application packages. As shown in FIG. 4b, the application package may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, display, music, ringtone, and short message.
- the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
- the application framework layer includes some predefined functions. As shown in Figure 4b, the application framework layer may include a window manager, a content provider, a phone manager, a resource manager, a notification manager, a view system, etc.; in some embodiments of the present application, the application framework layer may also include a perception service.
- a window manager is used to manage window programs.
- the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
- Content providers are used to store and retrieve data and make it accessible to applications.
- Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
- the phone manager is used to provide communication functions of electronic devices. For example, the management of call status (including connected, hung up, etc.).
- the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
- the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction.
- the notification manager is used to notify the download completion, message reminder, etc.
- the notification manager can also present notifications that appear in the status bar at the top of the system in the form of a graph or scroll-bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window.
- For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, the indicator light flashes, and so on.
- the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
- the view system can be used to build applications.
- a display interface can consist of one or more views.
- a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
- the perception service is used to implement the AO scheme proposed above.
- For example, the perception service controls the display application of the application layer to display in a way that the screen does not turn off, and reduces the ringtone volume when a ringtone needs to be output.
- the Android Runtime includes core library and virtual machine.
- the Android runtime is responsible for the scheduling and management of the Android system.
- a cold start of an application runs in the Android runtime; the Android runtime obtains the optimized file status parameters of the application, can then judge from these optimized file status parameters whether the optimized file is outdated due to a system upgrade, and returns the judgment result to the application control module.
- the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
- the application layer and the application framework layer run in virtual machines.
- the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
- the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
- a system library can include multiple function modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
- the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
- the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
- the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
- the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, compositing and layer processing, etc.
- the 2D graphics engine is a drawing engine for 2D drawing.
- the kernel layer is the layer between hardware and software.
- the kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
- the camera driver is used to configure the parameters of the camera and turn the camera on or off.
- an embodiment of the present application provides a controller, which may be understood as a processing unit of the processor shown in FIG. 4a.
- the controller includes an AO module, an automatic exposure module and a camera driver.
- the AO module, automatic exposure module and camera driver are all logic units of the controller, and their functions are as described above.
- the AO module is configured with a table of correspondences between ambient light brightness and exposure parameters.
- the table of correspondences between ambient light brightness and exposure parameters includes a plurality of ambient light brightness and exposure parameters matching each ambient light brightness.
- the table of correspondences between ambient light brightness and exposure parameters includes multiple ambient light brightness intervals and exposure parameters that match each ambient light brightness interval.
- the exposure parameters include at least one of exposure duration and gain; since gain generally includes digital gain and analog gain, the exposure parameters may include at least one of exposure duration, digital gain, and analog gain.
- the exposure duration refers to the shutter speed, that is, how long the shutter stays open. The longer the exposure duration, the more photons reach the surface of the image sensor and the brighter the image captured by the camera; otherwise the image is darker. However, if the image is overexposed it will be too bright and image details will be lost, and if it is underexposed it will be too dark and image details will also be lost.
- the analog gain and digital gain refer to the amplification gain of the signal after double sampling, which are used to control the image sensor to collect data.
- the way of generating the correspondence table between ambient light brightness and exposure parameters is as follows:
- the front camera of the electronic device is configured to take pictures while operating with multiple exposure parameters under one ambient light brightness; the images captured by the front camera while the human eyes are watching the display screen are collected under different ambient light brightness levels, so that multiple frames of sample images are obtained for each ambient light brightness.
- multiple frames of sample images under the ambient light brightness are processed in the following manner to obtain exposure parameters matching the ambient light brightness.
- the AO module uses the image data of one frame of sample image to identify whether the human eyes in the sample image are watching the display screen. If the AO module recognizes that the human eyes in the image are not watching the display screen, the sample image is ignored and the next frame of sample image for that ambient light brightness is collected; if the AO module recognizes that the human eyes in the image are watching the display screen, the exposure parameters corresponding to the sample image are recorded and the next frame of sample image for that ambient light brightness is collected. In this way, the exposure parameters corresponding to multiple frames of sample images recorded under one ambient light brightness are obtained, and these recorded exposure parameters are used to calculate the exposure parameters matching that ambient light brightness.
- in some embodiments, the median of the exposure parameters corresponding to the recorded multi-frame sample images is selected as the exposure parameter matching the ambient light brightness; in other embodiments, the average of the exposure parameters corresponding to the recorded multi-frame sample images is calculated and used as the exposure parameter matching the ambient light brightness; in still other embodiments, one frame of sample image is randomly selected from the recorded multi-frame sample images, and the exposure parameter corresponding to the selected sample image is used as the exposure parameter matching the ambient light brightness.
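- The selection rule above (median, mean, or a randomly chosen sample) can be sketched as follows. This is a minimal illustration assuming the recorded exposure parameters are available as plain numbers; the function and variable names are illustrative and not part of the patent.

```python
import random
import statistics

def match_exposure_parameter(recorded_params, strategy="median"):
    """Derive the exposure parameter matched to one ambient light brightness
    from the exposure parameters recorded for sample frames in which the
    human eyes were watching the display screen."""
    if strategy == "median":
        return statistics.median(recorded_params)
    if strategy == "mean":
        return statistics.mean(recorded_params)
    # otherwise: use the exposure parameter of one randomly chosen sample frame
    return random.choice(recorded_params)

# Example: exposure durations (us) recorded for gazing frames at one brightness
print(match_exposure_parameter([4025, 3980, 4010, 3990], "median"))  # -> 4000.0
```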
- the way of generating the correspondence table between ambient light brightness and exposure parameters is as follows:
- each time the automatic exposure module executes the process of adjusting the exposure parameters of the camera, the adjusted exposure parameters corresponding to an image brightness whose difference from the standard brightness satisfies the preset value are recorded, together with the ambient light brightness at which the automatic exposure module executed that adjustment process. Using each recorded ambient light brightness and the corresponding adjusted exposure parameters of the automatic exposure module, a correspondence table between ambient light brightness and exposure parameters is constructed.
- Table 1 below provides an example of a table of correspondences between ambient light brightness and exposure parameters.
- the correspondence table shown in this example includes: multiple ambient light brightness ranges, and the matching exposure duration and analog gain for each ambient light brightness range.
- when the ambient light brightness is low, the exposure duration corresponding to the ambient light brightness is 4025 us, which is the maximum exposure duration. Therefore, the correspondence table between ambient light brightness and exposure parameters provided in Table 1 is shutter-priority, that is, when the ambient light brightness is low, the maximum exposure duration is set preferentially.
- Table 2 below provides another example of the correspondence table between ambient light brightness and exposure parameters.
- the correspondence table shown in this example also includes: multiple ambient light brightness ranges, and the matching exposure time and analog gain for each ambient light brightness range.
- the correspondence between ambient light brightness and exposure parameters shown in Table 2 is gain-priority. It can be seen from the data in the first five rows that, when the ambient light brightness is low, the maximum analog gain is set preferentially.
- Table 1 and Table 2 above are tables of correspondences between ambient light brightness, exposure duration, and analog gain, but this does not constitute a limitation on the table of correspondences between ambient light brightness and exposure parameters.
- the table of correspondences between ambient light brightness and exposure parameters may include the correspondence between ambient light brightness and exposure duration, analog gain, and digital gain.
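- A lookup over such a correspondence table reduces to finding the interval that contains the detected ambient light brightness. The sketch below assumes an illustrative shutter-priority table; the interval bounds and parameter values are placeholders, not the contents of Table 1 or Table 2.

```python
# Illustrative correspondence table: (upper bound of the ambient light
# brightness interval in lux, exposure duration in us, analog gain).
EXPOSURE_TABLE = [
    (10,           4025, 8.0),  # dark: maximum exposure duration first (shutter priority)
    (150,          4025, 4.0),
    (1000,         2000, 2.0),
    (80000,        1000, 1.0),
    (float("inf"),  500, 1.0),
]

def lookup_exposure(ambient_lux):
    """Screen out the exposure parameters matching the detected ambient light brightness."""
    for upper, exposure_us, analog_gain in EXPOSURE_TABLE:
        if ambient_lux < upper:
            return exposure_us, analog_gain
    return EXPOSURE_TABLE[-1][1:]

print(lookup_exposure(5))  # falls in the dark interval -> (4025, 8.0)
```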
- the AO module executes step S11 to obtain the ambient light brightness detected by the ambient light sensor and screens the correspondence table between ambient light brightness and exposure parameters, screening out the exposure parameters matching the ambient light brightness detected by the ambient light sensor; then, as shown in step S12, the AO module sends the exposure parameters matching the ambient light brightness to the camera driver.
- the AO module can also acquire the standard brightness, and execute step S13 to send the exposure parameters matching the standard brightness and ambient light brightness to the automatic exposure module.
- step S14 is executed to configure the camera to operate with the exposure parameters matching the brightness of the ambient light.
- the camera runs with the exposure parameters matched with the ambient light brightness, executes step S15, captures the face image of the user, and sends the captured face image.
- the simplest image front-end unit executes step S16, reads the face image, and stores the face image in the security buffer of the internal memory based on the security mechanism.
- the exposure parameters selected by the AO module to match the brightness of the ambient light may also be referred to as initial exposure parameters.
- the camera operates with the initial exposure parameters, and the face image is captured according to the shooting frequency of the camera to obtain the face image.
- the camera operates with the initial exposure parameters and can capture multiple frames of face images.
- the AO module executes step S17-1, acquires the image data of the face image stored in the security buffer of the internal memory, and then analyzes the image data to determine whether the user's eyes are watching the display screen.
- step S18 is executed to control the display screen of the electronic device not to turn off.
- the way the AO module analyzes the image data to determine whether the user's eyes are watching the display screen can be as described in Embodiment 3 below.
- the automatic exposure module executes step S17-2, obtains the image data of the face image stored in the security buffer of the internal memory, and then uses the image data of the face image to calculate the image brightness of the face image; compares the image brightness of the face image with the standard brightness Do the comparison and get the comparison result.
- the image data of the face image acquired by the automatic exposure module includes red (Red), green (Green), and blue (Blue) components of each pixel in the face image, and the automatic exposure module calculates each The average value of the red (Red), green (Green), and blue (Blue) components of the pixel is used as the image brightness of the face image.
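- The image brightness rule described above (the average of the red, green, and blue components over all pixels) can be written compactly; the sketch below assumes the face image is available as an H×W×3 RGB array, and the same rule is reused in the later embodiments.

```python
import numpy as np

def image_brightness(rgb_image):
    """Image brightness as the average of the red, green, and blue
    components over all pixels of the image."""
    return float(np.asarray(rgb_image, dtype=np.float64).mean())

# Example: a synthetic 4x4 RGB image with values in 0..255
frame = np.full((4, 4, 3), 128, dtype=np.uint8)
print(image_brightness(frame))  # -> 128.0
```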
- if the comparison result reflects that the difference between the image brightness of the face image and the standard brightness is less than the preset value, the automatic exposure module does not perform any operation, and the camera still operates with the exposure parameters matching the ambient light brightness.
- the automatic exposure module is also used to adjust the exposure parameter for ambient light brightness matching according to the comparison result to obtain an exposure parameter adjustment value.
- the automatic exposure module then executes step S19 to transmit the adjusted value of the exposure parameter to the AO module.
- the AO module executes step S20 to send the adjusted value of the exposure parameter to the camera driver.
- the camera is driven as shown in step S21, and the camera is configured to operate and capture images with the adjusted value of the exposure parameter.
- the cut-off conditions for the automatic exposure module to adjust the exposure parameters are as described above.
- the exposure parameter adjustment value obtained after the last adjustment of the exposure parameters by the automatic exposure module is sent to the AO module, and the AO module uses it to update the correspondence table between ambient light brightness and exposure parameters, as the exposure parameters matched in the correspondence table to the ambient light brightness detected by the ambient light sensor.
- in this way, each time the automatic exposure module executes the process of adjusting the exposure parameters of the camera, the correspondence table between ambient light brightness and exposure parameters is updated accordingly, and the updated correspondence table between ambient light brightness and exposure parameters is saved by the AO module to the memory.
- the memory here is the internal memory 421 shown in FIG. 4a described above.
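- The patent does not spell out the exact adjustment rule used by the automatic exposure module, only that the exposure parameters are adjusted according to the comparison between the image brightness and the standard brightness until their difference is within a preset value. The sketch below therefore uses a simple proportional rule as an assumption, purely to illustrate one adjustment step; all names and the preset tolerance are illustrative.

```python
def adjust_exposure(exposure_us, analog_gain, image_brightness,
                    standard_brightness, preset_diff=16):
    """One hedged adjustment step: if the image brightness is already within
    preset_diff of the standard brightness, keep the current parameters;
    otherwise scale the exposure duration toward the standard brightness."""
    diff = standard_brightness - image_brightness
    if abs(diff) < preset_diff:
        return exposure_us, analog_gain      # within tolerance: no adjustment
    ratio = standard_brightness / max(image_brightness, 1e-6)
    new_exposure = exposure_us * ratio       # brighten or darken via exposure time
    return new_exposure, analog_gain

print(adjust_exposure(2000, 2.0, 256, 512))  # image too dark: exposure doubled
```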
- the image captured by the camera running with common exposure parameters in a dark environment is the image 11 shown in FIG. 6 .
- the camera is also in a dark environment, but it runs and shoots with exposure parameters that match the brightness of the ambient light, and the captured image is the image 14 shown in FIG. 6 .
- the automatic exposure module acquires the image data of the image 14, and uses the image data of the image 14 to adjust the exposure parameters matched with the ambient light brightness configured by the camera through the processing flow proposed in the foregoing content to obtain the exposure parameter adjustment value.
- the image captured by the camera with the adjusted value of the exposure parameter is the image 15 that meets the brightness requirement.
- the image captured by the camera running with common exposure parameters in a bright environment with strong light is the image 21 shown in FIG. 6 .
- the camera is also in the bright environment with strong light, but it operates and shoots with exposure parameters matching the ambient light brightness, and the captured image is the image 24 shown in FIG. 6.
- the automatic exposure module acquires the image data of the image 24, and uses the image data of the image 24 to adjust the exposure parameters matched with the ambient light brightness configured by the camera through the processing flow proposed in the foregoing content to obtain the exposure parameter adjustment value.
- the image captured by the camera with the adjusted value of the exposure parameter is the image 25 that meets the brightness requirement.
- the AO module screens out, from the correspondence table between ambient light brightness and exposure parameters, the exposure parameters matching the ambient light brightness detected by the ambient light sensor, and configures them for the camera through the camera driver; this ensures that the image brightness of the images the camera captures while operating with these exposure parameters is relatively reasonable, neither too bright nor too dark.
- the automatic exposure module therefore does not need to adjust the exposure parameters repeatedly, which reduces the time consumed by the automatic exposure module in adjusting the exposure parameters of the camera.
- since the automatic exposure module quickly arrives at exposure parameters that meet the image brightness requirement, the AO module can judge whether the human eyes are watching the display screen using an image whose brightness meets the requirement, which ensures the accuracy of the AO module's judgment.
- another embodiment of the present application provides a controller, which may be understood as a processing unit of the processor shown in FIG. 4a.
- the controller includes an AO module, an automatic exposure module and a camera driver.
- the AO module, automatic exposure module and camera driver are all logic units of the controller, and their functions are as described above.
- the AO module is configured with a corresponding relationship table between ambient light brightness and standard brightness.
- the table of correspondences between ambient light brightness and standard brightness includes a plurality of ambient light brightness and standard brightness matching each ambient light brightness.
- the table of correspondences between ambient light brightness and standard brightness includes a plurality of ambient light brightness intervals and standard brightness that matches each ambient light brightness interval.
- the standard brightness is the target used when adjusting the exposure parameters of the camera, so that the image captured by the camera reaches the target brightness.
- the value range of the standard brightness is 0 to 1024. Under different ambient light levels the appropriate standard brightness differs slightly; therefore, a correspondence table between different ambient light brightness values and standard brightness is constructed.
- the way to generate the correspondence table between ambient light brightness and standard brightness is:
- the front camera of the electronic device is configured to take pictures while operating with multiple exposure parameters under one ambient light brightness; the images captured by the front camera while the human eyes are watching the display screen are collected under different ambient light brightness levels, so that multiple frames of sample images are obtained for each ambient light brightness.
- for each ambient light brightness, the multiple frames of sample images under that ambient light brightness are processed in the following manner to obtain the standard brightness matching that ambient light brightness.
- the AO module uses the image data of one frame of sample image to identify whether the human eyes in the sample image are watching the display screen. If the AO module recognizes that the human eyes in the image are not watching the display screen, the sample image is ignored and the next frame of sample image for that ambient light brightness is collected; if the AO module recognizes that the human eyes in the image are watching the display screen, the image brightness of the sample image is recorded and the next frame of sample image for that ambient light brightness is collected. In this way, the image brightness of multiple frames of sample images recorded under one ambient light brightness is obtained, and the standard brightness matching that ambient light brightness is calculated from these recorded image brightness values.
- in some embodiments, the median of the image brightness of the recorded multi-frame sample images is selected as the standard brightness matching the ambient light brightness; in other embodiments, the average of the image brightness of the recorded multi-frame sample images is calculated and used as the standard brightness matching the ambient light brightness; in still other embodiments, the image brightness of one frame of sample image randomly selected from the recorded multi-frame sample images is used as the standard brightness matching the ambient light brightness.
- the red (Red), green (Green), and blue (Blue) components of each pixel of the sample image are used: the average value of the red, green, and blue components over all pixels is calculated and taken as the image brightness of the sample image.
- Table 3 below provides a table of correspondence between ambient light brightness and standard brightness, and the correspondence shows the standard brightness matched by multiple ambient light brightness intervals.
- Ambient light brightness (lux) | Standard brightness (0–1024)
- less than 150 | 480
- 150–250 | 488
- 250–350 | 503
- 350–450 | 508
- 450–550 | 510
- 550–650 | 511
- 650–750 | 512
- 750–850 | 512
- 850–950 | 512
- 950–1050 | 512
- 1050–1300 | 513
- 1350–1550 | 515
- 1550–2000 | 520
- 2000–4000 | 525
- greater than 4000 | 532
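- Screening the standard brightness that matches a detected ambient light brightness is again an interval lookup. The sketch below encodes the rows of Table 3 directly; only the helper names are illustrative.

```python
# Rows of Table 3: (upper bound of the ambient light brightness interval in lux,
# standard brightness on the 0-1024 scale).
STANDARD_BRIGHTNESS_TABLE = [
    (150, 480), (250, 488), (350, 503), (450, 508), (550, 510),
    (650, 511), (750, 512), (850, 512), (950, 512), (1050, 512),
    (1300, 513), (1550, 515), (2000, 520), (4000, 525),
    (float("inf"), 532),
]

def lookup_standard_brightness(ambient_lux):
    """Screen out the standard brightness matching the detected ambient light brightness."""
    for upper, standard in STANDARD_BRIGHTNESS_TABLE:
        if ambient_lux < upper:
            return standard
    return STANDARD_BRIGHTNESS_TABLE[-1][1]

print(lookup_standard_brightness(300))  # -> 503
```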
- the AO module executes step S31 to obtain the ambient light brightness detected by the ambient light sensor and screens the correspondence table between ambient light brightness and standard brightness, screening out the standard brightness matching the ambient light brightness detected by the ambient light sensor; then, as shown in step S32, the AO module sends the screened-out standard brightness matching the ambient light brightness to the automatic exposure module.
- the camera runs with the configured initial exposure parameters, executes step S33, captures the face image of the user, and sends the face image.
- the simplest image front-end unit executes step S34, reads the face image, and stores the face image in the security buffer of the internal memory based on the security mechanism.
- the initial exposure parameters configured by the camera may be general exposure parameters set in advance; in other embodiments, the initial exposure parameters configured by the camera may also be as described in the first embodiment, which is Exposure parameter for ambient light brightness matching.
- the AO module executes step S35-1, acquires the image data of the face image stored in the security buffer of the internal memory, and then analyzes the image data to determine whether the user's eyes are watching the display screen.
- step S36 is executed to control the display screen of the electronic device not to turn off. The way the AO module analyzes the image data to determine whether the user's eyes are watching the display screen can be as described in Embodiment 3 below.
- the automatic exposure module is used to obtain the standard brightness from the AO module, and is also used to perform step S35-2: obtain the image data of the face image stored in the security buffer of the internal memory, calculate the image brightness of the face image by using the image data of the face image, compare the image brightness of the face image with the standard brightness screened out and sent by the AO module, and obtain a comparison result.
- the image data of the face image acquired by the automatic exposure module includes red (Red), green (Green), and blue (Blue) components of each pixel in the face image, and the automatic exposure module calculates each The average value of the red (Red), green (Green), and blue (Blue) components of the pixel is used as the image brightness of the face image.
- if the comparison result reflects that the difference between the image brightness of the face image and the standard brightness is less than the preset value, the automatic exposure module does not perform any operation, and the camera still operates with the initial exposure parameters. If the comparison result reflects that the difference between the image brightness of the face image and the standard brightness is not less than the preset value, the automatic exposure module adjusts, according to the comparison result, the exposure parameters configured for the camera to obtain an exposure parameter adjustment value.
- the automatic exposure module executes step S37, sends the adjusted value of the exposure parameter to the AO module, and after receiving the adjusted value of the exposure parameter, the AO module executes step S38, transmits the adjusted value of the exposure parameter to the camera driver.
- the camera driver executes step S39 , and configures the camera to run and capture images with the adjusted value of the exposure parameter.
- the automatic exposure module acquires a common standard brightness, and an image under the common standard brightness is the image 13 a shown in FIG. 8 .
- the image captured by the camera in a dark environment is the image 11 shown in FIG. 8.
- the automatic exposure module acquires the image data of the image 11 and compares it with the general standard brightness, which can be understood as comparing it with the image brightness of the image 13a; the exposure parameters of the camera are adjusted according to the comparison result, the camera successively captures the image 12a and the image 13a, and the automatic exposure module judges that when the image 13a is captured its image brightness meets the standard brightness requirement.
- however, the image brightness of the image 13a is not high, so providing this image to the AO module to judge whether the human eyes are watching the display screen very easily leads to misjudgment.
- if the standard brightness obtained by the automatic exposure module matches the ambient light brightness, the image under this standard brightness is the image 13b shown in FIG. 8. The automatic exposure module acquires the image data of the image 11, compares it with the standard brightness matching the ambient light brightness, that is, with the image brightness of the image 13b, and adjusts the exposure parameters of the camera according to the comparison result; the camera successively captures the image 12b and the image 13b, and the automatic exposure module judges that the image brightness of the image 13b meets the standard brightness requirement. As can be seen from FIG. 8, the image brightness of the image 13b is also relatively reasonable, so the accuracy with which the AO module uses this image to judge whether the human eyes are watching the display screen is high.
- in a bright environment with strong light, an image under the general standard brightness is the image 23a shown in FIG. 8, and an image under the standard brightness matching the ambient light brightness is the image 23b in FIG. 8. The image captured by the camera is the image 21. If the exposure parameters of the camera are adjusted according to the general standard brightness, the camera successively captures the image 22a and the image 23a, whose brightness becomes higher and higher; if the exposure parameters of the camera are adjusted according to the standard brightness matching the ambient light brightness, the camera successively captures the image 22b and the image 23b.
- the AO module screens out the standard brightness matching the ambient light brightness detected by the ambient light sensor from the correspondence table between the ambient light brightness and the standard brightness, and sends it to the automatic exposure module.
- the automatic exposure module adjusts the exposure parameters of the camera based on the standard brightness matching the ambient light brightness, so that the camera shoots with the exposure parameter adjustment value and obtains an image meeting the standard brightness requirement matching the ambient light brightness; this guarantees the image quality of the images captured by the camera, which improves the accuracy of the AO module in judging whether the human eyes in the image are watching the display screen.
- a further embodiment of the present application provides a controller, which may be understood as a processing unit of the processor shown in FIG. 4a.
- the controller includes an AO module, an automatic exposure module and a camera driver.
- the AO module, automatic exposure module and camera driver are all logic units of the controller, and their functions are as described above.
- the AO module is configured with a table of correspondences between ambient light brightness and confidence thresholds.
- the table of correspondences between ambient light brightness and confidence thresholds includes a plurality of ambient light brightness values and a confidence threshold matching each ambient light brightness.
- the table of correspondences between ambient light brightness and confidence thresholds includes a plurality of ambient light brightness intervals, and a matching confidence threshold for each ambient light brightness interval.
- the confidence thresholds matched to the dark light interval and the strong light interval are smaller than the confidence threshold under normal light.
- Table 4 below provides a corresponding relationship between the ambient light brightness and the confidence threshold.
- the correspondence shows the confidence thresholds matched to three ambient light brightness intervals, together with the original confidence threshold.
- the ambient light brightness interval below 10 lux is identified as the dark light interval, the ambient light brightness interval above 80,000 lux (written as 8W lux in Table 4) is identified as the strong light interval, and the ambient light brightness interval from 10 lux to 80,000 lux belongs to the normal light interval.
- Table 4 shows an example of the division method of the dark light interval, the strong light interval and the normal light interval, but the division method of the dark light interval, the strong light interval and the normal light interval is not limited to the specific values presented in Table 4.
- in Table 4, the matching original confidence threshold is the same as the confidence threshold of the 10 lux to 80,000 lux ambient light brightness interval, both being 0.95. Because the quality of the images captured by the camera under dark light and strong light is not high, and the images are underexposed or overexposed, using the same confidence threshold as under normal light is unreasonable. Therefore, the confidence threshold corresponding to the ambient light brightness interval below 10 lux and the interval above 80,000 lux is set to 0.9, which is lower than the confidence threshold of the 10 lux to 80,000 lux interval.
- Table 4 shows a confidence threshold corresponding to the three ambient light brightness intervals.
- the confidence threshold corresponding to the three ambient light brightness intervals is not limited to the values in Table 4, and can be adjusted according to actual conditions.
- the ambient light brightness intervals are not limited to the three ambient light brightness intervals provided in Table 4; the correspondence between ambient light brightness and confidence threshold may include more than three ambient light brightness intervals, with a matching confidence threshold for each ambient light brightness interval.
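- The interval-dependent confidence threshold can be sketched as below, using the dark/normal/strong division and the 0.90/0.95 values of Table 4; reading "8W lux" as 80,000 lux is an assumption here, and the interval bounds are parameters.

```python
def confidence_threshold(ambient_lux, dark_limit=10, strong_limit=80_000):
    """Confidence threshold matched to the ambient light brightness:
    a lowered threshold in the dark and strong light intervals,
    the original threshold under normal light (values follow Table 4)."""
    if ambient_lux < dark_limit or ambient_lux > strong_limit:
        return 0.90   # dark or strong light: lowered threshold
    return 0.95       # normal light: original threshold

print(confidence_threshold(5))    # -> 0.90
print(confidence_threshold(500))  # -> 0.95
```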
- the AO module executes step S41 to acquire the ambient light brightness detected by the ambient light sensor and screens, from the correspondence table between ambient light brightness and confidence threshold, the confidence threshold matching the ambient light brightness detected by the ambient light sensor.
- the AO module can also acquire the standard brightness, and executes step S42 to send the standard brightness to the automatic exposure module.
- the standard brightness obtained by the AO module may be a general standard brightness.
- the standard brightness obtained by the AO module may be the standard brightness matched with ambient light brightness as described in the second embodiment above.
- the camera runs with the configured initial exposure parameters, executes step S43, and captures the user's face image.
- the simplest image front-end unit executes step S44, reads the face image, and stores the face image in the security buffer of the internal memory based on the security mechanism.
- the initial exposure parameters configured by the camera may be general exposure parameters set in advance; in other embodiments, the initial exposure parameters configured by the camera may also be as described in the first embodiment, which is Exposure parameter for ambient light brightness matching.
- the AO module is also used to execute step S45-1 according to the storage order of the images, and sequentially acquire the image data of each frame of the face image in the image sequence stored in the security buffer of the internal memory.
- the AO module compares the image data of the face image with the sample feature library and, according to the comparison result of the image data of the face image and the sample feature library, configures a confidence level for the image.
- the confidence level is used to characterize the probability that the human eyes in the image are watching the display screen.
- the sample feature library includes feature data of human eyes gazing at images on the display screen.
- the method of determining the feature data is as follows: a large number of sample images are obtained, including sample images in which the human eyes are watching the display screen and sample images in which the human eyes are not watching the display screen, and learning is performed on the image data of each sample image to obtain feature data representing images in which the human eyes are watching the display screen.
- the AO module is also used to judge whether the confidence of the face image is less than the screening confidence threshold.
- the AO module executes step S46 shown in FIG. 9 , and judges that the confidence of the face image is not less than the confidence threshold, then determines that the eyes in the face image are watching the display screen, and then controls the display screen of the electronic device to not turn off the screen.
- the AO module judges that the confidence degree of the face image is less than the confidence threshold value, and then determines that the human eyes in the face image are not looking at the display screen.
- the automatic exposure module is used to obtain the standard brightness from the AO module, and is also used to perform step S45-2: obtain the image data of the face image stored in the security buffer of the internal memory, calculate the image brightness of the face image by using the image data of the face image, compare the image brightness of the face image with the standard brightness screened out and sent by the AO module, and obtain a comparison result.
- the image data of the face image acquired by the automatic exposure module includes red (Red), green (Green), and blue (Blue) components of each pixel in the face image, and the automatic exposure module calculates each The average value of the red (Red), green (Green), and blue (Blue) components of the pixel is used as the image brightness of the face image.
- if the comparison result reflects that the difference between the image brightness of the face image and the standard brightness is less than the preset value, the automatic exposure module does not perform any operation, and the camera still operates with the initial exposure parameters. If the comparison result reflects that the difference between the image brightness of the face image and the standard brightness is not less than the preset value, the automatic exposure module adjusts, according to the comparison result, the exposure parameters configured for the camera to obtain an exposure parameter adjustment value.
- the automatic exposure module executes step S47, sends the adjustment value of the exposure parameter to the AO module, and after the AO module receives the adjustment value of the exposure parameter, executes step S48, transmits the adjustment value of the exposure parameter to the camera driver.
- the camera driver executes step S49, and configures the camera to run and capture images with the exposure parameter adjustment value.
- if the AO module determines that the human eyes in one frame of the consecutive multi-frame images are watching the display screen, it controls the display screen of the electronic device not to turn off. This scheme can be combined with the screen-off time set for the display screen of the electronic device, and the combination is as follows.
- suppose the screen-off time of the electronic device is set to 15 seconds. Timing starts when the display screen of the electronic device begins to display data. After a specified time, for example 7 seconds, the front camera of the electronic device captures images, and the AO module acquires the images in sequence and performs the following operations:
- the image data of the image is compared with the sample feature library, and a confidence level is configured for the image according to the comparison result of the image data of the image and the sample feature library. It is then judged whether the configured confidence of the image is less than the screened-out confidence threshold: if the configured confidence of the image is not less than the confidence threshold, it is determined that the human eyes in the image are watching the display screen; if the configured confidence of the image is less than the confidence threshold, it is determined that the human eyes in the image are not watching the display screen.
- if the AO module determines that there is a frame of image in which the human eyes are watching the display screen, it controls the display screen not to turn off; if the AO module keeps determining that the human eyes in the images are not watching the display screen, it controls the display screen to turn off when the timer reaches 15 seconds.
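- A minimal sketch of this combination is given below. It assumes the per-frame gaze decisions arrive together with their capture times relative to the moment the display started showing data; the 15-second screen-off time and the 7-second starting point follow the example above, and all names are illustrative.

```python
def keep_screen_on(frame_results, screen_off_s=15, check_start_s=7):
    """frame_results is a list of (seconds_since_display_on, eyes_gazing)
    pairs in capture order; returns True if the display should stay on."""
    for elapsed, gazing in frame_results:
        if elapsed < check_start_s:
            continue                 # gaze checks only start after check_start_s
        if elapsed >= screen_off_s:
            break                    # timer expired: allow the screen to turn off
        if gazing:
            return True              # one gazing frame is enough to keep the screen on
    return False

# Example: frames captured at 8s..11s, the one at 10s shows the user gazing
print(keep_screen_on([(8, False), (9, False), (10, True), (11, False)]))  # True
```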
- the user watches the display screen of the electronic device under dark light, and the images sequentially captured by the camera under the dark environment are images 11 to 15 shown in FIG. 10 .
- the AO module is configured with an original confidence threshold of 0.95.
- the AO module acquires the image data of image 11, configures a confidence level of 0.89 for image 11, judges that the confidence level of 0.89 for image 11 is less than the confidence threshold value of 0.95, and determines that the human eyes in image 11 are not watching the display screen.
- the AO module acquires the image data of the image 12, configures the confidence level of the image 12 as 0.89, and judges that the confidence level of the image 12 is less than the confidence level threshold of 0.95, and also determines that the human eyes in the image 12 are not watching the display screen.
- the AO module configures the confidence levels of image 13, image 14, and image 15 as 0.89, 0.92, and 0.94 respectively, and judges that the confidence 0.89 of image 13, the confidence 0.92 of image 14, and the confidence 0.94 of image 15 are all less than the confidence threshold 0.95, so it determines that the human eyes in image 13, image 14, and image 15 are not watching the display screen.
- in the scheme of this embodiment, the AO module obtains the ambient light brightness detected by the ambient light sensor and uses the ambient light brightness to screen the correspondence table between ambient light brightness and confidence threshold; the screened-out confidence threshold is 0.90.
- the AO module again acquires the image data of image 11 and configures the confidence of image 11 as 0.89; since the confidence 0.89 of image 11 is less than the confidence threshold 0.90, it determines that the human eyes in image 11 are not watching the display screen. It then acquires the image data of image 12 and configures the confidence of image 12 as 0.89; since the confidence 0.89 of image 12 is also less than the confidence threshold 0.90, it likewise determines that the human eyes in image 12 are not watching the display screen.
- the AO module configures confidence levels of 0.89, 0.92, and 0.94 for image 13, image 14, and image 15 respectively, and judges that the confidence 0.89 of image 13 is less than the confidence threshold 0.90, while the confidence 0.92 of image 14 and the confidence 0.94 of image 15 are not less than the confidence threshold 0.90; it therefore determines that the human eyes in image 13 are not watching the display screen and that the human eyes in image 14 and image 15 are both watching the display screen.
- the AO module determines that the human eye in one frame of image is watching the display screen, and controls the display screen not to turn off.
- the above example shows the scene in which the user watches the display screen of the electronic device in a dark environment; for the scene in which the user watches the electronic device in a bright environment with strong light, if the AO module uses the original confidence threshold screened out from the correspondence table, the results of whether the human eyes in the images are watching the display screen are basically the same as in the dark light scene.
- in the scheme of this embodiment, the AO module can instead screen out the lower confidence threshold from the correspondence table between ambient light brightness and confidence threshold. In this way, the situation is avoided in which the confidence configured by the AO module is smaller than the threshold because the image brightness is insufficient or too high, which would cause misjudgment by the AO module and affect the detection accuracy of the AO module.
- the AO module is also used to execute a face detection event to assist in judging that the human eyes in the image are watching the display screen.
- in some embodiments, after the AO module judges that the confidence of the image is not less than the screened-out confidence threshold, it also executes a face detection event and judges whether the result of the face detection event is that a face is detected. If the result of the face detection event is that a face is detected, it is determined that the human eyes in the image are watching the display screen; if the result of the face detection event is that no face is detected, it is determined that the human eyes in the image are not watching the display screen.
- in other embodiments, the AO module executes the face detection event before judging whether the confidence of the image is less than the screened-out confidence threshold, and judges whether the result of the face detection event is that a face is detected. If a face is detected, the AO module then judges whether the confidence of the image is less than the screened-out confidence threshold; if the confidence of the image is not less than the screened-out confidence threshold, it is determined that the human eyes in the image are watching the display screen. If the result of the face detection event is that no face is detected, or the confidence of the image is judged to be less than the screened-out confidence threshold, it is determined that the human eyes in the image are not watching the display screen.
- in still other embodiments, the AO module judges whether the confidence of the image is less than the screened-out confidence threshold and, in parallel, executes the face detection event and judges whether its result is that a face is detected. If the AO module judges that the confidence of the image is not less than the screened-out confidence threshold and the result of the face detection event is that a face is detected, it is determined that the human eyes in the image are watching the display screen; otherwise, it is determined that the human eyes in the image are not watching the display screen.
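- The parallel variant reduces to a conjunction of the two judgments. The sketch below is illustrative only; the example values echo the Fig. 11 discussion later in this embodiment.

```python
def eyes_on_screen(confidence, threshold, face_detected):
    """Parallel combination: the eyes are judged to be watching the display
    only when the confidence reaches the matched threshold AND the face
    detection event reports a detected face."""
    return confidence >= threshold and face_detected

print(eyes_on_screen(0.94, 0.90, False))  # high confidence but no face -> False
print(eyes_on_screen(0.94, 0.90, True))   # both conditions met -> True
```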
- in some embodiments, the AO module realizes face detection through Haar-Like features and the Adaboost algorithm: Haar-Like features are used to represent the face, each Haar-Like feature is trained to obtain a weak classifier, multiple weak classifiers that best represent the face are selected by the Adaboost algorithm to construct a strong classifier, and several strong classifiers are connected in series to form a cascade structure of cascaded classifiers, that is, the face detector.
- the AO module extracts the Haar-Like feature of the image captured by the camera, calls the face detector to process the extracted Haar-Like feature, and obtains the face recognition result, which indicates whether the image captured by the camera contains a human face.
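- As one concrete illustration of a Haar-Like plus cascaded (Adaboost-trained) face detector, the sketch below uses OpenCV's pretrained frontal-face cascade. OpenCV is not named by the patent; the whole block is an assumption about one possible off-the-shelf realization of the same idea.

```python
import cv2

# Pretrained Haar cascade shipped with OpenCV (a cascade of boosted classifiers).
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def contains_face(bgr_image):
    """Return True if the cascade detector finds at least one face in the image."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) > 0
```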
- in other embodiments, the AO module realizes face detection through Multi-scale Block based Local Binary Patterns (MBLBP) features and the Adaboost algorithm. The MBLBP feature, which can represent the face image information of a reference frame and its 8 neighboring frames, is used to represent the face; the MBLBP feature is calculated by comparing the average gray level of the reference frame with the average gray levels of the surrounding 8 neighboring frames.
- the AO module extracts the MBLBP features of the image information of the reference frame and the 8 neighboring frames, calculating each MBLBP feature by comparing the average gray level of the reference frame with the average gray levels of the surrounding 8 neighboring frames, and calls the face detector to process the MBLBP features to obtain the face recognition result, which indicates whether the image captured by the camera contains a face.
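- The MBLBP computation described above (compare the average gray level of the reference frame with those of its 8 neighboring frames and pack the results into a binary pattern) can be sketched as follows; the block layout and names are illustrative.

```python
import numpy as np

def mblbp_code(gray, top, left, block):
    """MBLBP code of one 3x3 grid of blocks: compare the average gray level
    of the central (reference) block with that of its 8 neighboring blocks
    and pack the comparison results into an 8-bit pattern."""
    def block_mean(r, c):
        return gray[r:r + block, c:c + block].mean()
    center = block_mean(top + block, left + block)
    # neighbors in clockwise order starting from the top-left block
    offsets = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        neighbor = block_mean(top + dr * block, left + dc * block)
        code |= int(neighbor >= center) << bit
    return code

# Example on a synthetic 9x9 gray image divided into 3x3 blocks
img = np.arange(81, dtype=np.float64).reshape(9, 9)
print(mblbp_code(img, 0, 0, 3))
```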
- in still other embodiments, the AO module realizes face detection through Multi-scale Structured Ordinal Features (MSOF) and the Adaboost algorithm. The MSOF feature, which can represent the face image information of a reference frame and 8 neighboring frames, is used to represent the face; the distance between the 8 neighboring frames and the reference frame is adjustable, and the reference frame and the 8 neighboring frames may be non-adjacent.
- weak classifiers are obtained by training on MSOF features, multiple weak classifiers that best represent the face are selected by the Adaboost algorithm to form a strong classifier, and several strong classifiers are connected in series to form a cascaded classifier, that is, the face detector.
- the AO module extracts the MSOF features of the image information of the reference frame and 8 neighboring frames. Call the face detector to process the MSOF feature to get the face recognition result, which indicates whether the image captured by the camera contains a face.
- the user watches the display screen of the electronic device in a dark environment, and the images sequentially captured by the camera in a dark environment are images 11 to 15 shown in FIG. 11 .
- the AO module obtains the ambient light brightness detected by the ambient light detection sensor, uses the ambient light brightness to filter from the corresponding relationship table between the ambient light brightness and the confidence threshold value, and the selected confidence threshold value is 0.90.
- the AO module acquires the image data of image 11, configures the confidence level of image 11 as 0.89, judges that the confidence level of image 11 is 0.89 less than the confidence threshold value of 0.90, and determines that the human eyes in image 11 are not watching the display screen;
- the image data of image 12 is acquired, and the confidence degree of image 12 is configured as 0.89. It is judged that the confidence degree of image 12 is also less than the confidence threshold value of 0.90. It is also determined that the human eyes in image 12 are not watching the display screen.
- the AO module configures the confidence levels of image 13, image 14, and image 15 as 0.91, 0.92, and 0.94 respectively, and judges that the confidence 0.91 of image 13, the confidence 0.92 of image 14, and the confidence 0.94 of image 15 are all not less than the confidence threshold 0.90, so it determines that the human eyes in image 13, image 14, and image 15 are all watching the display screen.
- based on these five consecutive frames of images, the AO module determines that there are frames in which the human eyes are watching the display screen, and controls the display screen not to turn off. However, as can be seen from image 15 in FIG. 11, the user's eyes are not actually watching the display screen, so controlling the display screen not to turn off is a misjudgment.
- the results of the face detection events performed by the AO module on images 11 to 15 are all: no face is detected.
- the AO module judges that the confidence 0.89 of image 11 is less than the confidence threshold 0.90 and determines that the result of the face detection event is that no face is detected, so it determines that the human eyes in image 11 are not watching the display screen; similarly, the AO module also determines that the human eyes in image 12, image 13, image 14, and image 15 are not watching the display screen, and controls the display screen to turn off.
- by performing face detection events on the images captured by the camera to assist the detection of whether the human eyes in the image are watching the display screen, this method avoids the misjudgment that may occur when the AO module performs the gaze detection with a reduced confidence threshold.
- another embodiment of the present application provides a display control method, which is applied to an electronic device. The electronic device includes a processor, an ambient light sensor, a camera, and a display screen, and the processor includes the controller provided in Embodiment 1, Embodiment 2, or Embodiment 3 above.
- the display control method provided by this embodiment includes:
- the AO module acquires ambient light brightness detected by the ambient light sensor.
- the AO module screens from the table of correspondences between ambient light brightness and exposure parameters, and screens out exposure parameters that match the ambient light brightness detected by the ambient light sensor.
- Embodiment 1 presents the correspondence between ambient light brightness and exposure parameters in the form of a table, but this does not limit the form in which the correspondence between ambient light brightness and exposure parameters may be represented.
- the AO module screens the corresponding relationship table between the ambient light brightness and the standard brightness, and screens out the standard brightness matching the ambient light brightness detected by the ambient light sensor.
- Embodiment 2 presents the correspondence between ambient light brightness and standard brightness in the form of a table, but this does not limit the form in which the correspondence between ambient light brightness and standard brightness may be represented.
- the AO module screens the corresponding relationship table between the ambient light brightness and the confidence threshold value, and screens out the confidence threshold value matching the ambient light brightness detected by the ambient light sensor.
- Embodiment 3 presents the correspondence between ambient light brightness and confidence threshold in the form of a table, but this does not limit the form in which the correspondence between ambient light brightness and confidence threshold may be represented.
- Figure 12 shows one execution order of the three steps S1202a, S1202b, and S1202c, but the order shown in Figure 12 does not limit the execution order of these three steps; the three steps may also adopt other execution orders or be executed in parallel.
- the AO module sends the screened exposure parameters and standard brightness to the automatic exposure module.
- the AO module sends the screened exposure parameters to the camera driver.
- the camera driver configures the camera to run and capture images with exposure parameters that match the brightness of the ambient light, and store the captured images into the memory.
- after the camera stores the captured image in the memory, the automatic exposure module executes steps S1205 to S1209, and the AO module executes steps S1212 to S1217.
- the automatic exposure module and AO module can run in parallel without interfering with each other.
- the automatic exposure module executes the process of steps S1205 to S1209, which can be understood as a solution for adjusting the exposure parameters of the camera in the display control method.
- the automatic exposure module acquires the image data of the face image stored in the internal memory.
- the automatic exposure module calculates image brightness of the face image by using the image data of the face image.
- the automatic exposure module compares the image brightness of the face image with the standard brightness to obtain a comparison result.
- if the comparison result reflects that the difference between the image brightness of the face image and the standard brightness is less than the preset value, the automatic exposure module does not perform any operation, and the camera still operates with the exposure parameters screened out by the AO module.
- the automatic exposure module executes S1208 to adjust the exposure parameter according to the comparison result to obtain an exposure parameter adjustment value.
- the automatic exposure module sends the exposure parameter adjustment value to the AO module.
- the AO module sends an exposure parameter adjustment value to the camera driver.
- the camera driver configures the camera to run and capture images with the exposure parameter adjustment value, and save the captured images to the memory.
- the AO module acquires the image data of the face image stored in the internal memory.
- the AO module compares the image data of the face image with the sample feature library and, according to the comparison result of the image data of the face image and the sample feature library, configures a confidence level for the image; the confidence level is used to characterize the probability that the human eyes in the image are watching the display screen.
- the AO module judges whether the confidence degree of the face image is smaller than the screened confidence threshold value.
- the AO module judges that the confidence degree of the face image is not less than the confidence threshold value, and then determines that the eyes in the face image are watching the display screen.
- the AO module determines that the confidence of the face image is less than a confidence threshold, and then determines that the human eyes in the face image are not looking at the display screen.
- if the AO module determines that the human eyes in one frame of face image among the consecutive multi-frame images are watching the display screen, the AO module executes S1217 to control the display screen of the electronic device not to turn off.
- when the AO module determines that the human eyes in none of the consecutive multiple frames of images are watching the display screen, the AO module controls the display screen of the electronic device to turn off.
- the continuous multi-frame images are captured by the camera within a duration determined according to the screen-off time of the electronic device, and the method for determining the duration according to the screen-off time of the electronic device can refer to the content of Embodiment 3.
- the AO module can also execute a face detection event for each frame of the face image acquired, and judge whether the result of the face detection event is a detected face, and determine whether a frame is detected by the AO module.
- the result of the face detection event of the face image is that a face is detected, and the judgment result of step S1214 is that the confidence of the image is not less than the threshold value, then step S1215 is executed.
Abstract
Embodiments of this application provide a parameter adjustment method, a display control method, an electronic device, and a medium. The electronic device includes a front-facing camera and a display screen; the front-facing camera is configured to run and capture images, in response to a first instruction, with initial exposure parameters that match the ambient light brightness. The parameter adjustment method includes: while the display screen is displaying data, acquiring an image captured by the front-facing camera; calculating the image brightness of the image by using the image data of the image; and adjusting the initial exposure parameters according to the difference between the image brightness of the image and a standard brightness to obtain an exposure parameter adjustment value, which is configured to the front-facing camera so that, in response to a second instruction, the front-facing camera runs and captures images with the exposure parameter adjustment value. Because the front-facing camera is configured to run with initial exposure parameters matched to the ambient light brightness, the image brightness of the captured images is kept reasonable, the time spent adjusting the front-facing camera's exposure parameters is reduced, and the accuracy of judging whether the human eyes in an image are gazing at the display screen is improved.
Description
本申请要求于2021年8月9日提交中国专利局、申请号为202110908422.9、发明名称为“参数的调整方法、显示的控制方法、电子设备及介质”中国专利申请的优先权,其全部内容通过引用结合在本申请中。
本申请涉及图像处理技术领域,尤其涉及一种参数的调整方法、显示的控制方法、电子设备及介质。
用户通过电子设备查阅网页、新闻、文章,玩游戏或看视频时,用户会长时间注视电子设备的显示屏,为配合用户长时间注视显示屏,电子设备在检测到用户长时间注视显示屏后会执行对应事件,例如显示屏不熄屏、铃声音量降低等。
具体的,电子设备可以借助摄像头拍摄用户的人脸图像,并通过人脸图像来评判用户是否长时间注视显示屏。摄像头拍摄的图像质量高低,制约着电子设备确定用户人眼是否注视显示屏的准确性。为此,电子设备获取到摄像头拍摄的图像后,若判断出图像欠曝或过曝,会调整摄像头的曝光参数,以使得摄像头以合理的曝光参数拍摄出正常亮度图像。但是,电子设备调整摄像头的曝光参数耗时较长,导致图像中人眼注视显示屏的检测耗时长,且准确度低。
发明内容
本申请提供了一种参数的调整方法、显示的控制方法、电子设备及介质,目的在于减少电子设备调整摄像头的曝光参数耗时较长,进而减少图像中人眼注视显示屏的检测耗时长,且提高检测准确度。
为了实现上述目的,本申请提供了以下技术方案:
第一方面,本申请提供了一种参数的调整方法,应用于电子设备,该电子设备包括前置摄像头和显示屏,该前置摄像头被配置响应第一指令,以初始曝光参数运行拍摄图像,前置摄像头的初始曝光参数与环境光亮度匹配;本申请提供的参数的调整方法包括:在电子设备的显示屏显示数据时,获取前置摄像头拍摄的图像;利用图像的图像数据,计算图像的图像亮度;以图像的图像亮度与标准亮度的差值,调整前置摄像头的初始曝光参数,得到曝光参数调整值,该曝光参数调整值配置给前置摄像头,前置摄像头在接收到第二指令时,响应第二指令,以曝光参数调整值运行拍摄图像。
由上述内容可以看出:前置摄像头被配置以环境光亮度相匹配的初始曝光参数运行拍摄图像,能够保证摄像头以该初始曝光参数运行所拍摄图像的图像亮度相对合理,不过亮或过暗,如此,以图像的图像亮度与标准亮度的差值,调整前置摄像头的初始曝光参数的次数较少,不需要多次调整曝光参数,减少了调整前置摄像头的曝光参数的耗时。并且,较快的调整出符合图像亮度要求的曝光参数,保证了尽快以图像亮度满足要求的图像评判图像人眼是否注视显示屏,进而提高了评判图像中人眼注视显示屏的准确度。
在一个可能的实施方式中,在显示屏显示数据时,获取前置摄像头响应第一指令,以初始曝光参数运行拍摄的图像之前,还包括:利用环境光亮度,确定初始曝光参数。
在一个可能的实施方式中,利用环境光亮度,确定初始曝光参数,包括:基于环境光亮度与曝光参数的对应关系,确定与环境光亮度匹配的曝光参数。
在一个可能的实施方式中,环境光亮度和曝光参数的对应关系的生成方法,包括:获取多组样本图像,一组样本图像对应一个环境光亮度,且一组样本图像包括多个样本图像,每个样本图像对应一个曝光参数;利用多组样本图像对应的环境光亮度,以及多组样本图像中的人眼注视显示屏的样本图像对应的曝光参数,生成环境光亮度和曝光参数的对应关系。
在一个可能的实施方式中,环境光亮度和曝光参数的对应关系的生成方法,包括:获取多个历史曝光参数调整值,以及每个历史曝光参数调整值对应的环境光亮度,每个历史曝光参数调整值满足标准亮度的要求;利用多个历史曝光参数调整值,以及每个历史曝光参数调整值对应的环境光亮度,生成环境光亮度和曝光参数的对应关系。
在一个可能的实施方式中,以图像的图像亮度与标准亮度的差值,调整前置摄像头的初始曝光参数,得到曝光参数调整值之后,还包括:以曝光参数调整值更新环境光亮度和曝光参数的对应关系。
在一个可能的实施方式中,初始曝光参数包括:曝光时长、模拟增益和数字增益中的至少一个。
在一个可能的实施方式中,在以图像的图像亮度与标准亮度的差值,调整前置摄像头的初始曝光参数,得到曝光参数调整值之前,还包括:利用环境光亮度,确定标准亮度。
在本可能的实施方式中,标准亮度以环境光亮度确定,以标准亮度为基准,调整前置摄像头的曝光参数,前置摄像头以曝光参数调整值进行拍摄,得到满足环境光亮度的标准亮度要求的图像,保证了前置摄像头拍摄图像的图像质量,提高了评判图像中人眼是否注视显示屏的准确度。
在一个可能的实施方式中,利用环境光亮度,确定标准亮度,包括:基于环境光亮度与标准亮度的对应关系,确定与环境光亮度匹配的标准亮度。
在一个可能的实施方式中,环境光亮度与标准亮度的对应关系的生成方法,包括:获取多组样本图像,一组样本图像对应一个环境光亮度,且一组样本图像包括多个样本图像;利用多组样本图像对应的环境光亮度,以及多组样本图像中的人眼注视显示屏的样本图像的图像亮度,生成环境光亮度和标准亮度的对应关系。
在一个可能的实施方式中,环境光亮度与标准亮度的对应关系包括:多个环境光亮度区间和每个环境光亮度区间匹配的标准亮度。
在一个可能的实施方式中,利用图像的图像数据,计算图像的图像亮度,包括:获取图像包括的每个像素点的RGB分量;计算图像的每个像素点的RGB分量的均值,计算得到的均值作为图像的图像亮度。
在一个可能的实施方式中,在显示屏显示数据时,获取前置摄像头响应第一指令,以初始曝光参数运行拍摄的图像之后,还包括:根据图像的图像数据和样本特征库的比对结果,对图像配置置信度,置信度用于表征图像中人眼注视显示屏的概率;若图像的置信度不小于预设门限值,该预设门限值与环境光亮度匹配,则控制显示屏不熄屏。
在本可能的实施方式中,预设门限值与环境光亮度匹配,图像的置信度与预设门限值比对,保证了以图像的置信度相比较的预设门限值能够符合环境光亮度要求,保证了检测图像中人眼注视显示屏的检测准确性。
在一个可能的实施方式中,控制显示屏不熄屏之前,还包括:确定前置摄像头在预设时段内拍摄的图像中,存在一帧图像的置信度不小于预设门限值,预设时段利用显示屏被设定的屏幕熄灭时间确定。
在一个可能的实施方式中,还包括:若前置摄像头在预设时段内拍摄的图像的置信度,均小于预设门限值,则控制显示屏熄屏。
在一个可能的实施方式中,在显示屏显示数据时,获取前置摄像头响应第一指令,以初始曝光参数运行拍摄的图像之前,还包括:利用环境光亮度,确定预设门限值。
在一个可能的实施方式中,利用环境光亮度,确定预设门限值,包括:基于环境光亮度与门限值的对应关系,确定与环境光亮度匹配的门限值。
在一个可能的实施方式中,环境光亮度和门限值的对应关系包括:多个环境光亮度区间,以及每个环境光亮度区间匹配的门限值。
在一个可能的实施方式中,环境光亮度和门限值的对应关系中,强光区间匹配的门限值和弱光区间匹配的门限值,小于正常光区间匹配的门限值。
在本可能的实施方式中,暗环境或强光的亮环境下,摄像头拍摄图像的图像亮度过低或过高,图像欠曝或过曝,对图像配置的置信度会降低。对应的,在环境光亮度值和置信度门限值的对应关系表中,强光区间匹配的门限值和弱光区间匹配的门限值,小于正常光区间匹配的门限值,如此能够保证利用环境光亮度,从环境光亮度和门限值的对应关系中筛选出的数值较低的门限值,避免了由于图像亮度不够或过高导致配置的置信度小于门限值,导致图像中人眼注视显示屏的误判,提高了检测图像中人眼注视显示屏的准确度。
在一个可能的实施方式中,控制显示屏不熄屏之前,还包括:获取图像的人脸检测结果,确定人脸检测结果为检测到人脸。
在本可能的实施方式中,若图像的置信度不小于预设门限值,再获取图像的人脸检测结果,确定人脸检测结果为检测到人脸之后,控制显示屏不熄屏,避免了以环境光亮度匹配的预设门限值执行图像中人眼注视显示屏的误判的问题。
第二方面,本申请提供了一种显示的控制方法,应用于电子设备,电子设备包括前置摄像头和显示屏,显示的控制的方法包括:在显示屏显示数据时,获取前置摄像头拍摄的图像;根据图像的图像数据和样本特征库的比对结果,对图像配置置信度,置信度用于表征图像中人眼注视显示屏的概率;若图像的置信度不小于预设门限值,则确定图像中的人眼注视显示屏,预设门限值与环境光亮度匹配。
由上述内容可以看出:预设门限值与环境光亮度匹配,图像的置信度与预设门限值比对,保证了以图像的置信度相比较的预设门限值能够符合环境光亮度要求,保证了检测图像中人眼注视显示屏的检测准确性。
在一个可能的实施方式中,若图像的置信度不小于预设门限值,则确定图像中人眼注视显示屏之后,还包括:控制显示屏不熄屏。
在一个可能的实施方式中,控制显示屏不熄屏之前,还包括:确定前置摄像头在预设时段内拍摄的多帧图像中,存在一帧图像中的人眼注视显示屏,预设时段利用显示屏被设定的屏幕熄灭时间确定。
在一个可能的实施方式中,还包括:若图像的置信度小于预设门限值,则确定图像中人眼未注视显示屏。
在一个可能的实施方式中,若图像的置信度小于预设门限值,则确定图像中人眼未注视显示屏之后,还包括:若前置摄像头在预设时段内拍摄的多帧图像中的人眼均未注视显示屏,则控制显示屏熄屏,预设时段利用显示屏被设定的屏幕熄灭时间确定。
在一个可能的实施方式中,若图像的置信度不小于预设门限值,则确定图像中的人眼注视显示屏,预设门限值与环境光亮度匹配之前,还包括:利用环境光亮度,确定预设门限值。
在一个可能的实施方式中,利用环境光亮度,确定预设门限值,包括:基于环境光亮度与门限值的对应关系,确定与环境光亮度匹配的门限值。
在一个可能的实施方式中,环境光亮度和门限值的对应关系包括:多个环境光亮度区间,以及每个环境光亮度区间匹配的门限值。
在一个可能的实施方式中,环境光亮度和门限值的对应关系中,强光区间匹配的门限值和弱光区间匹配的门限值,小于正常光区间匹配的门限值。
在本可能的实施方式中,暗环境或强光的亮环境下,摄像头拍摄图像的图像亮度过低或过高,图像欠曝或过曝,对图像配置的置信度会降低。对应的,在环境光亮度值和置信度门限值的对应关系表中,强光区间匹配的门限值和弱光区间匹配的门限值,小于正常光区间匹配的门限值,如此能够保证利用环境光亮度,从环境光亮度和门限值的对应关系中筛选出的数值较低的门限值,避免了由于图像亮度不够或过高导致配置的置信度小于门限值,导致图像中人眼注视显示屏的误判,提高了检测图像中人眼注视显示屏的准确度。
在一个可能的实施方式中,前置摄像头的初始曝光参数与环境光亮度匹配。
在本可能的实施方式中,前置摄像头被配置以环境光亮度相匹配的初始曝光参数运行拍摄图像,能够保证摄像头以该初始曝光参数运行所拍摄图像的图像亮度相对合理,不过亮或过暗,如此,以图像的图像亮度与标准亮度的差值,调整前置摄像头的初始曝光参数的次数较少,不需要多次调整曝光参数,减少了调整前置摄像头的曝光参数的耗时。并且,较快的调整出符合图像亮度要求的曝光参数,保证了尽快以图像亮度满足要求的图像评判图像人眼是否注视显示屏,进而提高了评判图像中人眼注视显示屏的准确度。
在一个可能的实施方式中,在显示屏显示数据时,获取前置摄像头拍摄的图像之前,还包括:基于环境光亮度和曝光参数的对应关系,确定初始曝光参数。
在一个可能的实施方式中,环境光亮度和曝光参数的对应关系的生成方法,包括:获取多组样本图像,一组样本图像对应一个环境光亮度,且一组样本图像包括多个样本图像,每个样本图像对应一个曝光参数;利用多组样本图像对应的环境光亮度,以及多组样本图像中的人眼注视显示屏的样本图像对应的曝光参数,生成环境光亮度和曝光参数的对应关系。
在一个可能的实施方式中,环境光亮度和曝光参数的对应关系的生成方法,包括:获取多个历史曝光参数调整值,以及每个历史曝光参数调整值对应的环境光亮度,历史曝光参数调整值满足标准亮度要求;利用多个历史曝光参数调整值,以及每个历史曝光参数调整值对应的环境光亮度,生成环境光亮度和曝光参数的对应关系。
在一个可能的实施方式中,初始曝光参数包括:曝光时长、模拟增益和数字增益中的至少一个。
在一个可能的实施方式中,确定图像中的人眼注视显示屏之前,还包括:获取图像的人脸检测结果;确定人脸检测结果为检测到人脸。
在本可能的实施方式中,若图像的置信度不小于预设门限值,再获取图像的人脸检测结果,确定人脸检测结果为检测到人脸之后,确定图像中的人眼注视显示屏,避免了以环境光亮度匹配的预设门限值执行图像中人眼注视显示屏的误判的问题。
第三方面,本申请提供了一种电子设备,包括:显示屏、环境光检测器、前置摄像头、一个或多个处理器和存储有程序的存储器,其中:显示屏用于显示数据;环境光检测器用于检测环境光,得到环境光亮度;前置摄像头用于在显示屏显示数据时,以初始曝光参数运行拍摄图像;在存储器中的程序被一个或多个处理器执行时,使得电子设备执行如第一方面或者第一方面的任一个可能的实施方式的参数的调整方法,或者,如第二方面或者第二方面的任一个可能的实施方式的显示的控制方法。
第四方面,本申请提供了一种可读存储介质,其上存储有计算机程序,其中,计算机程序被处理器执行时实现如第一方面或者第一方面的任一个可能的实施方式的参数的调整方法,或者,如第二方面或者第二方面的任一个可能的实施方式的显示的控制方法。
图1为本申请提供的电子设备的一种应用场景图;
图2a为本申请提供的电子设备的结构示意图;
图2b为本申请提供的电子设备中的逻辑单元的运行过程展示图;
图3为本申请提供的摄像头拍摄的多帧图像的展示图;
图4a为本申请实施例提供的一种电子设备的结构示意图;
图4b为本申请实施例提供的一种电子设备的软件结构示例图;
图5为本申请实施例一提供的一种电子设备的结构示意图;
图6为本申请实施例一提供的摄像头拍摄的多帧图像的展示图;
图7为本申请实施例二提供的一种电子设备的结构示意图;
图8为本申请实施例二提供的摄像头拍摄的多帧图像的展示图;
图9为本申请实施例三提供的一种电子设备的结构示意图;
图10为本申请实施例三提供的摄像头拍摄的多帧图像的展示图;
图11为本申请实施例三提供的摄像头拍摄的多帧图像的展示图;
图12为本申请实施例四提供的显示的控制方法的流程图。
本申请说明书和权利要求书及附图说明中的术语“第一”、“第二”和“第三”等是用于区别不同对象,而不是用于限定特定顺序。
在本申请实施例中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请实施例中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念。
用户可以通过电子设备查阅网页、新闻、文章等,也可以通过电子设备玩游戏,看视频。用户通过电子设备查阅网页、新闻、文章,玩游戏或看视频时,用户会长时间注视电子设备的显示屏,为配合用户长时间注视显示屏,电子设备在检测到用户长时间注视显示屏后会执行对应事件,例如显示屏不熄屏、铃声音量降低等。
图1展示了用户通过电子设备查阅网页的场景,以下以该场景为例对电子设备检测用户长时间注视显示屏,执行对应事件的方案进行介绍。
参见图2a,最简图像前端(Image Front End lit,IFE lit)单元,是图像信号处理器中的集成单元,摄像头输出的图像会到达IFE lit集成单元,由其将摄像头输出的图像存储于内存的安全buffer。
自动曝光模块属于控制器的一个逻辑单元,由控制器运行自动曝光(automatic exposure,AE)算法得到。
AO(always on)模块也属于控制器的一个逻辑单元,由控制器运行AO(always on)方案得到。AO方案,是指基于AO camera(always on camera)实现的智慧感知的解决方案,通常包含人眼注视识别、机主识别、手势识别等功能,典型特征为长时间低功耗运行。
摄像头驱动也属于控制器的一个逻辑单元,用于对摄像头配置参数,打开或关闭摄像头。
电子设备的显示屏显示网页,用户注视电子设备的显示屏查阅网页。如图2a所示,电子设备发送指令,电子设备的前置摄像头响应指令后运行,执行步骤S1、拍摄用户的人脸图像。最简图像前端单元执行步骤S2读取人脸图像,基于安全机制将人脸图像存储于内存的安全buffer。AO模块执行步骤S3-1、获取内存的安全buffer存储的人脸图像的图像数据,通过分析图像数据,确定用户人眼是否注视显示屏。在AO模块确定用户人眼注视显示屏时,执行步骤S4、控制电子设备的显示屏不熄屏。
摄像头拍摄的人脸图像的图像质量高低,制约着AO模块确定用户人眼是否注视显示屏的准确性。尤其是摄像头拍摄的人脸图像的图像亮度较高或者较低时,AO模块确定用户人眼是否注视显示屏的误差较大。为此,图2a中,自动曝光模块按照步骤S3-2所示,获取内存存储的人脸图像的图像数据;利用图像数据,计算人脸图像的图像亮度,将计算得到的图像亮度与标准亮度做比对,得到比对结果;依据比对结果调整摄像头的曝光参数,一般为曝光时长和增益,得到曝光时长调整值和增益调整值。自动曝光模块还执行步骤S5、将计算得到的曝光时长调整值和增益调整值,传输至AO模块,AO模块再按照图2a中步骤S6所示,将曝光时长调整值和增益调整值发送至摄像头驱动,摄像头驱动按照图2a中步骤S7所示,配置摄像头以曝光时长调整值和增益调整值运行。电子设备可再发送指令,摄像头响应电子设备的指令,以曝光时长调整值和增益调整值运行拍摄图像。
以下结合图2b,对AO模块分析图像数据,确定用户人眼是否注视显示屏,以及自动曝光模块调整摄像头的曝光参数的具体方式进行说明。
参见图2b,图像序列包括摄像头拍摄的多帧图像,比如图像帧1,2,3,4…n,其中,摄像头开始以通用的曝光时长和增益运行。通常来讲,通用的曝光时长和增益可以是预先设定的。自动曝光模块按照图像的存储顺序,依次获取图像序列的每一帧图像的图像数据。针对第一帧图像(也称图像帧1),自动曝光模块利用图像帧1的图像数据,计算图像帧1的图像亮度,将图像帧1的图像亮度与标准亮度做比对,得到比对结果。若比对结果反映出图像帧1的图像亮度与标准亮度的差值小于预设值(如±10%),自动曝光模块则不执行操作,摄像头还是以原始曝光时长和增益运行,该原始曝光时长和增益是指前述通用的曝光时长和增益。若比对结果反映出图像帧1的图像亮度与标准亮度的差值不小于预设值,自动曝光模块依据比对结果调整摄像头的曝光时长和增益,得到曝光时长1调值和增益1调值。自动曝光模块将曝光时长1调值和增益1调值,经AO模块向摄像头驱动传输。摄像头驱动配置摄像头以曝光时长1调值和增益1调值运行拍摄图像。
受自动曝光模块和摄像头驱动执行一次流程,滞后摄像头的拍摄一帧图像的影响,假设图像帧2和图像帧3,是摄像头以原始曝光时长和增益运行拍摄得到,自动曝光模块按照上述处理方式,利用图像帧2和图像帧3的图像数据,计算得到曝光时长1调值和增益1调值;摄像头驱动配置摄像头同样以曝光时长1调值和增益1调值运行拍摄图像。图像帧4为摄像头被配置以曝光时长1调值和增益1调值拍摄得到。自动曝光模块也采样上述处理方式,利用图像帧4的图像数据,计算得到曝光时长2调值和增益2调值;摄像头驱动配置摄像头以曝光时长2调值和增益2调值运行拍摄图像。如此反复,直到自动曝光模块比对出图像帧的图像亮度和标准亮度的差值小于预设值,如±10%停止。
AO模块也按照图像的存储顺序,依次获取图像序列的每一帧图像的图像数据。针对AO模块获取的每一帧图像的图像数据,AO模块均执行下述流程,得到每一帧图像中人眼注视显示屏的判断结果。以下以AO模块处理图像帧1的图像数据为例说明。
AO模块比对图像帧1的图像数据和样本特征库,根据图像帧1的图像数据和样本特征库的比对结果,对图像帧1配置置信度,该置信度用于表征图像帧1中的人眼注视显示屏的概率。AO模块判断图像帧1的置信度是否小于门限值,图像帧1的置信度不小于门限值,则确定图像帧1中的人眼注视显示屏,图像帧1的置信度小于门限值,则确定图像帧1中的人眼未注视显示屏。
一些实施例中,样本特征库中包括人眼注视显示屏图像的特征数据。该特征数据的确定方式为:获取大量的样本图像,样本图像包括人眼注视显示屏的样本图像和人眼未注视显示屏的样本图像,利用每一个样本图像的图像数据进行学习,得到表征人眼注视显示屏图像的特征数据。人眼注视显示屏的样本图像和人眼未注视显示屏的样本图像,均指代电子设备的前置摄像头拍摄人脸图像。
AO模块确定一定时长内的图像存在一帧图像的人眼注视显示屏,则执行控制电子设备的显示屏不熄屏、铃声音量降低等对应事件。
前述提出的AO模块分析图像数据,确定用户人眼是否注视显示屏,以及自动曝光模块调整摄像头的曝光参数存在下述问题:
问题一、自动曝光模块调整摄像头的曝光参数耗时较长,AO模块确定图像中人眼注视显示屏的检测准确度被降低。
用户处于暗环境或强光的亮环境使用电子设备,电子设备的摄像头在暗环境或者强光的亮环境下拍摄图像的图像亮度也会过暗或过亮。如图3所示,图像11为摄像头处于暗环境下,以通用的曝光时长和增益运行拍摄得到,图像21为摄像头处于强光的亮环境下,以通用的曝光时长和增益运行拍摄得到。自动曝光模块获取图像11的图像数据,利用图像11的图像数据,经前述图2b提出的处理流程,调整摄像头的通用的曝光时长和增益,得到曝光时长调整值和增益调整值。摄像头以曝光时长调整值和增益调整值拍摄的图像为图像12。自动曝光模块获取图像12的图像数据,再进一步调整摄像头的曝光时长和增益,如此反复,直至自动曝光模块利用图像14的图像数据,调整得到摄像头的曝光时长和增益提供于摄像头,摄像头再拍摄的图像15的图像亮度满足要求。同理,按照自动曝光模块调整后的曝光时长和增益,摄像头依次拍摄得到图像22至图像25,图像25的图像亮度满足要求。
进一步的,自动曝光模块是以较小步长缩短图像的图像亮度和标准亮度的差值,并以此调整曝光时长和增益,如此就导致自动曝光模块获取图像的图像亮度过亮或过暗,均会反复多次调整曝光时长和增益,才能保证摄像头拍摄图像的图像亮度满足要求。随之带来了自动曝光模块调整摄像头的曝光参数耗时较长的问题。
自动曝光模块反复多次调整曝光时长和增益,直到合理的曝光时长和增益提供摄像头拍摄出满足要求的图像的过程中,摄像头拍摄的每一帧图像,AO模块均会获取图像数据,以评判图像中的人眼是否注视显示屏。但是,图像的图像亮度不满足要求,AO模块确定图像中人眼是否注视显示屏的准确度会降低。
问题二、标准亮度设置不合理,导致自动曝光调整模块不能准确的调整出曝光时长和增益,影响摄像头的拍摄图像的图像质量,降低AO模块确定图像中人眼注视显示屏的检测准确度。
电子设备处于不同亮度的环境,自动曝光调整模块均以一个通用的标准亮度来评判图像的图像亮度是否满足要求,在不满足要求时以标准亮度为基准,调整曝光时长和增益。通用的标准亮度本身不能保证摄像头在不同亮度环境下的拍摄图像的亮度合理,以该标准亮度为基础进行曝光时长和参数的调整,就会导致自动曝光调整模块不能准确调整的曝光时长和增益。曝光时长和增益无法被准确调整,摄像头以调整后的曝光时长和增益运行而拍摄图像的图像质量降低,AO模块以图像质量不高的图像来执行确定图像中人眼注视显示屏的操作,准确度必然会不高。
问题三、置信度的门限值设置不合理,导致AO模块以门限值判断欠曝图像或过曝图像中人眼是否注视显示屏的检测准确度不高。
暗环境或强光的亮环境下,摄像头拍摄图像的图像亮度过低或过高,图像欠曝或过曝,AO模块对图像配置的置信度会降低,如此,会导致即便图像中人眼注视显示屏时,AO模块由于图像亮度不够或过高导致配置的置信度小于门限值,如此会导致AO模块的误判,影响AO模块的检测准确度。
基于上述技术方案中存在的问题,本申请通过下述三个实施例提出三种方案。
本申请实施例提供的三种方案,均可以适用于手机,平板电脑,桌面型、膝上型、笔记本电脑,超级移动个人计算机(Ultra-mobile Personal Computer,UMPC),手持计算机, 上网本,个人数字助理(Personal Digital Assistant,PDA),可穿戴电子设备,智能手表等电子设备。
图4a为本申请实施例提供的一种电子设备的组成示例。以手机为例,电子设备400可以包括处理器410,外部存储器接口420,内部存储器421,显示屏430,摄像头440,天线1,天线2,移动通信模块450,无线通信模块460,以及环境光传感器470等。
可以理解的是,本实施例示意的结构并不构成对该电子设备的具体限定。在另一些实施例中,该电子设备可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器410可以包括一个或多个处理单元,例如:处理器410可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备400的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
视频编解码器用于对数字视频压缩或解压缩。电子设备可以支持一种或多种视频编解码器。这样,电子设备可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
处理器410中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器410中的存储器为高速缓冲存储器。该存储器可以保存处理器410刚用过或循环使用的指令或数据。如果处理器410需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器410的等待时间,因而提高了系统的效率。
在一些实施例中,处理器410可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器410与显示屏430,摄像头440,无线通信模块460等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
可以理解的是,本实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备400的结构限定。在本申请另一些实施例中,电子设备400也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
外部存储器接口420可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备的存储能力。外部存储卡通过外部存储器接口420与处理器410通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器421可以用于存储计算机可执行程序代码,可执行程序代码包括指令。处理器410通过运行存储在内部存储器421的指令,从而执行电子设备400的各种功能应用以及数据处理。内部存储器421可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器421可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。处理器410通过运行存储在内部存储器421的指令,和/或存储在设置于处理器中的存储器的指令,执行电子设备的各种功能应用以及数据处理。
电子设备通过GPU,显示屏430,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏430和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器410可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏430用于显示图像,视频等。显示屏430包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oled,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备可以包括1个或N个显示屏430,N为大于1的正整数。
电子设备的显示屏430上可以显示一系列图形用户界面(graphical user interface,GUI),这些GUI都是该电子设备的主屏幕。一般来说,电子设备的显示屏430的尺寸是固定的,只能在该电子设备的显示屏430中显示有限的控件。控件是一种GUI元素,它是一种软件组件,包含在应用程序中,控制着该应用程序处理的所有数据以及关于这些数据的交互操作,用户可以通过直接操作(direct manipulation)来与控件交互,从而对应用程序的有关信息进行读取或者编辑。一般而言,控件可以包括图标、按钮、菜单、选项卡、文本框、对话框、状态栏、导航栏、Widget等可视的界面元素。例如,在本申请实施例中,显示屏430可以显示虚拟按键(一键编排、开始编排、场景编排)。
电子设备可以通过ISP,摄像头440,视频编解码器,GPU,显示屏430以及应用处理器等实现拍摄功能。
ISP用于处理摄像头440反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头440中。
摄像头440包含镜头和感光元件(也为图像传感器)。摄像头440用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备可以包括1个或N个摄像头440,N为大于1的正整数。
DSP用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
电子设备的无线通信功能可以通过天线1,天线2,移动通信模块450,无线通信模块460,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块450可以提供应用在电子设备上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块450可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块450可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块450还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块450的至少部分功能模块可以被设置于处理器410中。在一些实施例中,移动通信模块450的至少部分功能模块可以与处理器410的至少部分模块被设置在同一个器件中。
无线通信模块460可以提供应用在电子设备上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块460可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块460经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器410。无线通信模块460还可以从处理器410接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
环境光传感器470用于感知环境光亮度。电子设备可以根据感知的环境光亮度自适应调节显示屏430亮度。环境光传感器470也可用于拍照时自动调节白平衡。环境光传感器470还可以与接近光传感器配合,检测电子设备是否在口袋里,以防误触。
另外,在上述部件之上,运行有操作系统。例如鸿蒙系统,iOS操作系统,Android操作系统,Windows操作系统等。在操作系统上可以安装运行应用程序。
图4b是本申请实施例的电子设备的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。如图4b所示,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,显示,音乐,铃声,以及短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。如图4b所示,应用程序框架层可以包括窗口管理器,内容提供器,电话管理器,资源管理器,通知管理器,视图系统等,在本申请的一些实施例中,应用程序框架层还可以包括感知服务。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
电话管理器用于提供电子设备的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
感知服务用于执行前述提出的AO方案。感知服务执行AO方案过程中,若检测到一帧图像中人眼注视显示屏,则控制应用程序层的显示采用不熄屏方式显示,铃声在需要输出响铃时,以降低音量的方式。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。在本申请一些实施例中,应用冷启动会在Android runtime中运行,Android runtime由此获取到应用的优化文件状态参数,进而Android runtime可以通过优化文件状态参数判断优化文件是否因系统升级而导致过时,并将判断结果返回给应用管控模块。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),二维图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染、合成和图层处理等。
二维图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。摄像头驱动用于对摄像头配置参数,打开或关闭摄像头。
需要说明的是,本申请实施例虽然以Android系统为例进行说明,但是其基本原理同样适用于基于鸿蒙、iOS、Windows等操作系统的电子设备。
实施例一
为解决前述问题一,本申请实施例提供了一种控制器,控制器可以理解成图4a展示的处理器的一个处理单元。参见图5,控制器包括AO模块、自动曝光模块和摄像头驱动。AO模块、自动曝光模块和摄像头驱动均为控制器的逻辑单元,功能如前述内容。
本实施例中,AO模块被配置有环境光亮度与曝光参数的对应关系表。在一些实施例中,环境光亮度与曝光参数的对应关系表,包括多个环境光亮度和每个环境光亮度匹配的曝光参数。在另一些实施例中,环境光亮度与曝光参数的对应关系表包括多个环境光亮度区间和每个环境光亮度区间匹配的曝光参数。
在一些实施例中,曝光参数包括:曝光时长和增益中的至少一个,因增益一般包括数字增益和模拟增益,因此,曝光参数可以包括曝光时长、数字增益和模拟增益的至少一个。
曝光时长是指快门速度,即按下快门的时间。曝光时长越长,光子到图像传感器表面的光子总和越多,摄像头拍摄的图像就会越亮,反之图像则越暗。但是如果曝光过度,则图像过亮,失去图像细节;如果曝光不足,则图像过暗,同样会失去图像细节。
模拟增益和数字增益是指经过双采样之后的信号的放大增益,用于控制图像传感器采集数据,模拟增益和数字增益越大,则摄像头拍摄的图像越亮,模拟增益和数字增益越小,则摄像头拍摄的图像越暗。
在一些实施例中,环境光亮度与曝光参数的对应关系表的生成方式为:
电子设备的前置摄像头在一个环境光亮度下被配置以多种曝光参数运行拍照,采集前置摄像头在不同环境光亮度下拍摄的人眼注视显示屏时的图像,得到不同环境光亮度下的多帧样本图像。
针对每一个环境光亮度,按照下述方式处理该环境光亮度下的多帧样本图像,得到该环境光亮度匹配的曝光参数。
AO模块利用一帧样本图像的图像数据,识别样本图像中人眼是否注视显示屏。若AO模块识别图像中人眼未注视显示屏,则忽略该样本图像,采集该环境光亮度的下一帧样本图像;若AO模块识别图像中人眼注视显示屏,则记录该样本图像对应的曝光参数,并采集该环境光亮度的下一帧样本图像。如此,得到一个环境光亮度下记录的多帧样本图像对应的曝光参数,利用一个环境光亮度下记录的多帧样本图像对应的曝光参数,计算得到该环境光亮度匹配的曝光参数。
一些实施例中,选择记录的多帧样本图像对应的曝光参数的中间值作为该环境光亮度匹配的曝光参数;另一些实施例中,计算记录的多帧样本图像对应的曝光参数的平均值,作为该环境光亮度匹配的曝光参数;另一些实施例中,记录的多帧样本图像中随机选择一帧样本图像,将选择的样本图像对应的曝光参数,作为该环境光亮度匹配的曝光参数。
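As a rough illustration of how the gaze-positive sample frames could be aggregated into a per-ambient-brightness exposure parameter (using the median option mentioned above), consider the following sketch; the data layout and field ordering are assumptions for illustration only, not something defined by this application:

```python
from statistics import median

def build_lux_to_exposure_table(samples_by_lux):
    """samples_by_lux: {lux: [(gazing, exposure_time_us, analog_gain), ...]}.
    Returns {lux: (exposure_time_us, analog_gain)} computed only from frames
    in which the eyes were recognized as gazing at the display screen."""
    table = {}
    for lux, samples in samples_by_lux.items():
        # Keep only the exposure parameters of the gaze-positive sample frames.
        kept = [(t, g) for gazing, t, g in samples if gazing]
        if not kept:
            continue  # no usable sample frame for this ambient brightness
        # Median of the recorded values; the mean or a random pick are the
        # other two aggregation options described above.
        table[lux] = (median(t for t, _ in kept), median(g for _, g in kept))
    return table
```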
在另一些实施例中,环境光亮度与曝光参数的对应关系表的生成方式为:
记录自动曝光模块执行调整摄像头的曝光参数流程时,与标准亮度差值满足预设值的图像亮度而对应的调整后的曝光参数,并且,还记录自动曝光模块执行调整摄像头的曝光参数流程对应的环境光亮度。利用记录的每一个环境光亮度,以及自动曝光模块调整后的曝光参数,构建环境光亮度与曝光参数的对应关系表。
下表1提供一种环境光亮度与曝光参数的对应关系表的示例。本示例展示的对应关系表包括:多个环境光亮度范围,以及每个环境光亮度范围匹配的曝光时长和模拟增益。
表1
环境光亮度(lux) | 模拟增益(0~1023) | 曝光时长(μs) |
---|---|---|
小于150 | 891 | 4025 |
150-250 | 835 | 4025 |
250~350 | 694 | 4025 |
350-450 | 619 | 4025 |
450-550 | 300 | 3618 |
550-650 | 300 | 3522 |
650-750 | 300 | 2599 |
750-850 | 300 | 2534 |
850-950 | 300 | 2206 |
950-1050 | 300 | 1566 |
1050-1300 | 300 | 1326 |
1350-1550 | 300 | 949 |
1550-2000 | 300 | 599 |
2000-4000 | 300 | 487 |
大于4000 | 300 | 85 |
在环境光亮度较小时,如表1中展示的前五行,环境光亮度对应的曝光时长为4025us,属于最大曝光时长。因此,表1提供的环境光亮度与曝光参数的对应关系表属于快门优先,即在环境光亮度较低时,优先设定最大曝光时长。
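For illustration, the shutter-priority mapping of Table 1 could be looked up as in the following sketch; the handling of interval boundaries (and of the 1300–1350 lux gap in the table) is a simplifying assumption:

```python
# (upper bound of the ambient-light interval in lux, analog gain 0-1023,
#  exposure time in μs), taken from Table 1; the last row covers ">4000 lux".
TABLE_1 = [
    (150, 891, 4025), (250, 835, 4025), (350, 694, 4025), (450, 619, 4025),
    (550, 300, 3618), (650, 300, 3522), (750, 300, 2599), (850, 300, 2534),
    (950, 300, 2206), (1050, 300, 1566), (1300, 300, 1326), (1550, 300, 949),
    (2000, 300, 599), (4000, 300, 487), (float("inf"), 300, 85),
]

def initial_exposure_for(ambient_lux):
    """Return (analog_gain, exposure_time_us) matched to the ambient brightness."""
    for upper_bound, gain, exposure_us in TABLE_1:
        if ambient_lux < upper_bound:
            return gain, exposure_us
    return TABLE_1[-1][1], TABLE_1[-1][2]  # defensive fallback, not normally reached
```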
下表2提供了环境光亮度与曝光参数的对应关系表的另一种示例。
本示例展示的对应关系表也包括:多个环境光亮度范围,以及每个环境光亮度范围匹配的曝光时长和模拟增益。
表2
环境光亮度(lux) | 模拟增益(0~1023) | 曝光时长(μs) |
---|---|---|
小于150 | 1023 | 3200 |
150-250 | 1023 | 3165 |
250~350 | 1023 | 3082 |
350-450 | 1023 | 2954 |
450-550 | 1023 | 2726 |
550-650 | 890 | 2034 |
650-750 | 827 | 1890 |
750-850 | 670 | 1679 |
850-950 | 512 | 1432 |
950-1050 | 425 | 1428 |
1050-1300 | 300 | 1326 |
1350-1550 | 300 | 949 |
1550-2000 | 300 | 599 |
2000-4000 | 300 | 487 |
大于4000 | 300 | 85 |
表2展示的环境光亮度与曝光参数的对应关系表属于增益优先,同样由前五行的数据可以看出:在环境光亮度较低时,优先设定最大模拟增益。
需要说明的是，上表1和表2均是环境光亮度与曝光时长、模拟增益的对应关系表，但这并不构成对环境光亮度与曝光参数的对应关系表的限定。在摄像头支持数字增益调整时，环境光亮度与曝光参数的对应关系表，可以包括环境光亮度，与曝光时长、模拟增益和数字增益的对应关系。
本实施例中,AO模块如图5所示,执行步骤S11、获取环境光传感器检测的环境光亮度,并从环境光亮度与曝光参数的对应关系表进行筛查,筛查出环境光传感器检测的环境光亮度匹配的曝光参数,再如步骤S12所示,向摄像头驱动发送环境光亮度匹配的曝光参数。在一些实施例中,AO模块还可以获取标准亮度,并执行步骤S13、向自动曝光模块发送标准亮度和环境光亮度匹配的曝光参数。
摄像头驱动接收到环境光亮度匹配的曝光参数后,执行步骤S14、配置摄像头以环境光亮度匹配的曝光参数运行。摄像头以环境光亮度匹配的曝光参数运行,执行步骤S15、拍摄用户的人脸图像,并发送拍摄的人脸图像。最简图像前端单元执行步骤S16、读取人脸图像,基于安全机制将人脸图像存储于内存的安全buffer。
可以理解的是,AO模块筛选出的环境光亮度匹配的曝光参数,也可以称之为初始曝光参数。摄像头以该初始曝光参数运行,按照摄像头的拍摄频率拍摄人脸图像,可以得到人脸图像。通常情况下,摄像头以初始曝光参数运行,可以拍摄出多帧人脸图像。
AO模块执行步骤S17-1、获取内存的安全buffer存储的人脸图像的图像数据,再通过分析图像数据,确定用户人眼是否注视显示屏。AO模块确定用户人眼注视显示屏时,执行步骤S18、控制电子设备的显示屏不熄屏。AO模块分析图像数据,确定用户人眼是否注视显示屏的方式,可如下述实施例三的内容。
自动曝光模块执行步骤S17-2、获取内存的安全buffer存储的人脸图像的图像数据,再利用人脸图像的图像数据,计算人脸图像的图像亮度;将人脸图像的图像亮度与标准亮度做比对,得到比对结果。在一些实施例中,自动曝光模块获取的人脸图像的图像数据包括人脸图像包括的每个像素点的红(Red)、绿(Green)、蓝(Blue)分量,自动曝光模块计算每个像素点的红(Red)、绿(Green)、蓝(Blue)分量的平均值,作为人脸图像的图像亮度。
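A minimal sketch of the brightness computation just described (the mean of every pixel's R, G and B components), assuming the frame is available as an 8-bit RGB NumPy array:

```python
import numpy as np

def image_brightness(rgb_image):
    """rgb_image: H x W x 3 array of R, G, B components (e.g. uint8).
    The image brightness is the mean over all pixels' RGB components,
    which equals the mean of the per-pixel RGB averages."""
    return float(np.asarray(rgb_image, dtype=np.float32).mean())
```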
若比对结果反映出人脸图像的图像亮度与标准亮度的差值小于预设值,自动曝光模块则不执行操作,摄像头还是以环境光亮度匹配的曝光参数运行。
若比对结果反映出人脸图像的图像亮度与标准亮度的差值不小于预设值,自动曝光模块还用于依据比对结果调整环境光亮度匹配的曝光参数,得到曝光参数调整值。自动曝光模块再执行步骤S19、将曝光参数调整值向AO模块传输,AO模块接收到曝光参数调整值后,执行步骤S20、向摄像头驱动发送曝光参数调整值。摄像头驱动以步骤S21所示,配置摄像头以曝光参数调整值运行拍摄图像。
自动曝光模块调整曝光参数的截止条件如前所述。在一些实施例中,自动曝光模块最后一次调整曝光参数之后的曝光参数调整值会发送至AO模块,由AO模块更新到环境光亮度与曝光参数的对应关系表中,作为该对应关系表中环境光传感器检测的环境光亮度匹配的曝光参数。随着自动曝光模块执行调整摄像头的曝光参数流程,环境光亮度与曝光参数的对应关系表会被随之更新,并且,更新后的环境光亮度与曝光参数的对应关系表会被AO模块保存到内存。该内存则指代图4a中展示的内部存储器421。
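The following sketch mirrors one adjustment round and the table update described above. The text only states that the difference between image brightness and standard brightness drives the adjustment; the proportional update rule, the ±10% tolerance and the parameter names are illustrative assumptions:

```python
def adjust_exposure(image_brightness, standard_brightness,
                    exposure_us, gain, tolerance=0.10, max_exposure_us=4025):
    """One adjustment round. Returns None when the brightness already meets the
    standard brightness (difference within the tolerance), otherwise an
    adjusted (exposure_us, gain)."""
    error = (standard_brightness - image_brightness) / float(standard_brightness)
    if abs(error) < tolerance:
        return None                                   # keep the current parameters
    new_exposure = int(min(max_exposure_us, max(1, exposure_us * (1.0 + error))))
    return new_exposure, gain                         # gain could be scaled the same way

def remember_final_adjustment(lux_to_exposure, ambient_lux_interval, exposure_us, gain):
    # After the last adjustment round, the final values are written back into the
    # ambient-brightness / exposure-parameter table for the current interval.
    lux_to_exposure[ambient_lux_interval] = (gain, exposure_us)
```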
如前所述,摄像头在暗环境下以通用的曝光参数运行拍摄的图像,是图6中展示的图像11。本实施例中,摄像头也处于暗环境,但以环境光亮度匹配的曝光参数运行拍摄,所拍摄的图像则是图6中展示的图像14。自动曝光模块获取图像14的图像数据,利用图像14的图像数据,经前述内容提出的处理流程,调整摄像头被配置的环境光亮度匹配的曝光参数,得到曝光参数调整值。摄像头以曝光参数调整值拍摄的图像即为满足亮度要求的图像15。
同理，摄像头在强光的亮环境下以通用的曝光参数运行拍摄的图像，是图6中展示的图像21。本实施例中，摄像头也处于强光的亮环境，但以环境光亮度匹配的曝光参数运行拍摄，所拍摄的图像则是图6中展示的图像24。自动曝光模块获取图像24的图像数据，利用图像24的图像数据，经前述内容提出的处理流程，调整摄像头被配置的环境光亮度匹配的曝光参数，得到曝光参数调整值。摄像头以曝光参数调整值拍摄的图像即为满足亮度要求的图像25。
由此可以看出:AO模块从环境光亮度与曝光参数的对应关系表中,筛查出环境光传感器检测的环境光亮度匹配的曝光参数,并通过摄像头驱动配给摄像头,能够保证摄像头以该曝光参数运行所拍摄图像的图像亮度相对合理,不过亮或过暗,如此,自动曝光模块不需要反复多次调整曝光参数,减少自动曝光模块调整摄像头的曝光参数的耗时。
并且,自动曝光模块较快的调整出符合图像亮度要求的曝光参数,AO模块能够以图像亮度满足要求的图像评判图像人眼是否注视显示屏,保证了AO模块评判的准确度。
实施例二
为解决前述问题二,本申请另一实施例还提供了一种控制器,控制器可以理解成图4a展示的处理器的一个处理单元。参见图7,控制器包括AO模块、自动曝光模块和摄像头驱动。AO模块、自动曝光模块和摄像头驱动均为控制器的逻辑单元,功能如前述内容。
本实施例中,AO模块被配置有环境光亮度与标准亮度的对应关系表。在一些实施例中,环境光亮度与标准亮度的对应关系表,包括多个环境光亮度和每个环境光亮度匹配的标准亮度。在另一些实施例中,环境光亮度与标准亮度的对应关系表包括多个环境光亮度区间和每个环境光亮度区间匹配的标准亮度。
标准亮度是指通过调整摄像头的曝光参数，使摄像头拍摄的图像所要达到的目标亮度。标准亮度的取值范围为：0~1024。不同的环境光亮度下，标准亮度略有差异，因此，构建不同环境光亮度和标准亮度的对应关系表。
在一些实施例中,环境光亮度与标准亮度的对应关系表的生成方式为:
电子设备的前置摄像头在一个环境光亮度下被配置以多种曝光参数运行拍照,采集前置摄像头在不同环境光亮度下拍摄的人眼注视显示屏时的图像,得到不同环境光亮度下的多帧样本图像。
针对每一个环境光亮度,按照下述方式处理该环境光亮度下的多帧样本图像,得到该环境光亮度匹配的标准亮度。
AO模块利用一帧样本图像的图像数据,识别样本图像中人眼是否注视显示屏。若AO模块识别图像中人眼未注视显示屏,则忽略该样本图像,采集该环境光亮度的下一帧样本图像;若AO模块识别图像中人眼注视显示屏,则记录该样本图像的图像亮度,并采集该环境光亮度的下一帧样本图像。如此,得到一个环境光亮度下记录的多帧样本图像的图像亮度,利用一个环境光亮度下记录的多帧样本图像的图像亮度,计算得到该环境光亮度匹配的标准亮度。
一些实施例中,选择记录的多帧样本图像的图像亮度的中间值作为该环境光亮度匹配的标准亮度;另一些实施例中,计算记录的多帧样本图像的图像亮度的平均值,作为该环境光亮度匹配的标准亮度;另一些实施例中,记录的多帧样本图像中,随机选择一帧样本图像的图像亮度,作为该环境光亮度匹配的标准亮度。
在一些实施例中,针对选择的每一帧样本图像,利用该样本图像的每个像素点的红(Red)、绿(Green)、蓝(Blue)分量,计算每个像素点的红(Red)、绿(Green)、蓝(Blue)分量的平均值,作为样本图像的图像亮度。
下表3提供了一种环境光亮度与标准亮度的对应关系表,该对应关系中展示了多个环境光亮度区间所匹配的标准亮度。
表3
环境光亮度(lux) | 标准亮度(0~1024) |
---|---|
小于150 | 480 |
150-250 | 488 |
250~350 | 503 |
350-450 | 508 |
450-550 | 510 |
550-650 | 511 |
650-750 | 512 |
750-850 | 512 |
850-950 | 512 |
950-1050 | 512 |
1050-1300 | 513 |
1350-1550 | 515 |
1550-2000 | 520 |
2000-4000 | 525 |
大于4000 | 532 |
本实施例中,AO模块如图7所示,执行步骤S31、获取环境光传感器检测的环境光亮度,并从环境光亮度与标准亮度的对应关系表进行筛查,筛查出环境光传感器检测的环境光亮度匹配的标准亮度;再如步骤S32所示,向自动曝光模块发送筛查出环境光亮度匹配的标准亮度。
摄像头以配置的初始曝光参数运行,执行步骤S33、拍摄用户的人脸图像,并发送人脸图像。最简图像前端单元执行步骤S34、读取人脸图像,基于安全机制将人脸图像存储于内存的安全buffer。
在一些实施例中,摄像头被配置的初始曝光参数,可以为提前设定的通用曝光参数;在另一些实施例中,摄像头被配置的初始曝光参数,还可以为前述实施例一所述,是环境光亮度匹配的曝光参数。
AO模块执行步骤S35-1、获取内存的安全buffer存储的人脸图像的图像数据,再通过分析图像数据,确定用户人眼是否注视显示屏。AO模块确定用户人眼注视显示屏时,执行步骤S36、控制电子设备的显示屏不熄屏。AO模块分析图像数据,确定用户人眼是否注视显示屏的方式,可如下述实施例三的内容。
自动曝光模块用于从AO模块中获取标准亮度，还用于执行步骤S35-2、获取内存的安全buffer存储的人脸图像的图像数据，并利用人脸图像的图像数据，计算人脸图像的图像亮度；将人脸图像的图像亮度与AO模块发送的筛查出的标准亮度做比对，得到比对结果。在一些实施例中，自动曝光模块获取的人脸图像的图像数据包括人脸图像中每个像素点的红(Red)、绿(Green)、蓝(Blue)分量，自动曝光模块计算每个像素点的红(Red)、绿(Green)、蓝(Blue)分量的平均值，作为人脸图像的图像亮度。
若比对结果反映出人脸图像的图像亮度与标准亮度的差值小于预设值，自动曝光模块则不执行操作，摄像头还是以初始曝光参数运行。若比对结果反映出人脸图像的图像亮度与标准亮度的差值不小于预设值，自动曝光模块还用于依据比对结果调整摄像头被配置的初始曝光参数，得到曝光参数调整值。
自动曝光模块执行步骤S37、将曝光参数调整值向AO模块发送,AO模块接收曝光参数调整值后,执行步骤S38、向摄像头驱动传输曝光参数调整值。摄像头驱动执行步骤S39、配置摄像头以曝光参数调整值运行拍摄图像。
在一个示例中,自动曝光模块获取通用的标准亮度,该通用的标准亮度下的图像如图8展示的图像13a。摄像头在暗环境下拍摄的图像为图8展示的图像11,自动曝光模块获取图像11的图像数据,将该图像数据与通用的标准亮度比对,可以理解是与图像13a的图像亮度比对,按照比对结果调整摄像头的曝光参数,摄像头会依次拍摄得到图像12a和图像13a,且自动曝光模块判断在拍摄得到图像13a时,其图像亮度满足标准亮度要求。但是,图像13a的图像亮度并不高,以此图像提供AO模块进行图像人眼是否注视显示屏的判断,非常容易误判。
本实施例中,自动曝光模块获取的标准亮度与环境光亮度相匹配,该标准亮度下的图像如图8展示的图像13b。自动曝光模块获取图像11的图像数据,将该图像数据与环境光亮度匹配的标准亮度比对,即与图像13b的图像亮度做比对,并按照比对结果调整摄像头的曝光参数,摄像头会依次拍摄得到图像12b和图像13b,且自动曝光模块判断图像13b的图像亮度满足标准亮度要求。由图8可以看出,图像13b的图像亮度也较为合理。如此,AO模块以该图像进行图像人眼是否注视显示屏的判断,准确度高。
同理,在强光的亮环境下,通用的标准亮度下的图像如图8展示的图像23a,环境光亮度匹配的标准亮度的图像如图8展示的图像23b。摄像头拍摄的图像为图像21,按照通用的标准亮度调整摄像头的曝光参数,摄像头会依次拍摄得到图像22a和图像23a,图像亮度越来越高,按照环境光亮度匹配的标准亮度调整摄像头的曝光参数,摄像头依次拍摄得到的图像为图像22b和图像23b。
由此可以看出:AO模块从环境光亮度与标准亮度的对应关系表中,筛查出环境光传感器检测的环境光亮度匹配的标准亮度,并发送至自动曝光模块。自动曝光模块以环境光亮度匹配的标准亮度为基准,调整摄像头的曝光参数,以使摄像头以曝光参数调整值进行拍摄,得到满足环境光亮度匹配的标准亮度要求的图像,保证了摄像头拍摄图像的图像质量,提高了AO模块判断图像中人眼是否注视显示屏的准确度。
实施例三
为解决前述问题三,本申请另一实施例还提供了一种控制器,控制器可以理解成图4a展示的处理器的一个处理单元。参见图9,控制器包括AO模块、自动曝光模块和摄像头驱动。AO模块、自动曝光模块和摄像头驱动均为控制器的逻辑单元,功能如前述内容。
AO模块被配置有环境光亮度与置信度门限值的对应关系表。在一些实施例中,环境光亮度与置信度门限值的对应关系表,包括多个环境光亮度和每个环境光亮度匹配的置信度门限值。在另一些实施例中,环境光亮度与置信度门限值的对应关系表,包括多个环境光亮度区间,以及每个环境光亮度区间匹配的置信度门限值。
在一个可能的实施方式中,环境光亮度与置信度门限值的对应关系表中,暗光区间和强光区间匹配的置信度门限值,小于正常光下的置信度门限值。
下表4提供了一种环境光亮度和置信度门限值的对应关系,该对应关系中展示了三个环境光亮度区间所匹配的置信度门限值,以及原始置信度门限值。
表4
环境光亮度 | 原始置信度门限值 | 置信度门限值 |
---|---|---|
<10lux | 0.95 | 0.9 |
10lux-8W lux | 0.95 | 0.95 |
>8W lux | 0.95 | 0.9 |
表4中,小于10lux的环境光亮度区间被认定为暗光区间,大于8W lux的环境光亮度区间被认定为强光区间,10lux-8W lux的环境光区间属于正常光区间。表4展示了一种暗光区间、强光区间和正常光区间的划分方式的示例,但暗光区间、强光区间和正常光区间的划分方式并不限于表4中提出的具体数值。
表4中，小于10lux的环境光亮度区间，以及大于8W lux的环境光亮度区间，匹配的原始置信度门限值与10lux-8W lux的环境光亮度区间的原始置信度门限值相同，均为0.95。由于暗光和强光下，摄像头拍摄图像的图像质量不高，欠曝或过曝，若置信度门限值仍与正常光下的置信度门限值相同，并不合理。在表4中展示的本实施例方案中，小于10lux的环境光亮度区间，以及大于8W lux的环境光亮度区间，对应的置信度门限值设置为0.9，小于10lux-8W lux的环境光亮度区间所匹配的置信度门限值0.95。
表4展示的是三个环境光亮度区间各自对应的一种置信度门限值，三个环境光亮度区间对应的置信度门限值并不仅限于表4中的数值，可以根据实际情况进行调整。并且，环境光亮度和置信度门限值的对应关系中，环境光亮度区间也不限于表4提供的三种环境光亮度区间，在一些实施例中，环境光亮度和置信度门限值的对应关系中可以包括多于3个的环境光亮度区间，及每个环境光亮度区间匹配的置信度门限值。
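A sketch of the Table 4 threshold selection; here "8W lux" is read as 80,000 lux, and the exact boundary handling is an assumption:

```python
LOW_LIGHT_LUX = 10          # below this: dim-light interval
STRONG_LIGHT_LUX = 80_000   # above this: strong-light interval ("8W lux")

def confidence_threshold(ambient_lux):
    """Dim and strong light use the lowered threshold 0.90;
    normal light keeps the original 0.95 (Table 4)."""
    if ambient_lux < LOW_LIGHT_LUX or ambient_lux > STRONG_LIGHT_LUX:
        return 0.90
    return 0.95
```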
本实施例中,参见图9,AO模块执行步骤S41、获取环境光传感器检测的环境光亮度,并从环境光亮度与置信度门限值的对应关系表进行筛查,筛查出环境光传感器检测的环境光亮度匹配的置信度门限值。
AO模块还可以获取标准亮度，并执行步骤S42、向自动曝光模块发送标准亮度。在一些实施例中，AO模块获取的标准亮度可以是通用标准亮度。在另一些实施例中，AO模块获取的标准亮度可以如前述实施例二所述，是环境光亮度匹配的标准亮度。
摄像头以配置的初始曝光参数运行,执行步骤S43、拍摄用户的人脸图像。最简图像前端单元执行步骤S44、读取人脸图像,基于安全机制将人脸图像存储于内存的安全buffer。
在一些实施例中,摄像头被配置的初始曝光参数,可以为提前设定的通用曝光参数;在另一些实施例中,摄像头被配置的初始曝光参数,还可以为前述实施例一所述,是环境光亮度匹配的曝光参数。
AO模块还用于按照图像的存储顺序，执行步骤S45-1，依次获取内存的安全buffer存储的图像序列的每一帧人脸图像的图像数据。针对AO模块获取的每一帧人脸图像的图像数据，AO模块比对人脸图像的图像数据和样本特征库，根据人脸图像的图像数据和样本特征库的比对结果，对图像配置置信度，该置信度用于表征图像中人眼注视显示屏的概率。
在一些实施例中,样本特征库中包括人眼注视显示屏图像的特征数据。该特征数据的确定方式为:获取大量的样本图像,样本图像包括人眼注视显示屏的样本图像和人眼未注视显示屏的样本图像,利用每一个样本图像的图像数据进行学习,得到表征人眼注视显示屏图像的特征数据。
AO模块还用于判断人脸图像的置信度是否小于筛查出的置信度门限值。AO模块执行图9所示的步骤S46、判断人脸图像的置信度不小于置信度门限值,则确定人脸图像中人眼注视显示屏,进而控制电子设备的显示屏不熄屏。
AO模块判断人脸图像的置信度小于置信度门限值,则确定人脸图像中人眼未注视显示屏。
自动曝光模块用于从AO模块中获取标准亮度，还用于执行步骤S45-2、获取内存的安全buffer存储的人脸图像的图像数据，并利用人脸图像的图像数据，计算人脸图像的图像亮度；将人脸图像的图像亮度与AO模块发送的标准亮度做比对，得到比对结果。在一些实施例中，自动曝光模块获取的人脸图像的图像数据包括人脸图像中每个像素点的红(Red)、绿(Green)、蓝(Blue)分量，自动曝光模块计算每个像素点的红(Red)、绿(Green)、蓝(Blue)分量的平均值，作为人脸图像的图像亮度。
若比对结果反映出人脸图像的图像亮度与标准亮度的差值小于预设值，自动曝光模块则不执行操作，摄像头还是以初始曝光参数运行。若比对结果反映出人脸图像的图像亮度与标准亮度的差值不小于预设值，自动曝光模块还用于依据比对结果调整摄像头被配置的初始曝光参数，得到曝光参数调整值。
自动曝光模块执行步骤S47、将曝光参数调整值向AO模块发送,AO模块接收曝光参数调整值后,执行步骤S48、向摄像头驱动传输曝光参数调整值。摄像头驱动执行步骤S49、配置摄像头以曝光参数调整值运行拍摄图像。
在一些实施例中,AO模块确定连续多帧图像有一帧图像中的人眼注视显示屏,则执行控制电子设备的显示屏不熄屏,与按照电子设备的显示屏被设定的屏幕熄灭时间来控制显示屏熄屏的方案进行结合,结合的方式如下。
假设电子设备被设定的屏幕熄灭时间为15秒。在电子设备的显示屏被启动显示数据时刻起启动计时,计时满规定时间,如7秒后,电子设备的前置摄像头拍摄图像,AO模块依次获取图像,并针对获取的每一帧图像,执行下述操作:
比对图像的图像数据和样本特征库,根据图像的图像数据和样本特征库的比对结果,对图像配置置信度。判断图像被配置的置信度是否小于筛查出的置信度门限值,若图像被配置的置信度不小于置信度门限值,则确定图像中人眼注视显示屏;若图像被配置的置信度小于置信度门限值,则确定图像中人眼未注视显示屏。
在计时满7秒开始,一直到计时满15秒的时间段内,AO模块确定有一帧图像中人眼注视显示屏,则控制显示屏不熄屏。若AO模块持续确定图像中人眼均未注视显示屏,则在计时满15秒的时刻,控制显示屏熄屏。
需要说明的是,以上内容是以屏幕熄灭时间为15秒为示例对AO模块的处理流程进行介绍,但这并不构成对其的限定。
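A sketch of the timing logic above (screen-off time 15 s, frame checks starting at 7 s). The four callables stand in for the camera, the AO gaze judgement and the display control; they are assumptions for illustration, not interfaces defined by this application:

```python
import time

def supervise_screen(capture_frame, gazing_at_screen, keep_screen_on, turn_screen_off,
                     screen_off_s=15.0, check_from_s=7.0, poll_s=0.1):
    """Keep the display on if any frame captured between check_from_s and
    screen_off_s shows eyes gazing at the screen; otherwise switch it off."""
    start = time.monotonic()
    while True:
        elapsed = time.monotonic() - start
        if elapsed >= screen_off_s:
            turn_screen_off()               # no gazing frame seen before the deadline
            return False
        if elapsed >= check_from_s:
            if gazing_at_screen(capture_frame()):
                keep_screen_on()            # one gazing frame is enough
                return True
        time.sleep(poll_s)                  # illustrative polling interval
```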
以下以一个示例对本实施例的方案进行说明。在一个示例中,用户在暗光下观看电子设备的显示屏,摄像头处于暗环境下依次拍摄的图像为图10展示的图像11至图像15。AO模块被配置的原始置信度门限值为0.95。
如图10所示,AO模块获取图像11的图像数据,对图像11配置置信度为0.89,判断图像11的置信度0.89小于置信度门限值0.95,判定图像11中人眼未注视显示屏。AO模块获取图像12的图像数据,对图像12配置置信度为0.89,判断图像12的置信度0.89小于置信度门限值0.95,同样也判定图像12中人眼未注视显示屏。同理,AO模块针对图像13、图像14和图像15配置置信度分别为0.89、0.92和0.94,且分别判断图像13的置信度0.89小于置信度门限值0.95,图像14的置信度0.92小于置信度门限值0.95,图像15的置信度0.94也小于置信度门限值0.95,判定图像13、图像14和图像15中人眼未注视显示屏。
本实施例中,AO模块获取环境光检测传感器检测的环境光亮度,用该环境光亮度,从环境光亮度与置信度门限值的对应关系表进行筛选,筛选出的置信度门限值为0.90。
如图10所示，AO模块同样获取图像11的图像数据，对图像11配置置信度为0.89。但判断图像11的置信度0.89小于置信度门限值0.90，判定图像11中人眼未注视显示屏；获取图像12的图像数据，对图像12配置置信度为0.89，判断图像12的置信度0.89也小于置信度门限值0.90，同样也判定图像12中人眼未注视显示屏。但是，AO模块针对图像13、图像14和图像15配置置信度分别为0.89、0.92和0.94，且分别判断图像13的置信度0.89小于置信度门限值0.90，图像14的置信度0.92不小于置信度门限值0.90，图像15的置信度0.94也不小于置信度门限值0.90，判定图像13中人眼未注视显示屏，且判定图像14和图像15中人眼均注视显示屏。AO模块确定有一帧图像中人眼注视显示屏，控制显示屏不熄屏。
上述示例展示的是用户在暗环境下观看电子设备的显示屏的场景,用户在强光的亮环境下观看电子设备,AO模块采用原始的置信度门限值,和从环境光亮度值和置信度门限值的对应关系表中筛选出的置信度门限值,分别进行图像人眼是否注视显示屏的结果,与在暗光下的场景基本等同。
由上述内容可以看出:暗环境或强光的亮环境下,摄像头拍摄图像的图像亮度过低或过高,图像欠曝或过曝,AO模块对图像配置的置信度会降低。对应的,AO模块从环境光亮度值和置信度门限值的对应关系表中,也能筛选出的数值较低的置信度门限值。如此,避免了AO模块由于图像亮度不够或过高导致配置的置信度小于门限值,导致AO模块的误判,影响AO模块的检测准确度的问题。
上述实施例提出的方案,降低暗环境和强光的亮环境的置信度门限值,很有可能带来的是AO模块检测图像中人眼注视显示屏误判的问题,为此,在一个可选的实施方式中,AO模块还用于执行人脸检测事件,用以来辅助判断图像中人眼注视显示屏。
在一些实施例中，AO模块判断图像的置信度不小于筛查出的置信度门限值之后，还用于执行人脸检测事件，并判断人脸检测事件的结果是否为检测到人脸。若判断人脸检测事件的结果为检测到人脸，则确定图像中人眼注视显示屏；若判断人脸检测事件的结果为未检测到人脸，则确定图像中人眼未注视显示屏。
在另一些实施例中，AO模块在执行判断图像的置信度是否小于筛查出的置信度门限值之前，还用于执行人脸检测事件，并判断人脸检测事件的结果是否为检测到人脸。若判断人脸检测事件的结果为检测到人脸，则判断图像的置信度是否小于筛查出的置信度门限值，若判断图像的置信度不小于筛查出的置信度门限值，则确定图像中人眼注视显示屏。若判断人脸检测事件的结果为未检测到人脸，或者判断图像的置信度小于筛查出的置信度门限值，则确定图像中人眼未注视显示屏。
在另一些实施例中,AO模块并行执行判断图像的置信度是否小于筛查出的置信度门限值,和执行人脸检测事件,判断人脸检测事件的结果是否为检测到人脸;若AO模块判断图像的置信度不小于筛查出的置信度门限值,且判断人脸检测事件的结果为检测到人脸,则确定图像中人眼注视显示屏;否则,均确定图像中人眼未注视显示屏。
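The third combination above (confidence check and face detection must both pass) could look like the following sketch; confidence_of, face_detected and threshold_for are placeholders for the AO module's internal steps, not names used by this application:

```python
def gazing_at_display(frame, ambient_lux, confidence_of, face_detected, threshold_for):
    """Eyes count as gazing at the display only when the confidence configured
    from the sample feature library is not below the (possibly lowered)
    threshold AND a face was actually detected in the frame."""
    return (confidence_of(frame) >= threshold_for(ambient_lux)
            and face_detected(frame))
```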
AO模块执行人脸检测事件的实施方式很多,本实施例不做限定,以下提供三种方式:
方式一、AO模块通过Haar-Like特征与Adaboost算法实现人脸检测。该方式中,采用Haar-Like特征表示人脸,对各Haar-Like特征进行训练得到弱分类器,通过Adaboost算法选择多个最能代表人脸的弱分类器构造成强分类器,将若干个强分类器串联组成一个级联结构的层叠分类器,即人脸检测器。
AO模块提取摄像头拍摄图像的Haar-Like特征,调用人脸检测器处理提取的Haar-Like特征,得到人脸识别结果,该人脸识别结果表征摄像头拍摄的图像是否包含人脸。
方式二、AO模块通过多尺度块状局部二值模式(Multi-scale Block based Local Binary Patterns,MBLBP)特征与Adaboost算法实现人脸检测。该方式中,采用可表示基准框与8个邻域框的人脸图像信息的MBLBP特征表示人脸,通过比较基准框的平均灰度同周围8个邻域框各自的平均灰度计算MBLBP特征,同样对MBLBP特征训练得到弱分类器,通过Adaboost算法选择多个最能代表人脸的弱分类器构造成强分类器,将若干个强分类器串联组成一个级联结构的层叠分类器,即人脸检测器。
AO模块提取基准框与8个邻域框的图像信息的MBLBP特征,通过比较基准框的平均灰度同周围8个邻域框各自的平均灰度计算MBLBP特征。调用人脸检测器处理MBLBP特征,得到人脸识别结果,该人脸识别结果表征摄像头拍摄的图像是否包含人脸。
方式三、AO模块通过多尺度的结构化定序测量特征(Multi-scale Structured Ordinal Features,MSOF)与Adaboost算法实现人脸检测。该方式中,采用可表示基准框与8个邻域框的人脸图像信息的MSOF特征表示人脸,8个邻域框相对于基准框的距离可调,且基准框与8个邻域框可以不相连。同样对MSOF特征训练得到弱分类器,通过Adaboost算法选择多个最能代表人脸的弱分类器构造成强分类器,将若干个强分类器串联组成一个级联结构的层叠分类器,即人脸检测器。
AO模块提取基准框与8个邻域框的图像信息的MSOF特征。调用人脸检测器处理MSOF特征,得到人脸识别结果,该人脸识别结果表征摄像头拍摄的图像是否包含人脸。
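As a concrete illustration of the first approach (Haar-like features with an Adaboost cascade), OpenCV's pretrained frontal-face cascade can serve as a stand-in; it is not the detector trained in this application:

```python
import cv2

# Pretrained Haar cascade shipped with opencv-python, used here only as an
# illustrative stand-in for the cascade face detector described above.
_FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_detected(bgr_image):
    """Return True when at least one face is found in the frame."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    faces = _FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) > 0
```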
以下以一个示例对本实施例的方案进行说明。本示例中，用户在暗光下观看电子设备的显示屏，摄像头处于暗环境下依次拍摄的图像为图11展示的图像11至图像15。AO模块获取环境光检测传感器检测的环境光亮度，用该环境光亮度，从环境光亮度与置信度门限值的对应关系表中进行筛选，筛选出的置信度门限值为0.90。
如图11所示，AO模块获取图像11的图像数据，对图像11配置置信度为0.89，判断图像11的置信度0.89小于置信度门限值0.90，判定图像11中人眼未注视显示屏；获取图像12的图像数据，对图像12配置置信度为0.89，判断图像12的置信度0.89也小于置信度门限值0.90，同样也判定图像12中人眼未注视显示屏。同理，AO模块针对图像13、图像14和图像15配置置信度分别为0.91、0.92和0.94，且分别判断图像13的置信度0.91不小于置信度门限值0.90，图像14的置信度0.92不小于置信度门限值0.90，图像15的置信度0.94也不小于置信度门限值0.90，判定图像13、图像14和图像15中人眼均注视显示屏。AO模块在连续五帧图像中确定有图像中人眼注视显示屏，控制显示屏不熄屏。但是，由图11中的图像15可以看出：用户人眼并未注视显示屏。因此，控制显示屏不熄屏则为误判操作。
本示例中，AO模块针对图像11至图像15执行人脸检测事件的结果均为：未检测到人脸。如此，AO模块针对图像11，判断图像11的置信度0.89小于置信度门限值0.90，且确定人脸检测事件结果是未检测到人脸，因此确定图像11中人眼未注视显示屏；同理，AO模块也确定图像12、图像13、图像14和图像15中人眼均未注视显示屏，控制显示屏熄屏。
由上述示例可以看出:在暗环境和强光的亮环境的置信度门限值被降低时,AO模块对摄像头拍摄的图像进行人脸检测事件,以辅助图像中人眼注视显示屏的检测,避免AO模块以降低后的置信度门限值执行图像中人眼注视显示屏的检测的误判的问题。
实施例四
本申请另一实施例提供了一种显示的控制方法,该显示的控制方法应用于电子设备,该电子设备如图4a所示,包括处理器、环境光传感器、摄像头和显示屏,处理器包括如前述实施例一、实施例二或者实施例三提供的控制器。
参见图12,本实施例提供的显示的控制方法,包括:
S1201、AO模块获取环境光传感器检测的环境光亮度。
S1202a、AO模块从环境光亮度与曝光参数的对应关系表进行筛查,筛查出环境光传感器检测的环境光亮度匹配的曝光参数。
其中,本步骤的具体实现方式可参见实施例一内容,此处不再赘述。
实施例一展示了以列表的形式体现环境光亮度与曝光参数的对应关系，但这不构成对环境光亮度和曝光参数的对应关系的展示形式的限定。
S1202b、AO模块从环境光亮度与标准亮度的对应关系表进行筛查,筛查出环境光传感器检测的环境光亮度匹配的标准亮度。
其中,本步骤的具体实现方式可参见实施例二内容,此处不再赘述。
同理，实施例二展示了以列表的形式体现环境光亮度与标准亮度的对应关系，但这不构成对环境光亮度和标准亮度的对应关系的展示形式的限定。
S1202c、AO模块从环境光亮度与置信度门限值的对应关系表进行筛查,筛查出环境光传感器检测的环境光亮度匹配的置信度门限值。
其中,本步骤的具体实现方式可参见实施例三内容,此处不再赘述。
同理，实施例三展示了以列表的形式体现环境光亮度与置信度门限值的对应关系，但这不构成对环境光亮度和置信度门限值的对应关系的展示形式的限定。
图12展示了S1202a、S1202b和S1202c三个步骤的一种执行顺序,但图12展示的执行顺序并不构成S1202a、S1202b和S1202c三个步骤执行顺序的限定,三个步骤也可以采用其他执行顺序,也可以并行执行。
S1203a、AO模块向自动曝光模块发送筛查出的曝光参数和标准亮度。
S1203b、AO模块向摄像头驱动发送筛查出的曝光参数。
S1204、摄像头驱动配置摄像头以与环境光亮度匹配的曝光参数运行拍摄图像,拍摄的图像存储到内存。
摄像头将拍摄的图像存储到内存过程中,自动曝光模块执行步骤S1205至S1209,AO模块执行步骤S1212至S1217。自动曝光模块和AO模块可并行运行,互不干扰。
并且，自动曝光模块执行步骤S1205至S1209的过程，可以理解成显示的控制方法中对摄像头的曝光参数进行调整的方案。
S1205、自动曝光模块获取内存存储的人脸图像的图像数据。
S1206、自动曝光模块利用人脸图像的图像数据,计算人脸图像的图像亮度。
S1207、自动曝光模块将人脸图像的图像亮度与标准亮度做比对,得到比对结果。
若比对结果反映出人脸图像的图像亮度与标准亮度的差值小于预设值,自动曝光模块则不执行操作,摄像头还是以AO模块筛查出的曝光参数运行。
若比对结果反映出人脸图像的图像亮度与标准亮度的差值不小于预设值,自动曝光模块执行S1208、依据比对结果调整曝光参数,得到曝光参数调整值。
S1209、自动曝光模块向AO模块发送曝光参数调整值。
S1210、AO模块向摄像头驱动发送曝光参数调整值。
S1211、摄像头驱动配置摄像头以曝光参数调整值运行拍摄图像,保存拍摄的图像到内存。
S1212、AO模块获取内存存储的人脸图像的图像数据。
S1213、针对AO模块获取的每一帧人脸图像的图像数据,AO模块比对人脸图像的图像数据和样本特征库,根据人脸图像的图像数据和样本特征库的比对结果,对图像配置置信度,该置信度用于表征图像中人眼注视显示屏的概率。
S1214、AO模块判断人脸图像的置信度是否小于筛查出的置信度门限值。
S1215、AO模块判断人脸图像的置信度不小于置信度门限值,则确定人脸图像中人眼注视显示屏。
S1216、AO模块判断人脸图像的置信度小于置信度门限值,则确定人脸图像中人眼未注视显示屏。
AO模块确定连续多帧图像的一帧人脸图像中的人眼注视显示屏,则AO模块执行S1217、控制电子设备的显示屏不熄屏。
在一些实施例中，AO模块确定连续多帧图像中的人眼均未注视显示屏，则AO模块控制电子设备的显示屏熄屏。具体的，该连续多帧图像，是摄像头在按照电子设备的屏幕熄灭时间来确定的时长内拍摄的，按照电子设备的屏幕熄灭时间来确定时长的方式可参考实施例三的内容。
在一个实施方式中,步骤S1212之后,AO模块还可以针对获取的每一帧人脸图像,执行人脸检测事件,判断人脸检测事件的结果是否为检测到人脸,在AO模块判断一帧人脸图像的人脸检测事件的结果为检测到人脸,且步骤S1214的判断结果为图像的置信度不小于门限值,则执行步骤S1215。
本实施方式的具体内容,可参见上述实施例三的内容,此处不再赘述。
Claims (56)
- 一种参数的调整方法,其特征在于,应用于电子设备,所述电子设备包括前置摄像头和显示屏,所述参数的调整方法包括:在所述显示屏显示数据时,获取所述前置摄像头响应第一指令,以初始曝光参数运行拍摄的图像,所述初始曝光参数与环境光亮度匹配;利用所述图像的图像数据,计算所述图像的图像亮度;以所述图像的图像亮度与标准亮度的差值,调整所述摄像头的初始曝光参数,得到曝光参数调整值,所述曝光参数调整值用于所述前置摄像头响应第二指令,以所述曝光参数调整值运行拍摄图像。
- 根据权利要求1所述的参数的调整方法,其特征在于,在所述显示屏显示数据时,获取所述前置摄像头响应第一指令,以初始曝光参数运行拍摄的图像之前,还包括:利用所述环境光亮度,确定所述初始曝光参数。
- 根据权利要求2所述的参数的调整方法,其特征在于,所述利用所述环境光亮度,确定所述初始曝光参数,包括:基于环境光亮度与曝光参数的对应关系,确定与所述环境光亮度匹配的曝光参数。
- 根据权利要求3所述的参数的调整方法,其特征在于,所述环境光亮度和曝光参数的对应关系的生成方法,包括:获取多组样本图像,一组所述样本图像对应一个环境光亮度,且一组所述样本图像包括多个样本图像,每个所述样本图像对应一个曝光参数;利用多组所述样本图像对应的环境光亮度,以及多组所述样本图像中的人眼注视显示屏的样本图像对应的曝光参数,生成所述环境光亮度和曝光参数的对应关系。
- 根据权利要求3所述的参数的调整方法,其特征在于,所述环境光亮度和曝光参数的对应关系的生成方法,包括:获取多个历史曝光参数调整值,以及每个所述历史曝光参数调整值对应的环境光亮度,每个所述历史曝光参数调整值满足所述标准亮度的要求;利用所述多个历史曝光参数调整值,以及每个所述历史曝光参数调整值对应的环境光亮度,生成所述环境光亮度和曝光参数的对应关系。
- 根据权利要求3所述的参数的调整方法,其特征在于,所述以所述图像的图像亮度与标准亮度的差值,调整所述前置摄像头的初始曝光参数,得到曝光参数调整值之后,还包括:以所述曝光参数调整值更新所述环境光亮度和曝光参数的对应关系。
- 根据权利要求1至6中任意一项所述的参数的调整方法,其特征在于,所述初始曝光参数包括:曝光时长、模拟增益和数字增益中的至少一个。
- 根据权利要求1至7中任意一项所述的参数的调整方法,其特征在于,在所述以所述图像的图像亮度与标准亮度的差值,调整所述前置摄像头的初始曝光参数,得到曝光参数调整值之前,还包括:利用所述环境光亮度,确定所述标准亮度。
- 根据权利要求8所述的参数的调整方法,其特征在于,所述利用所述环境光亮度,确定所述标准亮度,包括:基于环境光亮度与标准亮度的对应关系,确定与所述环境光亮度匹配的标准亮度。
- 根据权利要求9所述的参数的调整方法,其特征在于,所述环境光亮度与标准亮度的对应关系的生成方法,包括:获取多组样本图像,一组所述样本图像对应一个环境光亮度,且一组所述样本图像包括多个样本图像;利用多组所述样本图像对应的环境光亮度,以及多组所述样本图像中的人眼注视显示屏的样本图像的图像亮度,生成所述环境光亮度和标准亮度的对应关系。
- 根据权利要求9所述的参数的调整方法,其特征在于,所述环境光亮度与标准亮度的对应关系包括:多个环境光亮度区间和每个所述环境光亮度区间匹配的标准亮度。
- 根据权利要求1至11中任意一项所述的参数的调整方法,其特征在于,利用所述图像的图像数据,计算所述图像的图像亮度,包括:获取所述图像包括的每个像素点的RGB分量;计算所述图像的每个像素点的RGB分量的均值,所述均值作为所述图像的图像亮度。
- 根据权利要求1至12中任意一项所述的参数的调整方法,其特征在于,在所述显示屏显示数据时,获取所述前置摄像头响应第一指令,以初始曝光参数运行拍摄的图像之后,还包括:根据所述图像的图像数据和样本特征库的比对结果,对所述图像配置置信度,所述置信度用于表征所述图像中人眼注视显示屏的概率;若所述图像的置信度不小于预设门限值,则控制所述显示屏不熄屏,所述预设门限值与环境光亮度匹配。
- 根据权利要求13所述的参数的调整方法,其特征在于,所述控制所述显示屏不熄屏之前,还包括:确定所述前置摄像头在预设时段内拍摄的图像中,存在一帧图像的置信度不小于所述预设门限值,所述预设时段利用所述显示屏被设定的屏幕熄灭时间确定。
- 根据权利要求14所述的参数的调整方法,其特征在于,还包括:若所述前置摄像头在所述预设时段内拍摄的图像的置信度,均小于所述预设门限值,则控制所述显示屏熄屏。
- 根据权利要求13所述的参数的调整方法,其特征在于,在所述显示屏显示数据时,获取所述前置摄像头响应第一指令,以初始曝光参数运行拍摄的图像之前,还包括:利用所述环境光亮度,确定所述预设门限值。
- 根据权利要求16所述的参数的调整方法,其特征在于,所述利用所述环境光亮度,确定所述预设门限值,包括:基于环境光亮度与门限值的对应关系,确定与所述环境光亮度匹配的门限值。
- 根据权利要求17所述的参数的调整方法,其特征在于,所述环境光亮度和门限值的对应关系包括:多个环境光亮度区间,以及每个所述环境光亮度区间匹配的门限值。
- 根据权利要求17所述的参数的调整方法,其特征在于,所述环境光亮度和门限值的对应关系中,强光区间匹配的门限值和弱光区间匹配的门限值,小于正常光区间匹配的门限值。
- 根据权利要求13至19中任意一项所述的参数的调整方法,其特征在于,所述控制所述显示屏不熄屏之前,还包括:获取所述图像的人脸检测结果;确定所述人脸检测结果为检测到人脸。
- 一种显示的控制方法,其特征在于,应用于电子设备,所述电子设备包括前置摄像头和显示屏,所述显示的控制方法包括:在所述显示屏显示数据时,获取所述前置摄像头拍摄的图像;根据图像的图像数据和样本特征库的比对结果,对所述图像配置置信度,所述置信度用于表征所述图像中人眼注视显示屏的概率;若所述图像的置信度不小于预设门限值,则确定所述图像中的人眼注视显示屏,所述预设门限值与环境光亮度匹配。
- 根据权利要求21所述的显示的控制方法,其特征在于,所述若所述图像的置信度不小于预设门限值,则确定所述图像中人眼注视显示屏之后,还包括:控制所述显示屏不熄屏。
- 根据权利要求22所述的显示的控制方法,其特征在于,所述控制所述显示屏不熄屏之前,还包括:确定所述前置摄像头在预设时段内拍摄的多帧图像中,存在一帧图像中的人眼注视显示屏,所述预设时段利用所述显示屏被设定的屏幕熄灭时间确定。
- 根据权利要求21至23中任意一项所述的显示的控制方法,其特征在于,还包括:若所述图像的置信度小于所述预设门限值,则确定所述图像中人眼未注视所述显示屏。
- 根据权利要求24所述的显示的控制方法,其特征在于,所述若所述图像的置信度小于所述预设门限值,则确定所述图像中人眼未注视所述显示屏之后,还包括:若所述前置摄像头在预设时段内拍摄的多帧图像中的人眼,均未注视所述显示屏,则控制所述显示屏熄屏,所述预设时段利用所述显示屏被设定的屏幕熄灭时间确定。
- 根据权利要求21至25任意一项所述的显示的控制方法,其特征在于,所述若所述图像的置信度不小于预设门限值,则确定所述图像中的人眼注视显示屏之前,还包括:利用所述环境光亮度,确定所述预设门限值。
- 根据权利要求26所述的显示的控制方法,其特征在于,所述利用所述环境光亮度,确定所述预设门限值,包括:基于环境光亮度与门限值的对应关系,确定与所述环境光亮度匹配的门限值。
- 根据权利要求27所述的显示的控制方法,其特征在于,所述环境光亮度和门限值的对应关系包括:多个环境光亮度区间,以及每个所述环境光亮度区间匹配的门限值。
- 根据权利要求27所述的显示的控制方法,其特征在于,所述环境光亮度和门限值的对应关系中,强光区间匹配的门限值和弱光区间匹配的门限值,小于正常光区间匹配的门限值。
- 根据权利要求21至29中任意一项所述的显示的控制方法,其特征在于,所述前置摄像头的初始曝光参数与环境光亮度匹配。
- 根据权利要求30所述的显示的控制方法,其特征在于,所述在所述显示屏显示数据时,获取所述前置摄像头拍摄的图像之前,还包括:基于环境光亮度和曝光参数的对应关系,确定所述初始曝光参数。
- 根据权利要求31所述的显示的控制方法,其特征在于,所述环境光亮度和曝光参数的对应关系的生成方法,包括:获取多组样本图像,一组所述样本图像对应一个环境光亮度,且一组所述样本图像包括多个样本图像,每个所述样本图像对应一个曝光参数;利用多组所述样本图像对应的环境光亮度,以及多组所述样本图像中的人眼注视显示屏的样本图像对应的曝光参数,生成所述环境光亮度和曝光参数的对应关系。
- 根据权利要求31所述的显示的控制方法,其特征在于,所述环境光亮度和曝光参数的对应关系的生成方法,包括:获取多个历史曝光参数调整值,以及每个所述历史曝光参数调整值对应的环境光亮度,所述历史曝光参数调整值满足所述标准亮度要求;利用所述多个历史曝光参数调整值,以及每个所述历史曝光参数调整值对应的环境光亮度,生成所述环境光亮度和曝光参数的对应关系。
- 根据权利要求30至33中任意一项所述的显示的控制方法,其特征在于,所述初始曝光参数包括:曝光时长、模拟增益和数字增益中的至少一个。
- 根据权利要求21至34中任意一项所述的显示的控制方法,其特征在于,所述确定所述图像中的人眼注视显示屏之前,还包括:获取所述图像的人脸检测结果;确定所述人脸检测结果为检测到人脸。
- 一种参数的调整方法,其特征在于,应用于电子设备,所述电子设备包括前置摄像头和显示屏,所述参数的调整方法包括:在所述显示屏显示数据时,获取所述前置摄像头响应第一指令,以初始曝光参数运行拍摄的图像,所述初始曝光参数与环境光亮度匹配;利用所述图像的图像数据,计算所述图像的图像亮度;以所述图像的图像亮度与标准亮度的差值,调整所述摄像头的初始曝光参数,得到曝光参数调整值,所述曝光参数调整值用于所述前置摄像头响应第二指令,以所述曝光参数调整值运行拍摄图像;根据所述图像的图像数据和样本特征库的比对结果,对所述图像配置置信度,所述置信度用于表征所述图像中人眼注视显示屏的概率;若所述图像的置信度不小于预设门限值,则控制所述显示屏不熄屏,所述预设门限值与环境光亮度匹配。
- 根据权利要求36所述的参数的调整方法,其特征在于,在所述显示屏显示数据时,获取所述前置摄像头响应第一指令,以初始曝光参数运行拍摄的图像之前,还包括:利用所述环境光亮度,确定所述初始曝光参数。
- 根据权利要求37所述的参数的调整方法,其特征在于,所述利用所述环境光亮度,确定所述初始曝光参数,包括:基于环境光亮度与曝光参数的对应关系,确定与所述环境光亮度匹配的曝光参数。
- 根据权利要求38所述的参数的调整方法,其特征在于,所述环境光亮度和曝光参数的对应关系的生成方法,包括:获取多组样本图像,一组所述样本图像对应一个环境光亮度,且一组所述样本图像包括多个样本图像,每个所述样本图像对应一个曝光参数;利用多组所述样本图像对应的环境光亮度,以及多组所述样本图像中的人眼注视显示屏的样本图像对应的曝光参数,生成所述环境光亮度和曝光参数的对应关系。
- 根据权利要求38所述的参数的调整方法,其特征在于,所述环境光亮度和曝光参数的对应关系的生成方法,包括:获取多个历史曝光参数调整值,以及每个所述历史曝光参数调整值对应的环境光亮度,每个所述历史曝光参数调整值满足所述标准亮度的要求;利用所述多个历史曝光参数调整值,以及每个所述历史曝光参数调整值对应的环境光亮度,生成所述环境光亮度和曝光参数的对应关系。
- 根据权利要求38所述的参数的调整方法,其特征在于,所述以所述图像的图像亮度与标准亮度的差值,调整所述前置摄像头的初始曝光参数,得到曝光参数调整值之后,还包括:以所述曝光参数调整值更新所述环境光亮度和曝光参数的对应关系。
- 根据权利要求36至41中任意一项所述的参数的调整方法,其特征在于,所述初始曝光参数包括:曝光时长、模拟增益和数字增益中的至少一个。
- 根据权利要求36至41中任意一项所述的参数的调整方法,其特征在于,在所述以所述图像的图像亮度与标准亮度的差值,调整所述前置摄像头的初始曝光参数,得到曝光参数调整值之前,还包括:利用所述环境光亮度,确定所述标准亮度。
- 根据权利要求43所述的参数的调整方法,其特征在于,所述利用所述环境光亮度,确定所述标准亮度,包括:基于环境光亮度与标准亮度的对应关系,确定与所述环境光亮度匹配的标准亮度。
- 根据权利要求44所述的参数的调整方法,其特征在于,所述环境光亮度与标准亮度的对应关系的生成方法,包括:获取多组样本图像,一组所述样本图像对应一个环境光亮度,且一组所述样本图像包括多个样本图像;利用多组所述样本图像对应的环境光亮度,以及多组所述样本图像中的人眼注视显示屏的样本图像的图像亮度,生成所述环境光亮度和标准亮度的对应关系。
- 根据权利要求44所述的参数的调整方法,其特征在于,所述环境光亮度与标准亮度的对应关系包括:多个环境光亮度区间和每个所述环境光亮度区间匹配的标准亮度。
- 根据权利要求36至41中任意一项所述的参数的调整方法,其特征在于,利用所述图像的图像数据,计算所述图像的图像亮度,包括:获取所述图像包括的每个像素点的RGB分量;计算所述图像的每个像素点的RGB分量的均值,所述均值作为所述图像的图像亮度。
- 根据权利要求36所述的参数的调整方法,其特征在于,所述控制所述显示屏不熄屏之前,还包括:确定所述前置摄像头在预设时段内拍摄的图像中,存在一帧图像的置信度不小于所述预设门限值,所述预设时段利用所述显示屏被设定的屏幕熄灭时间确定。
- 根据权利要求48所述的参数的调整方法,其特征在于,还包括:若所述前置摄像头在所述预设时段内拍摄的图像的置信度,均小于所述预设门限值,则控制所述显示屏熄屏。
- 根据权利要求36所述的参数的调整方法,其特征在于,在所述显示屏显示数据时,获取所述前置摄像头响应第一指令,以初始曝光参数运行拍摄的图像之前,还包括:利用所述环境光亮度,确定所述预设门限值。
- 根据权利要求50所述的参数的调整方法,其特征在于,所述利用所述环境光亮度,确定所述预设门限值,包括:基于环境光亮度与门限值的对应关系,确定与所述环境光亮度匹配的门限值。
- 根据权利要求51所述的参数的调整方法,其特征在于,所述环境光亮度和门限值的对应关系包括:多个环境光亮度区间,以及每个所述环境光亮度区间匹配的门限值。
- 根据权利要求51所述的参数的调整方法,其特征在于,所述环境光亮度和门限值的对应关系中,强光区间匹配的门限值和弱光区间匹配的门限值,小于正常光区间匹配的门限值。
- 根据权利要求36至41中任意一项所述的参数的调整方法,其特征在于,所述控制所述显示屏不熄屏之前,还包括:获取所述图像的人脸检测结果;确定所述人脸检测结果为检测到人脸。
- 一种电子设备,其特征在于,包括:显示屏,用于显示数据;环境光检测器,用于检测环境光,得到环境光亮度;前置摄像头,用于在所述显示屏显示数据时,以初始曝光参数运行拍摄图像;一个或多个处理器;存储器,其上存储有程序;当所述程序被所述一个或多个处理器执行时,使得所述电子设备执行如权利要求1至20任意一项所述的参数的调整方法,或者,如权利要求21至35中任意一项所述的显示的控制方法,如权利要求36至54中任意一项所述的参数的调整方法。
- 一种可读存储介质,其特征在于,其上存储有计算机程序,其中,所述计算机程序被处理器执行时实现如权利要求1至20中任意一项所述的参数的调整方法,或者,如权利要求21至35中任意一项所述的显示的控制方法,如权利要求36至54中任意一项所述 的参数的调整方法。