CN117938996A - Method for detecting ambient light and electronic device

Info

Publication number: CN117938996A
Application number: CN202311848233.2A
Authority: CN (China)
Prior art keywords: ambient light, camera, data, parameter, mode
Legal status: Pending
Original language: Chinese (zh)
Inventors: 李钊 (Li Zhao), 董洁 (Dong Jie), 石永志 (Shi Yongzhi)
Assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd; priority to CN202311848233.2A; publication of CN117938996A

Abstract

The application provides a method for detecting ambient light and an electronic device. The electronic device includes a first camera whose working modes include a camera mode and an ambient light mode. The method includes: when the working mode of the first camera is detected to be the ambient light mode, opening a first communication path between the first camera and a processor, where the first communication path represents the communication path corresponding to the ambient light mode; collecting bare data of the ambient light in the ambient light mode using the first camera, where the bare data represents the initial data of the ambient light collected by the first camera; transmitting the bare data to the processor over the first communication path; and processing the bare data with the processor to obtain a target ambient light parameter. By multiplexing the light-sensing function of the camera, the scheme replaces the scheme of collecting ambient light with an independent ambient light device, which simplifies the whole machine by removing the ambient light device and its matching accessories and circuits.

Description

Method for detecting ambient light and electronic device
Technical Field
The present application relates to the field of electronic devices, and in particular, to a method for detecting ambient light and an electronic device.
Background
Current mobile phones are generally provided with an ambient light device for detecting ambient light, so that the screen brightness can be adjusted automatically according to the detected ambient light. Most ambient light devices are arranged at the top or bottom of the phone, appearing externally as a small hole in the housing called a light receiving hole. The light receiving hole requires various matching components such as a light guide column, a viewing-angle piece and a glass cover plate, and a corresponding processing circuit must also be arranged on the printed circuit board (PCB) of the electronic device. Other ambient light devices are located on the front of the phone, i.e. below the screen, which occupies screen display area, and the light emitted by the screen itself may make the detected ambient light data inaccurate. Still other ambient light devices are arranged on the back of the phone, which affects the layout of all devices on the back: the light receiving hole must not be covered by the user's hand holding the phone, and it must coexist with the rear camera, flashlight and other components. In particular, when a flashlight is present, the light receiving hole may take the brightness of the flashlight's illumination as ambient light, so that in a dark environment the phone misjudges the ambient brightness as very high and automatically raises the screen brightness to an excessive level, degrading the user experience. In yet other devices the light receiving hole is located in the narrow slit area of the screen glass, which limits the light receiving path and imposes higher process requirements.
In short, conventional schemes detect ambient light with a dedicated ambient light device to achieve automatic adjustment of screen brightness, but each arrangement has its own drawbacks and has a large influence on the external layout of the phone and the layout of the internal circuit board.
Disclosure of Invention
The application provides a method for detecting ambient light and an electronic device, which detect ambient light by multiplexing the light-sensing function of a camera, so that the dedicated ambient light device can be removed from the electronic device.
In a first aspect, a method for detecting ambient light is provided. The method is applied to an electronic device that includes a first camera, where the working modes of the first camera include a camera mode and an ambient light mode, and the method includes: when the working mode of the first camera is detected to be the ambient light mode, opening a first communication path between the first camera and a processor, where the first communication path represents the communication path corresponding to the ambient light mode; collecting bare data of the ambient light in the ambient light mode using the first camera, where the bare data represents the initial data of the ambient light collected by the first camera; transmitting the bare data to the processor over the first communication path; and processing the bare data with the processor to obtain a target ambient light parameter.
In the above technical scheme, a camera that has both an image-sensing function and an ambient-light-sensing function is multiplexed: the camera works in the ambient light mode to collect bare data, and the target ambient light parameter is obtained after the bare data is processed. Compared with a traditional ambient light device, the camera has a natural advantage in sensing area and layout position and is less affected by occlusion when sensing light; its light receiving area is larger; the matching components of traditional ambient light devices, such as the light receiving hole and light guide column, are no longer needed; and the complex compensation operations that a traditional ambient light device requires, such as screen-light compensation, are avoided. In addition, at least one camera in the electronic device can work in both the camera mode and the ambient light mode, which is equivalent to superimposing an ambient light detection function on another functional device, so the ambient light sensing device and its matching processing circuit can be removed from the electronic device entirely, avoiding the influence of the traditional ambient light device on the PCB layout, the whole-machine layout and the whole-machine appearance.
With reference to the first aspect, in some implementations of the first aspect, when the working mode of the first camera is detected to be the ambient light mode, switching the communication path between the first camera and the processor to the path corresponding to the ambient light mode may include: configuring the interface of a second communication path between the first camera and the processor from a default inter-integrated circuit (I2C) interface to a general-purpose input/output (GPIO) interface, where the second communication path is the communication path corresponding to the camera mode; and setting the GPIO interface of the second communication path to a high-impedance state. In this implementation, while the first communication path is opened, the other communication path is blocked; that is, the second communication path is blocked by configuring its interface as a GPIO interface and then setting it to the high-impedance state.
With reference to the first aspect, in certain implementations of the first aspect, the method further includes: when the working mode of the first camera is detected to be the camera mode, opening the second communication path, configuring the interface of the first communication path from the default I2C interface to a GPIO interface, and setting the GPIO interface of the first communication path to the high-impedance state. In this implementation, when the working mode of the camera is the camera mode, i.e. when the camera is being invoked by a foreground application, the ambient light detection function is in effect suspended, so the second communication path is used to transmit image data and the first communication path is blocked. While the second communication path is opened, the other communication path is blocked; that is, the first communication path is blocked by configuring its interface as a GPIO interface and then setting it to the high-impedance state.
With reference to the first aspect, in some implementations of the first aspect, the first camera includes a first pixel set and the camera mode corresponds to the first pixel set; the first pixel set includes a second pixel set and the ambient light mode corresponds to the second pixel set; the pixels in the second pixel set are used for sensing both images and light, the remaining pixels of the first pixel set are used for sensing images only, and the pixels of the second pixel set are uniformly distributed within the first pixel set. This implementation specifies the function and distribution of the first camera's pixels: the pixels capable of sensing light are the pixels with the ALS function, which are uniformly distributed among all pixels of the camera; these pixels have both sensing capabilities, sensing images in the camera mode and light in the ambient light mode.
With reference to the first aspect, in some implementations of the first aspect, when processing the bare data with a processor to obtain the target ambient light parameter, the method may include: substituting the bare data into a parameter fitting curve, and calculating to obtain a target ambient light parameter, wherein the parameter fitting curve is obtained by fitting ambient light data acquired under a plurality of known illuminances with the plurality of known illuminances. In this implementation, the ambient light parameters are calculated by means of a parameter fitting curve.
With reference to the first aspect, in certain implementations of the first aspect, the parameter fitting curve includes a first curve and a second curve, the first curve corresponding to a first illuminance interval and the second curve to a second illuminance interval, where the maximum illuminance of the first interval is less than or equal to the minimum illuminance of the second interval. Substituting the bare data into the parameter fitting curve to calculate the target ambient light parameter may then include: when the bare data is judged to correspond to the first illuminance interval, substituting it into the first curve to calculate the target ambient light parameter; or, when the bare data is judged to correspond to the second illuminance interval, substituting it into the second curve to calculate the target ambient light parameter. In this example, separate fitting curves are set for different partitions according to the different characteristics of the illuminance intervals, so each fitted curve, and hence the calculation result, is more accurate.
It should be understood that in this example the parameter fitting curve is not close to a straight line but has noticeable curvature, so the two illuminance intervals can be divided by finding the knee of the curve (the point of highest curvature) and taking the illuminance corresponding to that point as the boundary; curve fitting is then performed separately on the two illuminance intervals to obtain two fitted curves. It should also be appreciated that, following the same logic, the full illuminance range may be divided into three or more finer intervals, each fitted with its own curve.
With reference to the first aspect, in some implementations of the first aspect, substituting the bare data into the parameter fitting curve to calculate the target ambient light parameter may include: substituting the bare data into the parameter fitting curve to calculate an initial ambient light parameter; and obtaining the target ambient light parameter from the calibration coefficient of the first camera and the initial ambient light parameter. In this example, introducing the calibration coefficient eliminates the deviation caused by differences between individual devices, further improving the accuracy of the detection result.
In this example, when there is only one parameter fitting curve, bare data is substituted into the parameter fitting curve, and the initial ambient light parameter is calculated; or when the parameter fitting curve comprises the first curve and the second curve, substituting the bare data into the first curve or the second curve to calculate the initial ambient light parameter. It should be further understood that when the parameter fitting curves include three or more fitting curves, the fitting curves corresponding to the bare data are determined and then substituted, so as to obtain the initial ambient light parameters, which will not be described again.
With reference to the first aspect, in certain implementations of the first aspect, the method further includes: acquiring a first ambient light parameter corresponding to a standard camera under a first illuminance and a second ambient light parameter corresponding to the first camera under the same first illuminance, where the first ambient light parameter is obtained from the parameter fitting curve and the bare data collected by the standard camera under the first illuminance, and the second ambient light parameter is obtained from the parameter fitting curve and the bare data collected by the first camera under the first illuminance; and determining the calibration coefficient of the first camera from the first and second ambient light parameters. In this example, the calibration coefficient is determined from the difference between the ambient light parameters (not the bare data) output by the standard (reference) camera and the first camera after the fitting-curve calculation, which makes the coefficient more accurate. The reason is that the first and second ambient light parameters, having already passed through the fitting-curve calculation, are more accurate than the bare data; calibrating after the fitting-curve calculation rather than at the bare-data stage, i.e. deriving the coefficient from the most refined data available before calibration, therefore yields a more accurate calibration coefficient.
In this example, assuming that the parameter fitting curve is one, the first ambient light parameter is obtained after the bare data collected by the standard camera under the first illuminance is calculated by the parameter fitting curve, and the second ambient light parameter is obtained after the bare data collected by the first camera under the first illuminance is calculated by the parameter fitting curve; assuming that the parameter fitting curves are multiple, the first ambient light parameter is obtained by firstly determining an illumination interval to which the bare data collected by the standard camera under the first illumination belongs, then selecting a corresponding parameter fitting curve, and then calculating the collected bare data by using the selected parameter fitting curve; the second ambient light parameter is obtained by firstly determining an illumination interval of the bare data collected by the first camera under the first illumination, then selecting a corresponding parameter fitting curve, and then calculating the collected bare data by using the selected parameter fitting curve.
With reference to the first aspect, in certain implementations of the first aspect, the method further includes: determining the weight of each pixel point in the second pixel set according to the contribution degree of each pixel point in the second pixel set of the first camera to the overall output data of the first camera, thereby obtaining a weight matrix corresponding to the second pixel set. Collecting bare data of the ambient light in the ambient light mode using the first camera may then include: collecting first data with each pixel point in the second pixel set of the first camera, the first data comprising the ambient light data collected by each pixel point; and multiplying the first data by the weight matrix to obtain the bare data. In this implementation, the degree of influence of each pixel point on the collection result is characterized at fine granularity, so the bare data remains relatively accurate even in the ultra-low and ultra-high illuminance intervals, and the situation observed in actual tests where the bare data at ultra-low illuminances such as 1 lux, 3 lux and 5 lux are all equal no longer occurs.
In a second aspect, there is provided an apparatus for performing detection of ambient light, the apparatus comprising units for performing any one of the methods of the first aspect, the units being implemented in software and/or hardware.
In a third aspect, there is provided an electronic device comprising a memory, one or more processors, and a computer program stored in the memory and executable on the processors; when the one or more processors execute the computer program, the electronic device is enabled to carry out any one of the methods of the first aspect.
In a fourth aspect, there is provided a chip comprising a processor for reading and executing a computer program stored in a memory, which when executed by the processor enables an electronic device in which the chip is located to carry out any one of the methods of the first aspect.
Optionally, the chip further comprises a memory, the memory being electrically connected to the processor.
Optionally, the chip may further comprise a communication interface.
In a fifth aspect, a computer readable storage medium is provided, the computer readable storage medium storing a computer program, the computer program being capable of implementing any one of the methods of the first aspect when executed by an electronic device.
In a sixth aspect, a computer program product is provided, the computer program product comprising a computer program capable of implementing any one of the methods of the first aspect when the computer program is executed by an electronic device.
Drawings
Fig. 1 is a schematic diagram of a conventional under-screen positive-emission ambient light device.
Fig. 2 is a schematic diagram of a distribution of pixels of a dual-mode camera according to an embodiment of the present application.
Fig. 3 is a schematic architecture diagram of a hardware circuit for ambient light detection in accordance with an embodiment of the present application.
Fig. 4 is a schematic flow chart of a method for switching the working mode of a camera according to an embodiment of the application.
Fig. 5 is a schematic flow chart of a method of detecting ambient light according to an embodiment of the application.
Fig. 6 is a graph comparing test results of the ambient light detection scheme of the embodiment of the present application with those of the conventional ambient light detection scheme under the same conditions.
Fig. 7 is a graph showing the comparison of the test results of the ambient light detection scheme according to the embodiment of the present application with the ambient light detection schemes of other manufacturers under the same conditions, respectively.
Fig. 8 is a schematic diagram of illumination simulation intensity according to an embodiment of the present application.
Fig. 9 is a schematic diagram comparing the layout of the whole machine with the conventional scheme.
Fig. 10 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described below with reference to the accompanying drawings.
In conventional ambient light detection schemes, a physical ambient light device must be arranged on the electronic device. Within the whole electronic device system this device is an independent electronic component, so its layout on the PCB and its influence on the whole-machine architecture must be considered. Whatever position is chosen, the common layouts all have defects such as light leakage or awkward placement. Table 1 gives examples of the layouts and defects of several common ambient light devices.
TABLE 1
As shown in table 1, the narrow-slit ambient light scheme imposes more process requirements and is difficult to debug, while the hole-based ambient light scheme requires a light guide column and carries a risk of blocked holes. The under-screen front-facing ambient light device sits in the non-active area of the screen (outside the active area, AA); because the non-AA area is small, laying out an ambient light device there is difficult, and since the device is affected by screen light, light-leak compensation must be considered. An under-screen ambient light device inside the AA must use a complex algorithm to compensate for the light leakage, and since the whole device sits in the AA area it affects the touch function and the display of the screen to some extent. The problems of other arrangements of ambient light devices are not listed one by one.
Fig. 1 is a schematic diagram of a conventional under-screen front-facing ambient light device. As shown in (a) of fig. 1, in the under-screen front-facing ambient light scheme a light receiving hole is arranged above the ambient light sensor (ALS); at the light receiving hole are the middle frame, a light equalizing film and a copper foil layer (SCF); and above the light receiving hole are a back plane (BP) providing support, a polyimide film (PF) providing encapsulation, a thin film transistor (TFT) layer carrying the display driving circuit, an OLED (organic light-emitting diode) layer with its thin film encapsulation (TFE), a touch sensor (TP sensor), a polarizer, an optically clear adhesive (OCA) layer and a glass cover plate (cover glass, CG).
Ambient light enters the light receiving hole from above the CG. However, the screen itself also emits light. As shown in (b) of fig. 1, light emitted by the screen propagates between the panel layers (all layers between the CG and the SCF copper foil), and when it passes the light receiving hole some of it leaks onto the ALS device. The ambient light data collected by the ALS device therefore actually contains two parts: the real ambient light entering vertically through the CG glass, and the light leakage of the screen light. This makes the detected ambient light data inaccurate, and performing light-leak compensation, i.e. removing the influence of the screen light, requires additional matching processing methods and algorithms, which is complex and still inaccurate.
Besides the inaccuracy caused by screen lighting in the data collection of the above ambient light devices, all ambient light devices share the problem of affecting the PCB layout and the whole-machine layout. With ever-higher requirements on the refinement of the whole-machine appearance, the layout space for the ambient light device is being compressed smaller and smaller, so an ambient light scheme with better collection performance and less influence on the PCB layout and whole-machine layout is needed.
To address the above problems, the application provides a new scheme for detecting ambient light: the light-sensing function of the camera device is multiplexed to replace the scheme of collecting ambient light with a physical ambient light device, so one physical device can be removed from the whole machine, i.e. the ambient light device is eliminated outright. Moreover, because the scheme multiplexes the camera device, the camera's layout position gives an inherent advantage for sensing ambient light: cameras are already laid out at the most suitable position of the whole machine to guarantee shooting performance. The scheme needs no independent light guide column, no additional light equalizing film and no separate ink debugging, is not affected by screen light leakage, and is not forced to use a tiny light receiving hole, so it has the advantages of low cost, high reliability and good consistency of effect. In addition, it has no impact on the PCB layout or the whole-machine layout.
Fig. 2 is a schematic diagram of the distribution of pixels of a dual-mode camera according to an embodiment of the present application. As shown in fig. 2, the camera includes X × Y pixels that can sense images, of which X0 × Y0 pixels can sense light in addition to images. That is, the camera includes X0 × Y0 pixels with the ALS function, uniformly distributed among all the pixels (the X × Y pixels).
Because the area of the camera is tens or even hundreds of times larger than that of a traditional light receiving hole, and the ALS-function pixels are uniformly distributed among all the pixels of the camera, the collected ambient light data can be more accurate.
When the camera is used for shooting, all pixel points are used for collecting images, and when the camera is used for detecting ambient light, all ALS pixel points can be used for detecting the ambient light. That is, the operation modes of the camera shown in fig. 2 include a camera mode and an ambient light mode, wherein an image is collected during the camera mode, and ambient light data is collected during the ambient light mode.
In one example, X is 8192, Y is 6144, X0 is 16 and Y0 is 12; that is, 16 × 12 = 192 light-sensing pixels are uniformly distributed among the 8192 × 6144 total pixels.
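As an illustration of such a uniform distribution, the following sketch computes grid positions for the ALS pixels using the example numbers above. The placement rule (one ALS pixel at the center of each grid cell) is an assumption made for illustration only; the real layout is fixed in the sensor hardware.

```python
# Sketch: index positions of ALS-capable pixels distributed uniformly
# across the full pixel array. The center-of-cell rule is illustrative.

X, Y = 8192, 6144     # full pixel array (first pixel set)
X0, Y0 = 16, 12       # ALS-capable pixels per axis (second pixel set)

def als_pixel_positions(x, y, x0, y0):
    """Place x0 * y0 ALS pixels on an even grid over an x * y array."""
    step_x, step_y = x // x0, y // y0
    return [(i * step_x + step_x // 2, j * step_y + step_y // 2)
            for i in range(x0) for j in range(y0)]

positions = als_pixel_positions(X, Y, X0, Y0)
assert len(positions) == 192   # 16 * 12 ALS pixels in total
```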
It should be understood that the camera shown in fig. 2 is a recently developed camera with both image-sensing and light-sensing functions. The application builds on such a camera, using its light-sensing function to replace the light-sensing function of the original ambient light device. It should also be understood that the solution of the embodiments of the application must be implemented on a camera with a light-sensing function; without such a camera the solution cannot be used.
Fig. 3 is a schematic architecture diagram of a hardware circuit for ambient light detection according to an embodiment of the present application. As shown in fig. 3, a processor module, i.e. a system on chip (SOC), is used to process the various data from the camera and may send control instructions to control the operation of the camera; the camera is used to collect images or ambient light.
The SOC comprises a sensing processor (sensor hub, SH) and an application processor (AP). In the camera mode the camera transmits the collected image data to the AP for processing, and in the ambient light mode the camera transmits the collected ambient light data to the SH for processing. The interfaces of the camera include a serial clock interface (SCL) and a serial data interface (SDA). The camera's SCL is connected to the serial clock interface SCL0 of the SH and to the serial clock interface SCL1 of the AP, and the camera's SDA is connected to the serial data interface SDA0 of the SH and to the serial data interface SDA1 of the AP.
In the camera mode, the camera and the AP communicate through two groups of interfaces respectively connected, and in the ambient light mode, the camera and the SH communicate through two groups of interfaces respectively connected. The path between the AP and the camera can be regarded as one example of the second communication path hereinafter, and the path between the SH and the camera can be regarded as one example of the first communication path hereinafter.
The interfaces in the electronic device can each work in one of two modes, inter-integrated circuit (I2C) mode and general-purpose input/output (GPIO) mode, and the interfaces of the two communication paths work in I2C mode by default. An interface in I2C mode cannot be set to a high-impedance state (a state that blocks signal transmission, or can be understood as preventing signal reflection). Therefore, in the embodiments of the application, when one of the paths needs to be set to the high-impedance state, the interfaces on that path are first reconfigured from the default I2C mode to GPIO mode and then set to the high-impedance state.
Fig. 4 is a schematic flow chart of a method for switching the working mode of a camera according to an embodiment of the application. The steps shown in fig. 4 are described below. Fig. 4 can be considered as a method of switching the operation mode of the camera in the circuit shown in fig. 3.
S401, identifying the working mode of the camera.
It should be understood that the camera of an electronic device is in the camera mode by default when invoked by a camera application. The camera therefore works in the camera mode whenever it is invoked by the camera application or by another third-party application that needs photographing or code scanning, and works in the ambient light mode at all other times; that is, the ambient light mode is simply the working mode outside the camera mode, and no extra mode judgment is required.
In one implementation, when the camera is detected to be called by a foreground application, determining the working mode of the camera as a camera mode; or when the camera is detected not to be called by the foreground application, determining the working mode of the camera as an ambient light mode. In such an implementation, the working mode of the camera is determined based on whether the camera is invoked by a foreground application, which is convenient and easy to implement.
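As an illustration, a minimal sketch of this judgment follows; the boolean flag is a hypothetical stand-in for whatever query the camera service actually exposes.

```python
# Sketch of the mode decision in S401 below. The flag is a hypothetical
# stand-in for the camera service's "camera in use" query.

def camera_working_mode(held_by_foreground_app: bool) -> str:
    # Camera mode whenever a foreground app (camera, code scanning, etc.)
    # has invoked the camera; ambient light mode at all other times.
    return "camera" if held_by_foreground_app else "ambient_light"
```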
S402, when the working mode of the camera is the ambient light mode, configure the interface on the AP side from the default I2C interface to a GPIO interface.
S403, set the GPIO interface on the AP side to the high-impedance state.
That is, in the ambient light mode, steps S402 and S403 block the communication path between the camera and the AP.
S404, the camera works in an ambient light mode, acquires ambient light data and transmits the ambient light data to the SH processor.
S405, when the working mode of the camera is the camera mode, configure the interface on the SH side from the default I2C interface to a GPIO interface.
S406, set the GPIO interface on the SH side to the high-impedance state.
That is, in the camera mode, steps S405 and S406 block the communication path between the camera and the SH.
S407, the camera works in a camera mode, acquires image data and transmits the image data to the AP processor.
Fig. 4 gives an example of a switching scheme of the operation mode.
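To make the flow concrete, the following is a minimal sketch of the switching routine of fig. 4. The pin names follow fig. 3; the two helper functions are hypothetical stand-ins for the SoC's pin-multiplexing and pad-control drivers, not a real platform API, and only the processor-side pins are ever reconfigured (the camera-side interface is needed in both modes).

```python
# Sketch of S402-S407: block the unused path by remuxing its
# processor-side pins from I2C to GPIO and tri-stating them.

AP_PINS = ("SCL1", "SDA1")   # processor side of the second path (camera mode)
SH_PINS = ("SCL0", "SDA0")   # processor side of the first path (ambient light mode)

def configure_pin(pin: str, mode: str) -> None:   # stub for the pinmux driver
    print(f"pinmux: {pin} -> {mode}")

def set_high_impedance(pin: str) -> None:         # stub for the pad control
    print(f"pad: {pin} -> high impedance")

def block_path(pins) -> None:
    for pin in pins:
        configure_pin(pin, "GPIO")    # default I2C -> GPIO first...
        set_high_impedance(pin)       # ...then tri-state to block the path

def open_path(pins) -> None:
    for pin in pins:
        configure_pin(pin, "I2C")     # restore the default I2C function

def switch_mode(mode: str) -> None:
    if mode == "ambient_light":
        open_path(SH_PINS)    # first path carries bare data to the SH
        block_path(AP_PINS)   # S402, S403: block the path to the AP
    else:
        open_path(AP_PINS)    # second path carries image data to the AP
        block_path(SH_PINS)   # S405, S406: block the path to the SH
```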
Fig. 5 is a schematic flow chart of a method of detecting ambient light according to an embodiment of the application. The steps shown in fig. 5 are described below. The scheme can be applied to electronic devices with cameras, such as mobile phones and tablet computers, and can be used to optimize electronic devices currently fitted with an ambient light device, since the ambient light device can then be removed outright.
The electronic device comprises a first camera, and the working mode of the first camera comprises a camera mode and an ambient light mode. The first camera may be a front camera or a rear camera. The first camera may be the camera shown in fig. 2 or fig. 3, and the first camera may be the camera of fig. 4.
S501, when the working mode of the first camera is detected to be the ambient light mode, open a first communication path between the first camera and the processor.
The first communication path is used to represent a communication path corresponding to an ambient light pattern.
The processor here may be the SOC of fig. 3, and the first communication path may be a path between the camera and the SH processor of fig. 3.
In one implementation, step S501 may include: configuring the interface of a second communication path between the first camera and the processor from the default I2C interface to a GPIO interface, where the second communication path is the communication path corresponding to the camera mode; and setting the GPIO interface of the second communication path to the high-impedance state. In this implementation, while the first communication path is opened, the other path is blocked: the interface of the second communication path is configured as a GPIO interface and then set to the high-impedance state. The second communication path may be the path between the camera and the AP in fig. 3. Configuring the interface of the second communication path from the default I2C interface to a GPIO interface refers to configuring the processor-side interface of that path, not the camera-side interface: because the camera needs its own interface in both the ambient light mode and the camera mode, the camera-side interface is never set to a high-impedance state. Likewise, setting the GPIO interface of the second communication path to the high-impedance state refers to the processor-side interface only. Steps S402 and S403 in fig. 4 can be regarded as an example of this implementation.
In another implementation, the method further includes: when the working mode of the first camera is detected to be the camera mode, opening the second communication path, configuring the interface of the first communication path from the default I2C interface to a GPIO interface, and setting the GPIO interface of the first communication path to the high-impedance state. In this implementation, when the working mode of the camera is the camera mode, i.e. when the camera is being invoked by a foreground application, the ambient light detection function is in effect suspended, so the second communication path transmits image data and the first communication path is blocked: its interface is configured as a GPIO interface and then set to the high-impedance state. The first communication path may be the path between the camera and the SH in fig. 3. As above, configuring and tri-stating the interface of the first communication path refers to the processor-side interface of that path only, not the camera-side interface, which is needed in both modes and is never set to a high-impedance state. Steps S405 and S406 in fig. 4 can be regarded as an example of this implementation.
In one implementation, the first camera includes a first pixel set and the camera mode corresponds to the first pixel set; the first pixel set includes a second pixel set and the ambient light mode corresponds to the second pixel set; the pixels in the second pixel set sense both images and light, the remaining pixels of the first pixel set sense images only, and the pixels of the second pixel set are uniformly distributed within the first pixel set. This implementation specifies the function and distribution of the first camera's pixels: the light-sensing pixels are the pixels with the ALS function, uniformly distributed among all pixels of the camera, which sense images in the camera mode and light in the ambient light mode. The X × Y pixels in fig. 2 can be regarded as an example of the first pixel set, and the X0 × Y0 pixels as an example of the second pixel set.
S502, acquiring bare data of ambient light in an ambient light mode by using a first camera.
The bare data is used to represent initial data of the ambient light collected by the first camera.
When the camera is irradiated by an external light source, the ALS-function pixel points of the camera sense the light, and the camera can directly output the bare data, namely the rawdata.
It should be understood that how the camera internally converts the sensed light into bare data is not a concern of the embodiments of the application; that conversion is developed by the camera manufacturer, and the embodiments of the application only need access to the bare data.
It will be appreciated that the bare data usually has no direct physical meaning, so some processing is required to obtain the real ambient light parameter, i.e. the illuminance of the ambient light.
In one implementation, the data collected by all the ALS pixel points in the camera may be preprocessed to obtain the bare data. The preprocessing comprises constructing the data collected by all the ALS pixel points into an initial data set and multiplying the initial data set by a weight matrix to obtain the bare data. The weight matrix is determined by evaluating, after an illumination test of the camera, the contribution degree of each ALS pixel point to the overall light sensing. It can be understood that this implementation mainly addresses the problem that different illuminances cannot be distinguished in the ultra-low illuminance region, i.e. the problem in table 2 below that the camera cannot sense illuminances below 5 lux accurately; after preprocessing with the weight matrix, this region can also be sensed accurately. The related content is described further below and is not repeated here.
S503, the bare data is sent to the processor by using the first communication path.
After the bare data is collected, the bare data may be sent to the processor via the first communication path for subsequent processing by the processor.
S504, processing the bare data by using a processor to obtain the target ambient light parameter.
After receiving the bare data, the processor performs some processing to obtain the target ambient light parameter, i.e. the ambient light illuminance corresponding to the bare data.
The mapping curve between the bare data and the actual ambient light parameter (a mathematical expression, equation or calculation formula) can be obtained by curve fitting on test data collected under known illuminances. After the curve is fitted, collected bare data can be substituted into the fitted curve to calculate the actual illuminance corresponding to that bare data.
In one implementation, the bare data of the camera can be collected under illumination from different light sources, the bare data at the same illuminance point averaged and taken as the independent variable, and the target illuminance LUX taken as the dependent variable for curve fitting. This implementation in effect applies a degree of processing to the bare data, eliminating the differences between different light sources at the same illuminance.
The different light sources here mainly differ in color temperature. In one example, the different light sources may include cool white light and yellow light. In another example, the different light sources may include illuminant A (standard illuminant A, the light emitted by a full radiator at 2856 K), illuminant C (standard illuminant C, average daylight with a correlated color temperature of about 6774 K, approximating daylight under an overcast sky) and illuminant D (standard illuminant D65, daylight with a correlated color temperature of about 6504 K). It will be appreciated that other combinations of light sources may be used without limitation; for example, the light sources may also include daylight, lamplight and moonlight, or standard illuminant B (direct sunlight with a correlated color temperature of about 4874 K). The more kinds of light source are included, the more accurate the average used as the independent variable becomes. It should also be appreciated that the selection of light sources should take the usage scenario of the electronic device into account: a mobile phone is carried into many scenarios, whereas for an electronic device used mainly indoors, such as a notebook computer, the weight given to direct outdoor sunlight may be reduced. That is, when averaging the bare data of different light sources at the same illuminance point, screening may be applied before averaging, selecting only the bare data under the light sources of the device's common scenarios.
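The averaging-and-fitting procedure described above can be sketched as follows. The measurement arrays are illustrative placeholders, not the patent's bench data; only the procedure (average over light sources at each known illuminance, then fit a linear curve) is shown.

```python
import numpy as np

# Sketch of the fitting step; all numeric values are placeholders.
known_lux = np.array([30.0, 100.0, 300.0, 600.0, 1000.0, 2000.0])
# raw_by_source[i, j]: bare data at known_lux[i] under light source j
# (columns: illuminant A, illuminant C, illuminant D65)
raw_by_source = np.array([
    [133.0, 132.0, 134.0],
    [140.0, 139.0, 141.0],
    [159.0, 160.0, 158.0],
    [188.0, 189.0, 187.0],
    [227.0, 228.0, 226.0],
    [325.0, 326.0, 324.0],
])

mean_raw = raw_by_source.mean(axis=1)   # average out the light-source differences
slope, intercept = np.polyfit(mean_raw, known_lux, deg=1)
# The patent's own full-range fit has this same linear form:
#   LUX = 10.238 * rawdata - 1327.8
print(f"fitted curve: LUX = {slope:.3f} * rawdata + {intercept:.3f}")
```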
In one implementation, step S504 may include: substituting the bare data into a parameter fitting curve and calculating the target ambient light parameter, where the parameter fitting curve is obtained by fitting ambient light data collected under a plurality of known illuminances (lux) against those known illuminances. In this implementation, the ambient light parameter is calculated by means of a parameter fitting curve.
Table 2 shows a set of experimental data of an embodiment of the application, collected under the default gain and exposure time. The data were collected under three light sources: illuminant A, illuminant C and illuminant D. The point lux in table 2 is the known illuminance; the illuminometer report value is the average at the same point lux over the different light sources; rawdata is the bare data corresponding to each point lux, obtained by averaging the bare data collected under the different light sources at that point. The post-calibration lux report value is the value obtained by substituting rawdata into the fitted curve, and can be understood as the target ambient light parameter (or as the initial ambient light parameter below). The post-calibration accuracy is the deviation of the post-calibration lux report value from the illuminometer report value.
The parameter fitting curve used in table 2 is LUX = (10.238 × rawdata − 1327.8) × calibration coefficient, where the calibration coefficient is determined from the numerical difference between the current camera and a standard camera. It should be understood that the coefficients of the parameter fitting curve are not limiting: they were obtained by fitting the test data in table 2, i.e. the curve was fitted using rawdata and the illuminometer values, after which each rawdata value can be substituted back into the curve to obtain the calibrated lux value. In other words, the LUX in the parameter fitting curve is the post-calibration lux report value.
TABLE 2
As can be seen from table 2, although the fitting accuracy is relatively high for most points lux, the accuracy in the low-illuminance interval is poor; for example, the post-calibration deviation exceeds 30% at both 30 lux and 40 lux. For points at or below 5 lux the post-calibration accuracy is expressed as an absolute error; although it lies within the range acceptable to the human eye, the absolute error is large, and table 2 shows that rawdata does not change below 5 lux, so the post-calibration lux report value does not change either. It can be understood that the lower sensing limit of this camera is 3 lux, and any illuminance below 3 lux is recognized as 3 lux.
In practical applications, the acceptable range of post-calibration accuracy may be determined according to the comfort of the human eye; for example, the accuracy may be required to be within ±30%, i.e. [-30%, +30%].
To further improve the precision, the problem in table 2 that illuminances below 5 lux cannot be sensed accurately is addressed by preprocessing with a weight matrix.
In one implementation, the method further includes: determining the weight of each pixel point in the second pixel set according to the contribution degree of each pixel point in the second pixel set of the first camera to the overall output data of the first camera, thereby obtaining a weight matrix corresponding to the second pixel set. Collecting bare data of the ambient light in the ambient light mode using the first camera may then include: collecting first data with each pixel point in the second pixel set of the first camera, the first data comprising the ambient light data collected by each pixel point, and multiplying the first data by the weight matrix to obtain the bare data. In this implementation, the degree of influence of each pixel point on the collection result is characterized at fine granularity, so the bare data remains relatively accurate even in the ultra-low and ultra-high illuminance intervals, and the situation in table 2 where the bare data at 1 lux, 3 lux and 5 lux are all equal (all 138.06) is avoided. Experimental tests show that after the weight matrix is introduced to obtain the bare data in this implementation, the bare data at 1 lux, 3 lux and 5 lux in table 2 become 16.4617, 16.7128 and 16.8567 respectively, a trend that matches the actual illuminance trend much more closely.
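A minimal sketch of this preprocessing follows. The weight values and per-pixel readings are placeholders; the real weights would be derived offline from the contribution-degree evaluation described above.

```python
import numpy as np

# Sketch of the weight-matrix preprocessing; all values are placeholders.
X0, Y0 = 16, 12                                   # ALS pixel grid (fig. 2 example)

rng = np.random.default_rng(0)
first_data = rng.uniform(100.0, 200.0, (X0, Y0))  # placeholder per-pixel readings
weights = np.full((X0, Y0), 1.0 / (X0 * Y0))      # placeholder weight matrix

# Element-wise weighting, then summation, yields the single bare-data
# value sent to the processor over the first communication path.
bare_data = float(np.sum(first_data * weights))
print(f"bare data = {bare_data:.4f}")
```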
Fig. 8 is a schematic diagram of illumination simulation intensity according to an embodiment of the application. As shown in (a) of fig. 8, under the same illumination intensity the light-sensing results at different pixel points differ; the horizontal axis is the transverse coordinate relative to the center point and the vertical axis is the longitudinal coordinate relative to the center point. The pixel points may be distributed uniformly in concentric rings around the center point, or uniformly in rows and columns as squares or rectangles, as in fig. 2. As shown in (b) of fig. 8, a bar chart of the light-sensing data at different distances from the center point shows that the contribution degree does not increase strictly linearly with distance.
As can be seen from fig. 8, constructing the weight matrix by evaluating each pixel point's contribution to the output, as in the above implementation, is more accurate.
It should be further noted that, besides the above implementation that determines the weights from each pixel's contribution to the light-sensing result, the weights could also be set relative to a center point. That way of setting weights, however, suits the traditional ambient light device rather than the camera of the application: the light receiving hole of a traditional ambient light device is very small, all light-sensing points are close to the center point, and sensing is more accurate nearer the center. The camera used in the application, by contrast, has a relatively large area with relatively uniform light sensing and no privileged center point, so a center-point-based weight matrix would not be accurate enough, although it would still improve accuracy compared with using no weight matrix at all.
To further improve the precision, the curve can also be fitted piecewise, i.e. a separate curve is fitted for each illuminance interval to match the different characteristics of different intervals.
In one example, the parameter fitting curve may include a first curve corresponding to a first illuminance interval and a second curve corresponding to a second illuminance interval, where the maximum illuminance of the first interval is less than or equal to the minimum illuminance of the second interval. Substituting the bare data into the parameter fitting curve to calculate the target ambient light parameter then includes: when the bare data is judged to correspond to the first illuminance interval, substituting it into the first curve to calculate the target ambient light parameter; or, when the bare data is judged to correspond to the second illuminance interval, substituting it into the second curve to calculate the target ambient light parameter. In this example, separate fitting curves are set for different partitions according to the different characteristics of the illuminance intervals, so each fitted curve, and hence the calculation result, is more accurate.
It should be understood that in this example the parameter fitting curve is not close to a straight line but has noticeable curvature, so the two illuminance intervals can be divided by finding the knee of the curve (the point of highest curvature) and taking the illuminance corresponding to that point as the boundary; curve fitting is then performed separately on the two illuminance intervals to obtain two fitted curves. It should also be appreciated that, following the same logic, the full illuminance range may be divided into three or more finer intervals, each fitted with its own curve.
In one example, the first illuminance interval is less than or equal to 600 LUX and the first curve is LUX = 1.6466 × bare data − 218.7; the second illuminance interval is greater than 600 LUX and the second curve is LUX = 10.238 × bare data − 1327.8. It should be understood that the 600 lux boundary was determined by inspecting a plot of the actual test data in table 2 above, and other values may be chosen. Testing the first and second curves of this example yields the data shown in table 3. The illuminometer report value in table 3 is as described for table 2; the bare data in table 3 corresponds to rawdata in table 2, i.e. the initial data collected by the camera; the target ambient light parameter is the ambient light parameter output by the electronic device; and the precision is obtained from the difference between the illuminometer report value and the target ambient light parameter. For points at or below 5 lux (illuminometer report values of 4.93 and below in table 3) the precision is expressed as an absolute difference; above 5 lux it is expressed as a relative error, i.e. the ratio of the absolute difference between the target ambient light parameter and the illuminometer report value to the illuminometer report value. The LUX in the fitted curve formulas corresponds to the target ambient light parameter in table 3.
TABLE 3
As can be seen from table 3, the target ambient light parameters below 5 lux are no longer fixed but are closer to the true value (the illuminometer report value), and the precision at 30 lux (illuminometer value 29.58) is within ±30%. All precisions in table 3 are within ±30%.
Note that although the collected bare data may have no physical meaning, the illuminance interval it corresponds to can still be estimated approximately, which is why the illuminance interval corresponding to the bare data can be determined in the above example. In practice, the numeric intervals of bare data corresponding to the two illuminance intervals can also be determined directly; the illuminance interval is then identified by checking which numeric interval the bare data falls into.
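Putting the two fitted curves and the interval judgment together, a minimal sketch follows. RAW_THRESHOLD is an assumed placeholder for the bare-data boundary corresponding to 600 lux, which in practice would be determined from the numeric intervals of the bare data as just described.

```python
# Sketch of the segmented evaluation, using the two fitted curves above.
RAW_THRESHOLD = 500.0   # hypothetical bare-data boundary of the 600 lux point

def target_ambient_light(raw: float, cal_coeff: float = 1.0) -> float:
    if raw <= RAW_THRESHOLD:            # first illuminance interval (<= 600 lux)
        lux = 1.6466 * raw - 218.7      # first curve
    else:                               # second illuminance interval (> 600 lux)
        lux = 10.238 * raw - 1327.8     # second curve
    return lux * cal_coeff              # apply the per-device calibration coefficient
```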
In another example, substituting the bare data into the parameter fitting curve to calculate the target ambient light parameter may include: substituting the bare data into the parameter fitting curve to calculate an initial ambient light parameter; and obtaining the target ambient light parameter from the calibration coefficient of the first camera and the initial ambient light parameter. In this example, introducing the calibration coefficient eliminates the deviation caused by differences between individual devices, further improving the accuracy of the detection result.
In this example, when there is only one parameter fitting curve, the bare data is substituted into that curve to calculate the initial ambient light parameter; when the parameter fitting curve comprises the first curve and the second curve, the bare data is substituted into whichever of the two applies. Likewise, when there are three or more fitted curves, the curve corresponding to the bare data is determined first and the bare data is then substituted to obtain the initial ambient light parameter; this is not repeated here.
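As a sketch of this step, applying the calibration coefficient is a single multiplication (or division, depending on how the coefficient is defined, as discussed below) on the output of the fitted curve; bare_to_lux is the conversion sketched earlier.

    def target_lux(bare: float, k: float) -> float:
        """Fitted-curve output, then the per-device calibration applied."""
        initial = bare_to_lux(bare)  # initial ambient light parameter
        return initial * k           # target ambient light parameter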
In another example, the method further comprises: acquiring a first ambient light parameter corresponding to a standard camera under a first illuminance, and a second ambient light parameter corresponding to the first camera under the same illuminance. The first ambient light parameter is obtained from the parameter fitting curve and the bare data collected by the standard camera under the first illuminance; the second is obtained from the parameter fitting curve and the bare data collected by the first camera under the first illuminance. The calibration coefficient of the first camera is then determined from the first and second ambient light parameters. Deriving the calibration coefficient from the ambient light parameters output by the standard (reference) camera and the first camera after the fitted-curve calculation under the same illuminance, rather than from the bare data, makes the coefficient more accurate. The reason is that the first and second ambient light parameters have already gained accuracy over the bare data by passing through the fitted curve; calibrating after that calculation, on the latest data available before calibration, rather than at the bare-data stage, yields a more accurate coefficient.
In this example, if there is a single parameter fitting curve, the first ambient light parameter is obtained by passing the bare data collected by the standard camera under the first illuminance through that curve, and the second ambient light parameter by passing the bare data collected by the first camera under the first illuminance through the same curve. If there are multiple fitted curves, each parameter is obtained by first determining the illuminance interval to which the collected bare data belongs, selecting the corresponding curve, and then calculating with it.
To illustrate with assumed values: suppose the first illuminance is 100 lux, the bare data collected by the standard camera is 213, the bare data collected by the first camera is 212, and there is a single parameter fitting curve; if the first ambient light parameter calculated from the curve is 96 and the second is 94, the calibration coefficient is 94/96, i.e., about 0.98. Now suppose instead that there are two fitted curves, curve #1 and curve #2, and the bare data of both cameras falls in the illuminance interval of curve #1; if curve #1 yields a first ambient light parameter of 98 and a second of 93, the calibration coefficient is 93/98, about 0.95. It will also be appreciated that in this hypothetical the coefficients could equally be taken as 96/94 and 98/93; the only difference in later use is whether the initial ambient light parameter is multiplied or divided by the coefficient. This hypothetical example is provided to aid understanding of how calibration coefficients are determined in embodiments of the present application; the specific values are not limiting.
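A minimal sketch of determining the coefficient from the fitted outputs, following the convention of Table 5 below (gold-machine output divided by unit-under-test output, so that multiplication corrects the unit under test); bare_to_lux is the conversion sketched earlier, and the numeric comment reuses the text's hypothetical figures.

    def calibration_coefficient(bare_golden: float, bare_dut: float) -> float:
        """k from fitted outputs of the golden unit and the unit under test,
        both measured at the same illuminance (e.g. 100 lux)."""
        first = bare_to_lux(bare_golden)   # first ambient light parameter
        second = bare_to_lux(bare_dut)     # second ambient light parameter
        return first / second

    # In the text's hypothetical, the fitted outputs are 96 (golden) and 94
    # (unit under test): k = 96/94 under this convention; the text's 94/96
    # ~= 0.98 is the reciprocal convention, used with division instead.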
Tables 4 and 5 below compare the results of the conventional method of determining calibration coefficients with the calibration method of the present application.
Table 4
Table 4 shows the calibration scheme of a conventional physical ambient light device: the calibration coefficient is the ratio between the bare data (rawdata) of the standard ambient light device and that of the device under test (400/420 in the example below). Because the calibrated bare data is only afterwards substituted into the fitted curve, the accuracy of the final ambient light parameter suffers; the calibration effect is limited. Moreover, since bare data and the final ambient light parameter are not strictly linearly related, a coefficient derived directly from bare data is accurate only at the bare-data point chosen for calibration, and accuracy is not guaranteed when other bare data is calibrated with the same coefficient.
The gold machine in Table 4 can be understood as a standard machine, i.e., a standard electronic device, whose ambient light device is the standard ambient light device; the machine to be tested is the electronic device under test, containing the ambient light device under test. At the 2500 lux illuminance point, the standard ambient light device in the gold machine outputs bare data of 400. Assuming the conversion formula LUX = 10.238 × bare data − 1327.8, the pre-calibration lux report value is 10.238 × 400 − 1327.8 = 2767, for a pre-calibration accuracy of 10.68%; since the gold machine is the reference, no calibration is needed, the post-calibration lux is still 2767, and the post-calibration accuracy remains 10.68%. For the machine under test at the same 2500 lux point, the ambient light device under test outputs bare data of 420; substitution into the conversion formula gives a pre-calibration lux report value of 2972 and a pre-calibration accuracy of 18.80%. The calibration coefficient is 400/420 (gold-machine bare data / bare data of the machine under test at that point) = 0.95, so the post-calibration lux report value is 10.238 × 420 × 0.95 − 1327.8 = 2767, matching the gold machine's 10.68% accuracy at that point. However, at the 30 lux point the machine under test outputs bare data of 133; substitution gives a pre-calibration lux report value of 33 and a pre-calibration accuracy of 10%. With the same coefficient 400/420 = 0.95, the post-calibration lux report value is 10.238 × 133 × 0.95 − 1327.8 = −34, and the post-calibration accuracy is −213%: accuracy has degraded, i.e., the error has grown. A coefficient determined from bare data cannot adapt to the subsequent fitted-curve calculation, so it is effective only at the illuminance point selected to determine it, and accuracy after calibration cannot be guaranteed at other points.
To address these problems, the calibration scheme adopted by the present application calibrates the parameters output by the fitted curve and outputs the calibrated result as the target ambient light parameter. Table 5 shows camera calibration under the scheme of the embodiments of the present application.
Table 5
As can be seen from Table 5, the calibration coefficient is the ratio between the initial ambient light parameters obtained by passing the bare data of the standard camera and of the camera under test through the same parameter fitting curve (2767/2972 in the example below). In short, the conventional scheme calibrates first and then substitutes into the conversion formula, whereas the present scheme substitutes into the fitted curve first and then calibrates the output value. Calibration is thus unaffected by any subsequent calculation, making it more accurate, because the initial ambient light parameter and the target ambient light parameter can be taken to have a good linear relationship.
The gold machine in Table 5 is again a standard machine, i.e., a standard electronic device, and its camera is the standard camera (an example of the standard camera described above); the machine under test contains the camera under test. At the 2500 lux point, the standard camera in the gold machine outputs bare data of 400. Assuming the fitted-curve formula LUX = 10.238 × bare data − 1327.8, the pre-calibration lux report value is 10.238 × 400 − 1327.8 = 2767 and the pre-calibration accuracy is 10.68%; the gold machine needs no calibration, so the post-calibration lux is 2767 and the accuracy remains 10.68%. For the machine under test at 2500 lux, the camera under test outputs bare data of 420; substitution into the fitted-curve formula gives a pre-calibration lux report value of 2972 and a pre-calibration accuracy of 18.80%. The calibration coefficient is 2767/2972 (the gold machine's pre-calibration lux report value / that of the machine under test at this point) = 0.93, so the post-calibration lux report value is 2972 × 0.93 = 2767, matching the gold machine's 10.68% accuracy. At the 30 lux point, the camera under test outputs bare data of 133; substitution gives a pre-calibration lux report value of 33 and a pre-calibration accuracy of 10%. With the coefficient 0.93, the post-calibration lux report value is 33 × 0.93 = 30.69 and the post-calibration accuracy is 2%: accuracy has improved, i.e., the error has shrunk. Because the coefficient is determined from the ambient light parameters output by the fitted curve, the values before and after calibration are in a good linear relationship, the coefficient adapts to all other illuminance points, and improved accuracy after calibration is assured.
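The arithmetic of Tables 4 and 5 at the 30 lux point can be reproduced in a few lines. This sketch assumes the second fitted curve and keeps intermediate values unrounded, which is why the printed figures differ slightly from the tables' −34 and 30.69 (the tables round the coefficients to 0.95 and 0.93):

    curve = lambda bare: 10.238 * bare - 1327.8  # second fitted curve

    bare_golden, bare_dut = 400, 420  # bare data at the 2500 lux point
    bare_dut_30 = 133                 # unit under test, true illuminance 30 lux

    # Conventional order: calibrate the bare data, then substitute.
    k_bare = bare_golden / bare_dut
    print(curve(bare_dut_30 * k_bare))  # ~ -31 lux: error blows up

    # Order of the present application: substitute, then calibrate the output.
    k_out = curve(bare_golden) / curve(bare_dut)  # 2767.4 / 2972.2 ~= 0.93
    print(curve(bare_dut_30) * k_out)   # ~ 31.5 lux: close to the true 30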
The scheme shown in fig. 5 multiplexes the ambient light sensing function of a camera that has both image sensing and ambient light sensing capabilities: the camera is made to work in ambient light mode to collect bare data, which is then processed to obtain the target ambient light parameter. Compared with a traditional ambient light device, the camera has natural advantages in area and layout position: its light sensing is not affected by occlusion, its light-receiving area is larger, it needs none of the traditional device's supporting parts such as a light receiving hole and light guide column, and it requires no complex compensation of the traditional kind for light emitted by the screen. Moreover, since at least one camera in the electronic device can work in both camera mode and ambient light mode, the ambient light detection function is effectively superimposed on an existing functional device; the ambient light sensing device and its supporting processing circuitry can be removed from the electronic device outright, eliminating the traditional ambient light device's impact on the PCB layout, the whole-machine layout, and the whole-machine appearance.
It should be noted that the scheme of the present application relies on a camera with both functions; it cannot be implemented with a traditional camera that senses only images. Some manufacturers have tried collecting images with a traditional camera and deriving the ambient light through image analysis, but that approach requires the camera to run in the background continuously, the image-analysis algorithm is very complex, and processor occupancy is very high: the background must keep capturing images, running the analysis, and determining the ambient light, which can make other applications stutter, consumes excessive power, and shortens the service life of the electronic device. The present application therefore abandons that approach and instead realizes its scheme through the light sensing function of a camera that combines image sensing and light sensing.
Fig. 6 compares, under identical conditions, the test results of the ambient light detection scheme of the embodiments of the present application with those of a traditional ambient light detection scheme. In the same test environment, the final ambient light parameters output by the electronic device of the embodiments were collected under different illuminances and plotted against the camera's output data as curve B, while the final ambient light parameters output by an electronic device with a traditional physical ambient light device were collected under the same illuminances and plotted against that device's output data as curve A. In fig. 6 the horizontal axis is rawdata and the vertical axis is the target ambient light parameter; curve A is the test curve of the physical ambient light device in the traditional scheme and curve B the test curve of the light-sensing camera in the present scheme. The two curves almost coincide, showing that the present scheme is fully comparable to the traditional one, so the physical ambient light device can be removed while accurate ambient light detection is retained. In addition, curves C and D correspond to +30% and −30% accuracy respectively; as long as curve B lies between curves C and D, the accuracy of the scheme meets the requirement.
Fig. 7 compares, under identical conditions, the test results of the ambient light detection scheme of the embodiments of the present application with those of other manufacturers' schemes. Fig. 7 (a), (b) and (c) compare the present application with manufacturer A under C-light, D-light and A-light environments respectively; fig. 7 (d), (e) and (f) make the same comparisons against manufacturer S. In each illumination environment, the test performance of the present scheme is as good as that of the corresponding physical ambient light device. Comparing (a) with (d), (b) with (e), and (c) with (f) in fig. 7 shows that, in the same illumination environment, the scheme of the embodiments achieves test results equivalent to those of different physical ambient light devices.
Fig. 9 compares the whole-machine layout of the present scheme with the conventional one. Fig. 9 (a) shows the layout of a traditional electronic device: the top contains a very small light receiving hole, smaller than the speaker opening, with the ambient light device beneath it, and the front camera above the screen can only be used for shooting (photographing, video recording, code scanning, and so on); it should be understood that the rear camera of the device in fig. 9 (a) is likewise limited to shooting. Fig. 9 (b) shows the whole-machine layout of the electronic device of the embodiments of the present application: there is no light receiving hole and no ambient light device, and the front camera serves both for shooting and for detecting ambient light. A dual-mode camera (one with both image sensing and light sensing functions) may be provided as a front camera or as a rear camera; that is, the first camera of the embodiments may be a rear camera.
It should also be understood that the speaker holes in fig. 9 (a) and (b) are shown only to distinguish the light receiving hole from other openings; in practice the speaker holes may be absent from the top of the housing or may sit at the top of the screen, facing the same direction as the screen. The light receiving hole in fig. 9 (a) is likewise drawn at the top but may in practice be elsewhere, which is not shown. Fig. 9 mainly illustrates that the scheme of the embodiments attaches ambient light detection to the camera, so that the independent ambient light device and its supporting parts and processing circuitry can be removed from the whole machine.
The method of the embodiments of the present application has mainly been described above with reference to the drawings. It should be understood that, although the steps in the flowcharts of the above embodiments are shown in order, they are not necessarily performed in the order shown. Unless explicitly stated herein, the order of execution is not strictly limited, and the steps may be executed in other orders. Moreover, at least some of the steps may comprise multiple sub-steps or stages, which need not be completed at the same moment but may be executed at different times, and need not proceed sequentially but may run in turn or alternately with at least part of the other steps or stages. The apparatus of the embodiments of the present application is described below with reference to the drawings.
Fig. 10 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application. As shown in fig. 10, the electronic device 900 may include a processor 910, an external memory interface 920, an internal memory 921, a universal serial bus (universal serial bus, USB) interface 930, a charge management module 940, a power management module 941, a battery 942, an antenna 1, an antenna 2, a mobile communication module 950, a wireless communication module 960, a sensor module 980, a camera 993, a display 994, and the like. The sensor module 980 may include, among other things, a pressure sensor 980A, a touch sensor 980K, etc.
It should be noted that the electronic device 900 does not include an ambient light sensor, that is, the sensor module 980 does not include an ambient light sensor.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 900. In other embodiments of the application, the electronic device 900 may include more or fewer components than illustrated, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Illustratively, the processor 910 shown in fig. 10 may include one or more processing units. For example, the processor 910 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 900, among other things. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 910 for storing instructions and data. In some embodiments, the memory in the processor 910 is a cache memory. The memory may hold instructions or data that the processor 910 has just used or recycled. If the processor 910 needs to reuse the instruction or data, it may be called directly from memory. Repeated accesses are avoided and the latency of the processor 910 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 910 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
In some embodiments, the I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). The processor 910 may include multiple sets of I2C buses and may be coupled to the touch sensor 980K, a charger, a flash, the camera 993, etc., through different I2C bus interfaces. For example, the processor 910 may couple the touch sensor 980K through an I2C interface, so that the processor 910 communicates with the touch sensor 980K over the I2C bus to implement the touch functionality of the electronic device 900.
In some embodiments, the UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. UART interfaces are typically used to connect the processor 910 with the wireless communication module 960. For example, the processor 910 communicates with a bluetooth module in the wireless communication module 960 through a UART interface to implement bluetooth functions.
In some embodiments, a MIPI interface may be used to connect the processor 910 with peripheral devices such as the display 994 and the camera 993. MIPI interfaces include the camera serial interface (CSI), the display serial interface (DSI), and the like. The processor 910 and the camera 993 communicate through the CSI interface to implement the photographing function of the electronic device 900; the processor 910 and the display 994 communicate via the DSI interface to implement the display function of the electronic device 900.
In some embodiments, the GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. The GPIO interface may be used to connect the processor 910 with the camera 993, the display 994, the wireless communication module 960, the sensor module 980, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
It should be understood that the connection between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 900. In other embodiments of the present application, the electronic device 900 may also employ different interfacing manners, or a combination of interfacing manners, in the above embodiments.
Of particular relevance to the embodiments of the present application: the interfaces connecting the processor 910 and the camera 993 are preconfigured to default to the I2C interface and can each be reconfigured as a GPIO interface, which is how the camera's working mode and communication path are switched.
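A minimal sketch of that switch follows, with a pinmux() placeholder standing in for the platform's real pin-control driver; the function name, path names, and flag strings are assumptions for illustration, not an actual API.

    def pinmux(path: str, function: str) -> None:
        """Placeholder for the platform pin-control call; prints only."""
        print(f"pinmux: {path} -> {function}")

    def enter_ambient_light_mode() -> None:
        pinmux("camera_path", "GPIO")    # second path: default I2C -> GPIO
        pinmux("camera_path", "HIGH_Z")  # then high impedance, so the idle
                                         # path cannot disturb the bus
        pinmux("ambient_light_path", "I2C")  # open the first path (bare data)

    def enter_camera_mode() -> None:
        pinmux("ambient_light_path", "GPIO")  # first path: I2C -> GPIO
        pinmux("ambient_light_path", "HIGH_Z")
        pinmux("camera_path", "I2C")          # open the second path (images)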
The electronic device 900 implements display functionality via a GPU, a display 994, and an application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display 994 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 910 may include one or more GPUs that execute program instructions to generate or change display information.
The display 994 is used to display images, videos, and the like. The display 994 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum-dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 900 may include 1 or N displays 994, N being a positive integer greater than 1.
The electronic device 900 may implement shooting functions through an ISP, a camera 993, a video codec, a GPU, a display 994, an application processor, and the like.
The ISP is used to process the data fed back by the camera 993. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, so that the electrical signal is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, an ISP may be provided in the camera 993.
The camera 993 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then passed to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts it into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 900 may include 1 or N cameras 993, N being a positive integer greater than 1.
The internal memory 921 may be used to store computer-executable program code, which includes instructions. The processor 910 executes the various functional applications and data processing of the electronic device 900 by running the instructions stored in the internal memory 921. The internal memory 921 may include a program storage area and a data storage area. The program storage area may store the operating system and the applications required for at least one function (such as a sound playing function or an image playing function). The data storage area may store data created during use of the electronic device 900 (such as audio data and a phonebook). In addition, the internal memory 921 may include high-speed random access memory, and may also include nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, or universal flash storage (UFS).
The pressure sensor 980A is configured to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor 980A may be disposed on the display 994. There are many kinds of pressure sensor 980A, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material; when a force acts on the pressure sensor 980A, the capacitance between the electrodes changes, and the electronic device 900 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display 994, the electronic device 900 detects the intensity of the touch operation via the pressure sensor 980A, and may also calculate the touch position from the detection signal of the pressure sensor 980A. In some embodiments, touch operations acting on the same position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is below a first pressure threshold acts on the short-message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the icon, an instruction to create a new short message is executed.
Touch sensor 980K, also referred to as a "touch panel". The touch sensor 980K may be disposed on the display 994, and the touch sensor 980K and the display 994 form a touch screen, which is also referred to as a "touch screen". The touch sensor 980K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 994. In other embodiments, the touch sensor 980K may be disposed on a surface of the electronic device 900 other than where the display 994 is located.
It should be noted that, because the content of information interaction and execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
The embodiment of the application also provides electronic equipment, which comprises: one or more processors, memory, and a computer program stored in the memory and executable on the one or more processors, the one or more processors executing the computer program to cause an electronic device to perform the steps of any of the methods described above. The embodiment of the application also provides a computer readable storage medium, and the computer readable storage medium stores a computer program, and the computer program can realize the steps in the above method embodiments when being executed by electronic equipment.
The computer readable medium may include at least: any entity or device capable of carrying computer program code to a camera device/electronic apparatus, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (random access memory, RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium. Such as a U-disk, removable hard disk, magnetic or optical disk, etc. In some jurisdictions, computer readable media may not be electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
Embodiments of the present application provide a computer program product comprising a computer program for performing the steps of the method embodiments described above when the computer program is executed by an electronic device. The computer program comprises computer program code which may be in source code form, object code form, executable file or in some intermediate form, etc.
Each of the foregoing embodiments has its own emphasis; for any part that is not detailed or described in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/device and method may be implemented in other manners. For example, the apparatus/device embodiments described above are merely illustrative, e.g., the division of modules or elements is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
Furthermore, the terms "first," "second," "third," and the like in the description of the present specification and in the appended claims, are used for distinguishing between descriptions and not necessarily for indicating or implying a relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (11)

1. A method for detecting ambient light applied to an electronic device, wherein the electronic device comprises a first camera, and an operation mode of the first camera comprises a camera mode and an ambient light mode, and the method comprises:
when the working mode of the first camera is detected to be an ambient light mode, a first communication channel between the first camera and the processor is opened, wherein the first communication channel is used for representing a communication channel corresponding to the ambient light mode;
acquiring bare data of ambient light in an ambient light mode by using the first camera, wherein the bare data is used for representing initial data of the ambient light acquired by the first camera;
transmitting the bare data to the processor using the first communication path;
and processing the bare data by using the processor to obtain a target ambient light parameter.
2. The method according to claim 1, wherein when the operation mode of the first camera is detected as the ambient light mode, switching the communication path between the first camera and the processor to the path corresponding to the ambient light mode includes:
Configuring an interface of a second communication path between the first camera and the processor from a default integrated circuit I2C interface to a general purpose input/output GPIO interface, wherein the second communication path is a communication path corresponding to the camera mode;
And setting the GPIO interface of the second communication path to a high-impedance state.
3. The method according to claim 2, wherein the method further comprises:
When the working mode of the first camera is detected to be a camera mode, the second communication channel is opened, and an interface of the first communication channel is configured to be a GPIO interface from a default I2C interface;
And setting the GPIO interface of the first communication channel to a high-impedance state.
4. The method of claim 1, wherein the first camera comprises a first set of pixels, the camera mode corresponds to the first set of pixels, the first set of pixels comprises a second set of pixels, the ambient light mode corresponds to the second set of pixels, the pixels in the second set of pixels are used for sensing both images and light, the pixels in the first set of pixels other than the second set of pixels are used for sensing images only, and the pixels in the second set of pixels are uniformly distributed within the first set of pixels.
5. The method of any one of claims 1 to 4, wherein processing the bare data with the processor results in a target ambient light parameter, comprising:
Substituting the bare data into a parameter fitting curve, and calculating to obtain the target ambient light parameter, wherein the parameter fitting curve is obtained by fitting ambient light data acquired under a plurality of known illuminances (in lux) against those known illuminances.
6. The method of claim 5, wherein the parameter-fitting curve comprises a first curve and a second curve, the first curve corresponding to a first illumination interval and the second curve corresponding to a second illumination interval, a maximum illumination in the first illumination interval being less than or equal to a minimum illumination in the second illumination interval;
Substituting the bare data into a parameter fitting curve, and calculating to obtain the target ambient light parameter, wherein the method comprises the following steps:
substituting the ambient light data into the first curve when the bare data is judged to correspond to the first illumination interval, and calculating to obtain the target ambient light parameter; or alternatively
And substituting the ambient light data into the second curve when the bare data is judged to correspond to the second illumination interval, and calculating to obtain the target ambient light parameter.
7. The method according to claim 5 or 6, wherein substituting the bare data into a parameter fitting curve, calculating the target ambient light parameter, comprises:
substituting the bare data into the parameter fitting curve, and calculating to obtain an initial ambient light parameter;
and obtaining the target ambient light parameter by using the calibration coefficient of the first camera and the initial ambient light parameter.
8. The method of claim 7, wherein the method further comprises:
Acquiring a first ambient light parameter corresponding to a standard camera under a first illumination, and acquiring a second ambient light parameter corresponding to the first camera under the first illumination; the first ambient light parameter is obtained by utilizing a parameter fitting curve and bare data acquired by the standard camera under the first illumination, and the second ambient light parameter is obtained by utilizing a parameter fitting curve and bare data acquired by the first camera under the first illumination;
And determining a calibration coefficient of the first camera by using the first ambient light parameter and the second ambient light parameter.
9. The method according to any one of claims 4 to 8, further comprising:
Determining the weight of each pixel point in the second pixel point set according to the contribution degree of each pixel point in the second pixel point set in the first camera to the whole output data of the first camera, so as to obtain a weight matrix corresponding to the second pixel point set;
The acquiring the bare data of the ambient light by using the first camera in the ambient light mode comprises the following steps:
acquiring first data by using each pixel point in the second pixel point set in the first camera, wherein the first data comprises data of ambient light acquired by each pixel point;
Multiplying the first data by the weight matrix to obtain the bare data.
10. An electronic device comprising a memory, one or more processors, and a computer program stored in the memory and executable on the one or more processors, wherein execution of the computer program by the one or more processors causes the electronic device to implement the method of any one of claims 1 to 9.
11. A computer readable storage medium storing a computer program, characterized in that the computer program, when executed by an electronic device, implements the method according to any one of claims 1 to 9.
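The weighted acquisition of claim 9 amounts to an element-wise product of the per-pixel ambient light readings with the contribution-based weight matrix, followed by a reduction. A minimal sketch under that reading (the summation step and the uniform example weights are assumptions for illustration):

    import numpy as np

    def weighted_bare_data(first_data: np.ndarray, weights: np.ndarray) -> float:
        """Combine per-pixel readings of the second pixel set into one
        bare-data value using the contribution-based weight matrix."""
        assert first_data.shape == weights.shape
        return float(np.sum(first_data * weights))  # element-wise, then sum

    # e.g. weights = np.full((4, 4), 1/16) for uniformly contributing pixels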
CN202311848233.2A 2023-12-28 2023-12-28 Method for detecting ambient light and electronic device Pending CN117938996A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311848233.2A CN117938996A (en) 2023-12-28 2023-12-28 Method for detecting ambient light and electronic device

Publications (1)

Publication Number Publication Date
CN117938996A true CN117938996A (en) 2024-04-26

Family

ID=90749948



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination