WO2023065994A1 - Ultraviolet detection method and electronic device - Google Patents

Ultraviolet detection method and electronic device

Info

Publication number
WO2023065994A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
camera
ultraviolet
electronic device
grayscale
Application number
PCT/CN2022/121907
Other languages
English (en)
French (fr)
Inventor
胡宏伟 (Hu Hongwei)
郜文美 (Gao Wenmei)
卢曰万 (Lu Yuewan)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01J: MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00: Photometry, e.g. photographic exposure meter
    • G01J1/42: Photometry, e.g. photographic exposure meter, using electric radiation detectors
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95: Computational photography systems, e.g. light-field imaging systems
    • H04N5/00: Details of television systems
    • H04N5/30: Transforming light or analogous information into electric information

Definitions

  • the present application relates to the field of terminal technology, in particular to an ultraviolet detection method and electronic equipment.
  • Sun protection includes methods and measures to protect against ultraviolet rays, for example, applying sunscreen.
  • the user can learn the ultraviolet index of the environment through the weather forecast, and guide sun protection based on the ultraviolet index.
  • the ultraviolet index reported by the weather forecast usually roughly gives the weather conditions in a large area, and cannot give the accurate ultraviolet situation at the user's location. Therefore, it is urgent to propose a solution that can accurately measure the ultraviolet index.
  • the application provides an ultraviolet detection method and electronic equipment, which can improve the detection accuracy of ultraviolet indicators.
  • In a first aspect, an ultraviolet detection method is provided. The method is applied to an electronic device or a component capable of realizing the functions of an electronic device (such as a chip system). The electronic device includes a first camera, and the first camera includes an ultraviolet camera. The method includes: detecting a first instruction, and after detecting the first instruction, collecting a first image through the first camera; determining an ultraviolet index according to the first image; and displaying first information. The first instruction is used to instruct detection of the ultraviolet index, and the first information includes information about the ultraviolet index.
  • In this way, the electronic device can collect ultraviolet images through the ultraviolet camera, determine the real-time ultraviolet index at the user's location according to the collected ultraviolet images, and prompt the user with the ultraviolet index. Since the ultraviolet index in this solution is obtained from an ultraviolet image collected in real time at the user's location, the detection result of the ultraviolet index is more accurate.
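  • As an illustration only (this code is not part of the patent), the following Python sketch shows the overall first-aspect flow: collect the first image through the ultraviolet camera after the first instruction, determine the ultraviolet index, and display the first information. The grayscale-to-index mapping here is a toy placeholder; the patent's actual determination, described later, also uses camera parameters.

      import numpy as np

      def estimate_uv_index(uv_image):
          # Toy placeholder: map the mean grayscale (0..255) onto a 0..11
          # UV-index scale. The patent's real method also uses camera
          # parameters such as exposure time and sensitivity.
          return round(float(np.mean(uv_image)) / 255.0 * 11.0, 1)

      # Simulated "first image" from the ultraviolet (first) camera.
      uv_image = np.full((480, 640), 180, dtype=np.uint8)
      uv_index = estimate_uv_index(uv_image)            # determine the UV index
      print("First information - UV index:", uv_index)  # display the first information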
  • second information is displayed for prompting to aim the first camera at the target object, where the target object includes the sky.
  • the user can adjust the shooting direction and angle of the ultraviolet camera so that the ultraviolet camera can be directed to a position where the ultraviolet reflection and scattering are less, so that ultraviolet rays can be collected more effectively and the detection accuracy of ultraviolet rays can be improved.
  • the electronic device further includes a second camera
  • the method further includes: collecting a second image through the second camera; the second camera includes a color camera;
  • the method further includes:
  • a user interface is displayed, the user interface includes part of the content of the first image, and/or includes part of the content of the second image.
  • third information is displayed, and the third information includes parameters of the first camera; the parameters include sensitivity and exposure time.
  • capturing the second image through the second camera includes: capturing the second image through the second camera while capturing the first image through the first camera;
  • the method further includes: judging whether the second image satisfies the first condition; when it is determined that the second image satisfies the first condition, determining the ultraviolet index according to the first image. The first condition includes: the target object exists in the second image; or, the first condition includes: the target object exists in the second image, and the area ratio of the target object in the second image is greater than or equal to a threshold.
  • In this way, a target object (such as the sky) can be detected from the color image collected by the color camera. Since color images generally have more imaging details and higher image quality, the target object (such as the sky) can be detected more accurately through the color image.
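  • As a concrete illustration of the first condition (a sketch only; the patent merely says "a threshold", so the 0.3 below is an assumed value), the check can be written in Python given a boolean sky mask derived from the second image:

      import numpy as np

      def first_condition_met(sky_mask, area_threshold=0.3):
          # First condition: the target object (sky) exists in the second image,
          # and its area ratio is greater than or equal to a threshold.
          if not sky_mask.any():            # target object absent
              return False
          return sky_mask.mean() >= area_threshold

      # Example: the top half of the frame was classified as sky.
      mask = np.zeros((480, 640), dtype=bool)
      mask[:240, :] = True
      print(first_condition_met(mask))      # True: area ratio 0.5 >= 0.3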
  • using the second camera to collect the second image includes: after detecting the first instruction and before using the first camera to collect the first image, using the second camera to collect the second image;
  • the method includes: judging whether the second image satisfies the first condition;
  • Capturing the first image through the first camera includes: capturing the first image through the first camera when it is determined that the second image satisfies the first condition.
  • determining the ultraviolet index according to the first image includes: determining the ultraviolet index according to gray values of pixels included in the first image.
  • the method further includes: if the grayscale value of the first image is the target grayscale, adjusting the parameters of the first camera and collecting a third image through the first camera, wherein the grayscale of the third image is also the target grayscale;
  • Determining the ultraviolet index according to the first image includes: determining the ultraviolet index according to the first camera parameters corresponding to the first image, the first camera parameters corresponding to the third image, and the target grayscale.
  • the method further includes: if the grayscale value of the first image is not the target grayscale, adjusting the parameters of the first camera and collecting a fourth image, wherein the grayscale of the fourth image is the target grayscale; and then adjusting the parameters of the first camera again and collecting a fifth image, wherein the grayscale of the fifth image is also the target grayscale;
  • Determining the ultraviolet index according to the first image includes: determining the ultraviolet index according to the first camera parameters corresponding to the fourth image, the first camera parameters corresponding to the fifth image, and the target grayscale.
  • In this way, by adjusting the parameters of the first camera, the gray value of the collected ultraviolet image can be adjusted to the target grayscale, and the ultraviolet index at the location of the electronic device can then be determined according to the gray value.
  • the parameters of the first camera include exposure time and sensitivity.
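  • The intuition behind this target-grayscale procedure can be sketched as follows (an illustration under a deliberately simplified model, not the patent's formula): if the image grayscale is taken as roughly proportional to ultraviolet intensity * exposure time * sensitivity, then each parameter setting that drives the image to the target grayscale yields an intensity estimate, and the estimates from the two settings can be combined.

      def uv_from_target_gray(target_gray, params_a, params_b, k=1.0):
          # Assumed toy model (not the patent's): gray = k * intensity *
          # exposure_time * iso, with k a hypothetical calibration constant.
          exp_a, iso_a = params_a
          exp_b, iso_b = params_b
          est_a = target_gray / (k * exp_a * iso_a)
          est_b = target_gray / (k * exp_b * iso_b)
          return (est_a + est_b) / 2.0      # combine the two estimates

      # Two (exposure seconds, ISO) settings that both reached target gray 128:
      print(uv_from_target_gray(128.0, (0.01, 100), (0.005, 200)))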
  • the electronic device can also detect the user's application of sunscreen according to the ultraviolet index, and prompt the user with the detection result of the application of sunscreen. In this way, users can guide daily sun protection based on the test results.
  • In a second aspect, an electronic device is provided. The electronic device includes a first camera, the first camera includes an ultraviolet camera, and the electronic device further includes:
  • a processing module configured to detect the first instruction input by the user through the input module
  • the first camera is configured to collect a first image after the processing module detects the first instruction
  • a processing module configured to determine an ultraviolet index according to the first image
  • the display module is used to display the first information; the first instruction is used to instruct to detect the ultraviolet index; the first information includes the information of the ultraviolet index.
  • the display module is further configured to display second information before the processing module determines the ultraviolet index according to the first image, where the second information is used to prompt the user to aim the first camera at the target object, and the target object includes the sky.
  • the electronic device further includes a second camera for collecting a second image; the second camera includes a color camera;
  • the display module is further configured to display a user interface after the first image is captured by the first camera, the user interface includes part of the first image, and/or includes part of the second image.
  • the display module is further configured to display third information after the first image is captured by the first camera, where the third information includes parameters of the first camera; the parameters include sensitivity and exposure time.
  • the second camera is used to collect the second image, including: collecting the second image by the second camera while collecting the first image by the first camera;
  • the processing module is also used to determine whether the second image satisfies the first condition after the second image is captured by the second camera; when it is determined that the second image satisfies the first condition, determine the ultraviolet index according to the first image; the first condition includes: the target object exists in the second image; or, the first condition includes: the target object exists in the second image, and the area ratio of the target object in the second image is greater than or equal to a threshold.
  • the second camera, used to collect the second image includes: after the processing module detects the first instruction and before the first camera collects the first image, collect the second image;
  • the processing module is further configured to determine whether the second image satisfies the first condition after the second camera captures the second image;
  • the first camera configured to collect the first image, includes: collecting the first image when it is determined that the second image satisfies the first condition.
  • the processing module configured to determine the ultraviolet index according to the first image, includes: determining the ultraviolet index according to the gray value of the pixels included in the first image.
  • the processing module is further configured to: if the grayscale value of the first image is the target grayscale, adjust the parameters of the first camera and, under the adjusted parameters, call the first camera to collect the third image, wherein the grayscale of the third image is also the target grayscale;
  • the processing module is configured to determine the ultraviolet index according to the first image, including: determining the ultraviolet index according to the first camera parameter corresponding to the first image, the first camera parameter corresponding to the third image, and the target grayscale.
  • the processing module is further configured to: after the first image is captured by the first camera, if the grayscale value of the first image is not the target grayscale, adjust the parameters of the first camera and, under the adjusted parameters, call the first camera to collect the fourth image, wherein the grayscale of the fourth image is the target grayscale;
  • the processing module is also used to adjust the parameters of the first camera again and collect the fifth image, wherein the grayscale of the fifth image is also the target grayscale;
  • the processing module is configured to determine the ultraviolet index according to the first image, including: determining the ultraviolet index according to the first camera parameter corresponding to the fourth image, the first camera parameter corresponding to the fifth image, and the target grayscale.
  • the parameters of the first camera include exposure time and sensitivity.
  • the electronic device can also detect the user's application of sunscreen according to the ultraviolet index, and prompt the user with the detection result of the application of the sunscreen. In this way, users can guide daily sun protection based on the test results.
  • the present application provides an electronic device, which has a function of implementing the detection method in any one of the foregoing aspects and any possible implementation manner.
  • This function may be implemented by hardware, or may be implemented by executing corresponding software on the hardware.
  • the hardware or software includes one or more modules corresponding to the above functions.
  • the present application provides a computer-readable storage medium, including computer instructions.
  • When the computer instructions are run on an electronic device, the electronic device is caused to execute the detection method in any one of the above aspects and any possible implementation manner thereof.
  • the present application provides a computer program product, which, when the computer program product is run on an electronic device, causes the electronic device to execute any detection method according to any aspect and any possible implementation manner thereof.
  • In a sixth aspect, a circuit system is provided. The circuit system includes a processing circuit, and the processing circuit is configured to execute the detection method in any one of the above aspects and any possible implementation manner thereof.
  • the embodiment of the present application provides a chip system, including at least one processor and at least one interface circuit.
  • at least one processor executes the detection method in any one of the above aspects and any possible implementation manners.
  • FIG. 1 is a schematic diagram of the principle of ultraviolet imaging provided by an embodiment of the present application.
  • FIGS. 2A-2D are schematic diagrams of the electronic device provided by an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of the software architecture of the electronic device provided by an embodiment of the present application.
  • FIG. 5 is an interface diagram provided by an embodiment of the present application.
  • FIGS. 6A and 6B are interface diagrams provided by an embodiment of the present application.
  • FIGS. 7-9 are interface diagrams provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of the relationship between the camera parameters and the ultraviolet index provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of the method flow provided by an embodiment of the present application.
  • FIGS. 12 and 13 are schematic diagrams of interfaces provided by an embodiment of the present application.
  • FIG. 14 is a schematic flowchart of the detection method provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram of the training and use of the classifier provided by an embodiment of the present application.
  • FIG. 16 is a schematic diagram of the apparatus provided by an embodiment of the present application.
  • FIG. 17 is a schematic diagram of a chip system provided by an embodiment of the present application.
  • UV imaging: the wavelength of ultraviolet light is outside the range of visible light, so ultraviolet light is invisible to the human eye.
  • ultraviolet light can be used as a light source, and an ultraviolet imaging system can be used for imaging.
  • This technology of capturing ultraviolet light and imaging it with an ultraviolet imaging system can be called ultraviolet imaging technology.
  • Through ultraviolet imaging, users can observe the ultraviolet image formed by an object under ultraviolet irradiation.
  • Ultraviolet images are grayscale images that have a single color channel, i.e., a gray channel.
  • the ultraviolet imaging system includes an ultraviolet camera module.
  • the ultraviolet camera module can be a camera capable of capturing ultraviolet rays, which is called an ultraviolet camera.
  • an ultraviolet camera may include components such as a lens and an image sensor.
  • the lens can be made of materials such as but not limited to quartz glass, which can transmit ultraviolet rays.
  • the image sensor records information about the ultraviolet rays passing through the lens.
  • the ultraviolet camera can be applied to ultraviolet detection (including but not limited to ultraviolet intensity detection).
  • the ultraviolet image is collected by the ultraviolet camera, and the ultraviolet index corresponding to the ultraviolet image is determined according to the characteristics of the ultraviolet image.
  • the intensity of ultraviolet light can be judged according to the color depth of the ultraviolet image.
  • the amount of incoming ultraviolet light shown in (1) of Figure 1 is greater than that shown in (2) of Figure 1; therefore, in (1) of Figure 1, the ultraviolet image formed by the ultraviolet camera is brighter and lighter in color.
  • the embodiment of the present application provides a detection method, which can be applied in electronic equipment, and the electronic equipment is equipped with an ultraviolet camera having a structure such as that shown in FIG. 1 .
  • the electronic device in the embodiment of the present application may be a mobile phone, a tablet computer, a wearable device, a vehicle-mounted device, a notebook computer, etc.
  • the mobile phone in the embodiment of the present application may be a folding screen mobile phone or a non-folding screen mobile phone.
  • the specific type of equipment is not limited in any way.
  • the layout of the cameras on the electronic device 100 can be referred to FIG. 2A , where the front of the electronic device 100 is the plane where the display screen 194 is located.
  • the camera 1931 is located on the front of the electronic device 100 , so the camera is a front camera.
  • the camera 1932 is located on the back of the electronic device 100 , and the camera is a rear camera.
  • the camera may include an ultraviolet camera and a color camera.
  • Color cameras include but are not limited to RGB cameras.
  • the solution of the embodiment of the present application may be applied to an electronic device 100 with a folding screen (that is, the display screen can be folded) having multiple display screens.
  • the folding screen may be a flexible folding screen.
  • the flexible folding screen includes a folding shaft made of flexible material. Part or all of the flexible folding screen is made of flexible materials. For example: only the foldable part (such as the folding shaft) of the flexible folding screen is made of flexible material, and the other parts are made of rigid material; or, the flexible folding screen is all made of flexible material.
  • the folding screen can be folded along the folding axis to form at least two sub-screens.
  • FIG. 2A shows a folding screen electronic device 100 .
  • the foldable screen is folded inwardly (or outwardly) along the folding edge, so that the foldable screen forms at least two sub-screens (for example, A sub-screen and B sub-screen).
  • In addition, there is a display screen (such as a C screen) on the outside of the fold. Suppose the electronic device 100 is provided with a camera on the surface where the C screen is located. Then, in the unfolded state of the electronic device 100 shown in (c) of FIG. 2A, the camera on the C screen is on the back of the electronic device 100 and can be regarded as a rear camera; after folding, the camera on the C screen faces the front of the electronic device 100 and can be regarded as a front camera. That is to say, the terms front camera and rear camera in this application do not limit the nature of the camera itself, but merely describe a positional relationship.
  • the ultraviolet camera can be set on the C screen (for example, as a hole-punch camera or an under-display camera).
  • the folding screen of the folding screen electronic device may form multiple (such as 2, 3, etc.) sub-screens.
  • the flexible folding screen shown in (1) in FIG. 2B may include a folding line 030 and a folding line 031. After the screen is folded longitudinally along these folding lines, a sub-screen 032, a sub-screen 033 and a sub-screen 034 as shown in (2) in FIG. 2B can be formed.
  • the screen arrangement of the folding-screen electronic device may be a top-bottom arrangement such as that shown in (d) in FIG. 2A, or a left-right arrangement such as that shown in (1) or (2) in FIG. 2C, etc.
  • the embodiment of the present application does not limit the screen arrangement of the folding screen electronic device.
  • the flexible folding screen as shown in (1) in FIG. 2C after being folded laterally along the folding line 040, can form a sub-screen 041 and a sub-screen 042 as shown in (2) in FIG. 2C.
  • the folding screen device may be a mobile phone in a folded state (refer to the state shown in (1) of FIG. 2D ), and may be a tablet computer in an unfolded state.
  • the folding screen may be a single-sided screen (that is, only one side can display a user interface) or a double-sided screen (that is, both opposite sides can display a user interface).
  • For a single-sided folding screen, when the folding screen is folded toward the side that can display the user interface (that is, the front of the single-sided folding screen), this can be called forward folding; when the folding screen is folded toward the opposite side (that is, the back of the single-sided folding screen), this can be called reverse folding.
  • FIG. 2C represents a schematic diagram of forward folding
  • FIG. 2D represents a schematic diagram of reverse folding.
  • a folding screen device can determine whether it is currently being folded forward or backward.
  • FIG. 3 shows a schematic structural diagram of the electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identity module (SIM) card interface 195, and the like.
  • the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate operation control signals according to the instruction opcode and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated access and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • the processor 110 is configured to call an ultraviolet camera to collect ultraviolet images, and determine ultraviolet indicators according to the ultraviolet images.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input of the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the display screen 194 , the camera 193 and so on.
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
  • the wireless communication module 160 can provide solutions for wireless communication applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), and the like.
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • the light is transmitted to the photosensitive element of the camera through the lens, and the optical signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • the ISP can control the photosensitive element to expose and take pictures according to the shooting parameters.
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the camera 193 may be located in the edge area of the electronic device, and may be an under-display camera, a pop-up camera, or a hole-punch camera.
  • the camera 193 can include a rear camera, and can also include a front camera. The embodiment of the present application does not limit the specific position and shape of the camera 193 .
  • the camera 193 includes a common camera and an ultraviolet camera.
  • the ordinary camera includes a color camera.
  • the color camera may be, for example but not limited to, an RGB camera.
  • the color camera can be used to collect images, and the collected images can be used to determine whether objects such as the sky exist.
  • the captured color image can also be used to mark UV indicators.
  • the ultraviolet camera can be used to collect ultraviolet images, and the electronic device 100 can detect ultraviolet indicators according to the collected ultraviolet images.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be realized through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the NPU uses image recognition technology to identify whether the image collected by the camera 193 includes a sky image.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving music, video and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 can implement audio functions through the audio module 170 , the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio data into an analog audio electrical signal output, and is also used to convert an analog audio electrical signal input into digital audio data.
  • the audio module 170 may include an analog/digital converter and a digital/analog converter.
  • the audio module 170 is used to convert the analog audio electrical signal output by the microphone 170C into digital audio data.
  • the audio module 170 may also be used to encode and decode audio data.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • Speaker 170A, also called a "loudspeaker", is used to convert analog audio electrical signals into sound signals. The electronic device 100 can play music or conduct hands-free calls through the speaker 170A.
  • Receiver 170B, also called an "earpiece", is used to convert analog audio electrical signals into sound signals.
  • the receiver 170B can be placed close to the human ear to receive the voice.
  • The microphone 170C, also called a "mike" or "mic", is used to convert sound signals into analog audio electrical signals.
  • The user can speak close to the microphone 170C to input a sound signal into the microphone 170C.
  • the microphone 170C may be a built-in component of the electronic device 100 , or may be an external accessory of the electronic device 100 .
  • the electronic device 100 may include one or more microphones 170C, where a single microphone or multiple microphones working together collect sound signals from various directions and convert the collected sound signals into analog audio electrical signals. Noise reduction, sound source identification, or directional recording can also be achieved.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the embodiment of the present invention uses a layered architecture as an example to illustrate the software structure of the electronic device 100 .
  • FIG. 4 is a block diagram of the software structure of the electronic device 100 according to the embodiment of the present invention.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the operating system of the electronic device is divided into four layers, which are, from bottom to top, the kernel layer, the hardware abstraction layer (HAL), the application framework layer, and the application layer.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least camera driver, audio driver, display driver, and sensor driver.
  • Sensor drivers include but are not limited to image sensor drivers and acoustic sensor drivers.
  • For example, the camera driver in the kernel layer is invoked to turn on the camera.
  • the image sensor is called by the image sensor driver to complete the image acquisition.
  • the hardware abstraction layer (HAL) is located between the kernel layer and the application framework layer, and is used to define the interfaces through which applications drive the hardware, and to convert the values produced by the hardware drivers into a software programming language. For example, it reads the values from the camera driver, converts them into a software programming language, and uploads them to the application framework layer, where the corresponding function is then called.
  • the HAL can upload the sky image collected by the camera 193 to the application framework layer for further processing.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer obtains the original input event from the kernel layer via the HAL, and identifies the control corresponding to the input event.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include view system, phone manager, resource manager, notification manager, window manager, etc.
  • the application framework layer includes a first module.
  • the first module is used to call the ultraviolet camera to collect ultraviolet images through the camera driver, and determine the ultraviolet index according to the ultraviolet images.
  • the first module can also be arranged in other layers, and the first module can also be divided into more sub-modules. Each sub-module is used to perform a corresponding function.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of the electronic device 100 . For example, the management of call status (including connected, hung up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • the notification manager can also present notifications that appear in the top status bar of the system in the form of a chart or scroll-bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window.
  • For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, or the indicator light flashes.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • the application layer can consist of a series of application packages.
  • the application package may include applications such as camera, video, call, WLAN, music, short message, Bluetooth, map, calendar, gallery, and navigation.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • The following takes an electronic device that is a mobile phone with a structure such as that shown in FIG. 4 as an example.
  • the mobile phone can start the ultraviolet detection method of the embodiment of the present application when it detects that the preset conditions are met.
  • the preset conditions may include, but are not limited to, any one or more of the following conditions: a preset application is opened.
  • Preset applications may include, but are not limited to, applications dedicated to UV detection.
  • the mobile phone may start the ultraviolet detection method of the embodiment of the present application after detecting that the user has input a preset instruction to the mobile phone.
  • the embodiment of the present application does not limit the manner of enabling the ultraviolet detection function.
  • the user can instruct the mobile phone to start the camera application and display the shooting preview interface by means of touch operation, button operation, air gesture operation or voice operation.
  • the mobile phone displays an interface as shown in (a) in FIG. 5 , and the interface includes an icon 401 of a camera application.
  • After detecting that the user clicks the icon 401, the mobile phone starts the camera application, calls an ordinary camera (such as an RGB camera) to collect images, and displays the captured RGB image on the shooting preview interface 402 shown in (b) in FIG. 5.
  • the shooting preview interface may include one or more controls for realizing different functions.
  • an ultraviolet detection control 403 is included.
  • the ultraviolet detection control 403 can be used to enable the ultraviolet detection function of the embodiment of the present application, or used to indicate the detection of ultraviolet indicators.
  • the shooting preview interface 402 may also include other controls.
  • controls that can be used to turn on or off the filter function, controls to control taking pictures, controls to adjust the focus distance, and controls to adjust the brightness of the screen.
  • the mobile phone can call the color camera to collect a color image, and display the captured color image on the preview interface 402 shown in (b) of Figure 5.
  • the mobile phone can also call the ultraviolet camera to collect ultraviolet images while the color camera collects color images, and determine the ultraviolet index according to the collected ultraviolet images.
  • the ultraviolet indicator includes, but is not limited to, ultraviolet intensity, ultraviolet index, and ultraviolet level.
  • the mobile phone can also directly display the ultraviolet images collected by the ultraviolet camera.
  • the mobile phone may display a prompt message 404 as shown in (b) of FIG. 5 , which is used to remind the user that the mobile phone is detecting the ultraviolet indicator (such as ultraviolet intensity).
  • the mobile phone may prompt the user for the detected ultraviolet index.
  • the mobile phone displays prompt information 405 for indicating the UV index 8 detected this time and the corresponding UV intensity.
  • the electronic device can conveniently and quickly obtain the ultraviolet index (such as ultraviolet intensity) at the current location by collecting images.
  • the mobile phone can also simultaneously display the ultraviolet image and the color image on the preview interface.
  • the mobile phone displays a preview interface as shown in (a) of FIG. 6A .
  • the mobile phone can call the color camera to collect color images and call the ultraviolet camera to collect ultraviolet images, and display the collected color image and the collected ultraviolet image on the preview interface shown in (a) of Figure 6A.
  • the color image can be used to identify the subject, and the ultraviolet image can be used to identify ultraviolet indicators.
  • a control 605 and a control 606 can also be displayed on the preview interface, where the control 605 is used to control the display size of the ultraviolet image 603, and the control 606 is used to control the display size of the color image 602.
  • the mobile phone can display the ultraviolet image and the color image respectively in different preview interfaces. Users can choose to switch the image displayed in the current preview interface. Exemplarily, as shown in (a) of FIG. 6B , after detecting that the user clicks on the ultraviolet detection control 403, the mobile phone calls the color camera to collect color images, and calls the ultraviolet camera to collect ultraviolet images, and displays the color image 402 in the preview interface. After detecting that the user clicks the switch view control 406, the mobile phone may display the ultraviolet image 407 shown in (b) of FIG. 6B on the preview interface. For the meaning of (c) in FIG. 6B , refer to the related description of (c) in FIG. 5 , which will not be repeated here.
  • the mobile phone collects the color image and the ultraviolet image, and displays the color image in the preview interface by default.
  • the phone captures a color image as well as a UV image and displays in a preview interface the image from the view selected by the user the last time the camera app was exited. For example, when the user used the camera last time, he chose to display the ultraviolet image in the preview interface.
  • Then, after detecting that the user clicks the ultraviolet detection control 403 shown in (a) in FIG. 6B, the mobile phone displays the ultraviolet image in the preview interface.
  • the user can adjust the shooting direction and angle of the ultraviolet camera so that the ultraviolet camera can be directed to a position where the ultraviolet reflection and scattering are less, so that ultraviolet rays can be collected more effectively and the detection accuracy of ultraviolet rays can be improved.
  • locations with less ultraviolet reflection and scattering include but are not limited to the sky.
  • the mobile phone can call the color camera to collect a color image, and judge whether the color image satisfies the first condition; if the first condition is satisfied, the ultraviolet index is determined according to the first image.
  • The first condition includes: the target object exists in the second image; or, the first condition includes: the target object exists in the second image, and the area proportion of the target object in the second image is greater than or equal to a threshold.
  • the mobile phone can call the color camera to collect a color image, and determine whether there is a sky in the color image. Since color images generally have more imaging details and higher image quality, a target object (such as the sky) can be detected more accurately through the color image collected by a color camera.
  • For example, after detecting that the user clicks the ultraviolet detection control 403 shown in (a) of FIG. 7, the mobile phone may prompt the user to aim the camera (the first camera) at the sky for framing.
  • the mobile phone first collects color images through the color camera, and when it is determined that there is sky in the collected color images (that is, the first condition is satisfied), the mobile phone can call the ultraviolet camera to collect ultraviolet images, and determine the ultraviolet index according to the ultraviolet images.
  • the mobile phone may display prompt information 701 (an example of the second information) for prompting the user to aim the camera (the first camera) at the sky for framing.
  • the mobile phone collects color images through the color camera and at the same time collects ultraviolet images through the ultraviolet camera. Since the ultraviolet camera and the color camera work synchronously, when it is determined that there is sky in the collected color image (that is, the first condition is met), it means that the ultraviolet image was also collected with the camera aimed at the sky; the mobile phone can then determine the ultraviolet indicator according to the ultraviolet image.
  • Detecting the sky in the color image can be implemented as follows: set the color segmentation range of the sky [R1~R2, G1~G2, B1~B2], and perform color segmentation on the pixels in the color image according to this range. If the R (red) component of a pixel is within the range R1~R2, its G component is within the range G1~G2, and its B component is within the range B1~B2, the pixel is determined to be a sky pixel. In this way, the imaging pixels of the sky contained in the color image are detected.
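  • A minimal Python sketch of this color-segmentation step follows (illustrative only; the patent leaves R1/R2, G1/G2 and B1/B2 unspecified, so the ranges below are made-up placeholders for a blue sky):

      import numpy as np

      def sky_mask(rgb, r=(0, 140), g=(60, 200), b=(150, 255)):
          # Mark pixels whose R, G and B components all fall inside the sky
          # segmentation ranges [R1~R2, G1~G2, B1~B2]. `rgb` is an HxWx3
          # uint8 array; the default ranges are assumptions, not patent values.
          R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
          return ((R >= r[0]) & (R <= r[1]) &
                  (G >= g[0]) & (G <= g[1]) &
                  (B >= b[0]) & (B <= b[1]))

      # Example: a uniform sky-blue frame is detected as entirely sky.
      frame = np.full((4, 4, 3), (100, 150, 220), dtype=np.uint8)
      print(sky_mask(frame).mean())   # 1.0: every pixel is inside the ranges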
  • the mobile phone can also use other methods to detect the sky from the color image.
  • a machine learning method can be used to train a classifier for recognizing the sky, and the sky can be recognized from the color image through the classifier.
  • the embodiment of the present application does not limit the specific implementation of detecting the sky.
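  • As one possible concrete form of such a classifier (purely illustrative; the patent does not specify the model, the features, or the training data), a per-pixel sky classifier could be trained on labeled RGB samples:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Hypothetical training data: RGB pixel values labeled sky (1) / not sky (0).
      X = np.array([[100, 150, 220], [90, 160, 235],   # sky-like blues
                    [60, 120, 40], [200, 180, 150]])   # vegetation, building
      y = np.array([1, 1, 0, 0])
      clf = LogisticRegression().fit(X, y)

      # Classify the pixels of a new color image (flattened to an Nx3 matrix).
      pixels = np.array([[95, 155, 225], [70, 110, 50]])
      print(clf.predict(pixels))    # e.g. [1 0]: the first pixel is sky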
  • the phone can prompt the user to point the camera at the sky.
  • If the preview interface 402 shown in (a) of FIG. 7 does not include the sky, it indicates that the camera of the mobile phone is not aimed at the sky for framing.
  • the mobile phone may display prompt information 701 for prompting the user to point the camera at the sky to take pictures.
  • Alternatively, the phone can play a voice prompt asking the user to point the camera at the sky for framing.
  • the embodiment of the present application does not limit the manner in which the mobile phone prompts the user to adjust the shooting direction and/or angle. The user can adjust the shooting direction and/or the shooting angle based on the prompt of the mobile phone so as to point the camera at the sky.
  • the mobile phone can call the ultraviolet camera to collect ultraviolet images, and determine the ultraviolet indicators based on the ultraviolet images and the constructed model.
  • the mobile phone may display a prompt message 702 for prompting that the ultraviolet indicator is being detected.
  • the mobile phone may display an interface 405 shown in (c) of FIG. 7 for prompting the detected ultraviolet index.
  • Alternatively, the user may have already aimed the camera at the sky proactively.
  • In this case, when the mobile phone calls the color camera to collect a color image (an example of the second image), it can directly detect the sky in the color image, that is, the camera is already aimed at the sky for framing.
  • the mobile phone can call the ultraviolet camera to collect ultraviolet images, and determine the intensity of ultraviolet rays according to the ultraviolet images.
  • the embodiment of the present application does not limit the timing of turning on the ultraviolet camera.
  • the ultraviolet camera and the color camera collect images synchronously.
  • For example, when detecting that the user clicks the ultraviolet detection control 403 shown in (b) of Figure 5, the mobile phone calls the ultraviolet camera to collect the ultraviolet image and at the same time calls the color camera to collect the color image, and the collection by the color camera and the ultraviolet camera is synchronized.
  • the mobile phone may open the camera application and display the interface shown in (a) of FIG. 8 .
  • After detecting that the user clicks the ultraviolet detection control 403, the mobile phone can call the color camera to collect color images (for identifying objects) and simultaneously call the ultraviolet camera to collect ultraviolet images (for determining ultraviolet indicators).
  • the mobile phone may display the collected color image 802 and ultraviolet image 801 in the preview interface.
  • the mobile phone may also display prompt information 701 for prompting the user to point the camera at the sky for framing.
  • When the user adjusts the shooting direction of the camera, the mobile phone continues to collect color images and simultaneously collects ultraviolet images. If the sky is detected in the collected color images, the mobile phone can calculate the ultraviolet index based on the ultraviolet images collected synchronously with the color images.
  • a prompt message 702 such as that shown in (b) of FIG. 8 may be displayed to remind the user that the ultraviolet index is being calculated.
  • the mobile phone can also display the collected color image and ultraviolet image on the preview interface. After the ultraviolet index (such as ultraviolet intensity) is determined, the mobile phone may display an interface such as that shown in (c) of FIG. 8 for prompting the ultraviolet index.
  • In some embodiments, after the ultraviolet detection function is turned on, for example, after the user clicks the ultraviolet detection control 403 shown in (a) in Figure 7, the mobile phone first calls the color camera to collect a color image and detects whether there is sky in the color image. If the mobile phone determines that the sky exists in the color image, it means that the user has currently aimed the camera of the mobile phone at the sky for framing. Then, the mobile phone starts to call the ultraviolet camera to collect ultraviolet images and detects ultraviolet indicators according to the ultraviolet images.
  • For another example, as shown in (a) of FIG. 9, after detecting that the user clicks the ultraviolet detection control 403, the mobile phone first invokes the color camera to collect a color image, and recognizes, according to the color image, whether the user has pointed the camera at the sky for framing.
  • the color images collected as shown in (a) of FIG. 9 may be displayed on the preview interface, and a prompt message 701 may be displayed to prompt the user to point the camera at the sky for framing. If it is detected that there is a sky in the collected color image, it means that the user has pointed the camera at the sky for viewing. Then, the mobile phone can call the ultraviolet camera to capture the ultraviolet rays in the direction of the sky, form an ultraviolet image, and then determine the ultraviolet index according to the ultraviolet image.
  • the mobile phone can simultaneously display, in the preview interface, the ultraviolet image collected by the ultraviolet camera (part of the content of the first image) and the color image collected by the color camera (part of the content of the second image).
  • For the description of (c) in FIG. 9, refer to the related description of (c) in FIG. 8, which will not be repeated here.
  • the color depth of the ultraviolet image may be represented by grayscale values.
  • the grayscale value of the ultraviolet image may refer to the average grayscale value of the ultraviolet image.
  • the average gray value includes but is not limited to any gray value as follows: arithmetic average gray value, weighted average gray value. Wherein, in the arithmetic mean gray value, each pixel corresponds to the same weight.
  • the weights corresponding to different pixels may be different.
  • the brightness of the middle part of the ultraviolet image may be greater than the brightness of the edge part. This situation may be caused by more ultraviolet rays irradiating the middle part of the image sensor and little or no ultraviolet light irradiating the edge part. In this case, since the ultraviolet radiation is reflected mainly in the middle pixels of the ultraviolet image, those pixels are particularly important for detecting ultraviolet rays, and it may be considered to assign a greater weight to the middle pixels of the ultraviolet image and, correspondingly, a smaller weight to the edge pixels.
  • the edge part and the middle part of the image can be predefined; for example, pixels more than L (a positive integer) pixels away from the image center form the edge part of the image, and the other pixels form the middle part of the image.
  • L may be flexibly determined according to scenarios, for example, may be determined according to empirical values, which is not limited in this embodiment of the present application.
  • the definition of the edge part and the middle part of the image can also have other ways, for example, assuming that the size of the image is M pixels*N pixels, the center of the image is the geometric center, and the side length is P (positive integer) pixels
  • the image pixels within the square can be regarded as the middle part of the image, and the image pixels outside the square can be regarded as the edge part of the image.
  • P is smaller than min(M,N), that is, P is smaller than the minimum value of M and N.
  • the embodiments of the present application do not limit the specific definitions of the edge and the middle part.
  • two weight values can be set: the weight of the pixels in the middle part of the ultraviolet image is a first weight and the weight of the pixels in the edge part is a second weight, the first weight being greater than the second weight.
  • alternatively, multiple weight values are set: a pixel closer to the center of the image is given a larger weight, and a pixel farther from the center is given a smaller weight.
  • illustratively, a pixel whose distance from the image center is within a first range is given a first weight, a pixel whose distance is within a second range is given a second weight, and a pixel whose distance is within a third range is given a third weight.
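A minimal sketch of such a center-weighted average grayscale is given below; the two-level weighting and the radius value are illustrative assumptions for demonstration, not values fixed by this application.

```python
import numpy as np

def weighted_mean_gray(img, radius_l=64, w_middle=2.0, w_edge=1.0):
    """Weighted average grayscale of a UV image (2-D uint8 array).

    Pixels within `radius_l` pixels of the image center form the
    middle part and get `w_middle`; all other pixels get `w_edge`.
    """
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(ys - (h - 1) / 2, xs - (w - 1) / 2)
    weights = np.where(dist <= radius_l, w_middle, w_edge)
    return float((img.astype(np.float64) * weights).sum() / weights.sum())

# The arithmetic mean is the special case where all weights are equal:
# weighted_mean_gray(img, w_middle=1.0, w_edge=1.0) == img.mean()
```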
  • the average gray value of the ultraviolet image is not only related to the ultraviolet index (such as ultraviolet intensity), but also related to the parameters of the ultraviolet camera.
  • the ultraviolet camera parameters associated with the average gray value of the ultraviolet image include but are not limited to any one or more of the following: exposure time, ISO.
  • the exposure time can be understood as the time the shutter is open.
  • usually, in an ultraviolet camera, ultraviolet light is projected through the lens onto the image sensor only while the shutter is open; when the shutter is closed, ultraviolet light is blocked from reaching the image sensor.
  • under otherwise equal conditions, the longer the exposure time (that is, the longer the shutter is open), the more ultraviolet light enters, so the average gray value of the formed ultraviolet image is usually higher and the image is brighter; conversely, the shorter the exposure time of the UV camera, the lower the average gray value of the UV image and the darker the image.
  • similarly, the higher the ISO, the more sensitive the ultraviolet camera is to ultraviolet light and the brighter the formed image (higher average gray value); the lower the ISO, the less sensitive the camera and the darker the image (lower average gray value).
  • it can be seen that by adjusting one or more of the above parameters of the ultraviolet camera, the average gray value of the collected ultraviolet image can be adjusted; and as described above, the average gray value is related to the detection and identification of ultraviolet indicators. Therefore, these camera parameters are also related to the detection and identification of ultraviolet indicators.
  • considering that the ultraviolet index is related to one or more parameters of the ultraviolet camera and to the average gray value of the ultraviolet image, in the embodiments of the present application the ultraviolet index can be determined according to the camera parameters and the average gray value of the ultraviolet image.
  • a model is constructed according to the parameters of the ultraviolet camera, ultraviolet indicators (such as ultraviolet intensity) and the average gray value of the ultraviolet image.
  • the mobile phone can determine the ultraviolet index based on the known ultraviolet camera parameters, the average gray value of the ultraviolet image, and the constructed model.
  • Scheme 1. In this scheme, the method for building the model may include the following steps:
  • a) Set the ISO to G1, where G1 ∈ [G_min, G_max]; G_min is the minimum ISO of the UV camera and G_max is the maximum ISO of the UV camera.
  • in the laboratory test stage, the ultraviolet camera is aimed at an ultraviolet light source so that it can capture the ultraviolet light emitted by the source and form ultraviolet images. During this process, the ultraviolet intensity V of the light source is gradually adjusted from low to high, and for each intensity, the exposure time T of the ultraviolet camera is adjusted so that the average gray value of the collected ultraviolet image reaches the target gray value.
  • optionally, the range of the target gray value is Gray ± Q, where Gray is a grayscale reference set to avoid overexposure or underexposure and Q is a deviation amount. The values of Gray and Q can be set flexibly according to the actual application scenario; illustratively, Gray = 128 and Q = 5.
  • through the above steps, each ultraviolet intensity has a corresponding ultraviolet image whose gray value is within the target gray value range (for example, within the Gray ± Q range) and a corresponding exposure time T. The exposure time T corresponding to each ultraviolet intensity V at ISO G1 is then recorded, yielding the functional relationship f1 between V and T under ISO G1.
  • FIG. 10 shows an exemplary functional relationship between the ultraviolet intensity V and the exposure time T when the ISO is G1.
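A minimal sketch of this calibration sweep is shown below; the lamp and camera control calls (`set_intensity`, `capture`, and so on) are hypothetical placeholders for whatever laboratory equipment API is actually used.

```python
def calibrate_iso(camera, lamp, iso, intensities, gray=128, q=5):
    """Sweep UV intensity V from low to high at a fixed ISO; for each V,
    step the exposure time T up until the mean gray hits Gray +/- Q.
    Returns a list of (V, T) calibration pairs for this ISO."""
    camera.set_iso(iso)                      # hypothetical camera API
    pairs = []
    for v in intensities:                    # low -> high
        lamp.set_intensity(v)                # hypothetical lamp API
        t = camera.min_exposure
        while t <= camera.max_exposure:
            img = camera.capture(exposure=t)
            if abs(img.mean() - gray) <= q:  # within the target range
                pairs.append((v, t))
                break
            t *= 1.1                         # increase exposure time
    return pairs
```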
  • the ultraviolet light source may be a dedicated ultraviolet lamp or other light sources.
  • the embodiment of the present application does not limit the type of the ultraviolet light source.
  • b) Set different ISO values and repeat step a) to obtain, for each ISO condition, the functional relationship V = fn(T) between the exposure time T and the ultraviolet intensity V, where n (a positive integer) indexes the n different ISO conditions and fn is the function that maps the exposure time T to the UV intensity V under the n-th ISO.
  • illustratively, V = a*T^3 + b*T^2 + c*T + d (Formula 1), where a, b, c, and d represent function parameter coefficients and T^3 denotes T to the third power.
  • illustratively, the parameters a/b/c/d are estimated by fitting the polynomial curve corresponding to Formula 1.
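As a sketch, the coefficients of Formula 1 can be estimated from the (V, T) calibration pairs by an ordinary least-squares polynomial fit; `numpy.polyfit` is used here as one reasonable choice, not necessarily what the application itself uses.

```python
import numpy as np

def fit_formula1(pairs):
    """Fit V = a*T^3 + b*T^2 + c*T + d from (V, T) calibration pairs."""
    v = np.array([p[0] for p in pairs], dtype=float)
    t = np.array([p[1] for p in pairs], dtype=float)
    a, b, c, d = np.polyfit(t, v, deg=3)   # highest-degree term first
    return a, b, c, d

def uv_intensity(t, coeffs):
    """Evaluate the fitted model fn(T) at exposure time t."""
    a, b, c, d = coeffs
    return a * t**3 + b * t**2 + c * t + d

# One model is fitted per ISO condition, for example:
# models = {iso: fit_formula1(calibrate_iso(cam, lamp, iso, vs)) for iso in isos}
```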
  • Formula 1 only exemplarily shows a possible mapping relationship between the ultraviolet intensity V and the exposure time T, and it should be understood that other mapping relationships between the ultraviolet intensity V and the exposure time T may also be used.
  • the embodiment of the present application does not limit the specific relationship between the two.
  • after the function between the ultraviolet intensity V and the exposure time shown in FIG. 10 is obtained, this function can be used as the model for detecting ultraviolet light.
  • the method for determining the environmental ultraviolet index based on the model shown in Figure 10 and the ultraviolet images collected by the ultraviolet camera is introduced as follows. As shown in Figure 11, the method includes:
  • S101. The electronic device collects a first image.
  • taking the case where the ultraviolet camera (an example of the first camera) and the color camera (an example of the second camera) collect images synchronously as an example, after the ultraviolet detection function is turned on, for example after the user clicks the ultraviolet detection control 403 shown in (a) of FIG. 12 (an example of the first instruction), the mobile phone calls the color camera to collect color images and calls the ultraviolet camera to collect ultraviolet images.
  • the mobile phone may display the collected color image 602 and the ultraviolet image 601 in the preview interface.
  • the electronic device can determine the ultraviolet index according to the first image. As a possible implementation manner, the electronic device determines the ultraviolet index according to the gray value of the pixel included in the first image.
  • the process for the electronic device to determine the ultraviolet index according to the first image includes the following steps S102-S105a (or S105b).
  • S102. The electronic device judges whether the average grayscale of the first image is the target grayscale. If yes, step S103a is executed; if not, step S103b is executed.
  • the target grayscale can be obtained according to statistics, or the target grayscale can be obtained according to empirical values.
  • the embodiment of the present application does not limit the specific setting manner of the target grayscale.
  • when the average grayscale of an ultraviolet image is the target grayscale, the ultraviolet image usually does not suffer from overexposure or underexposure.
  • considering that the calculation of the average grayscale may introduce errors, a numerical range can be set when judging whether the grayscale of an image is the target grayscale: if the grayscale of the image falls within that range, the grayscale of the image is called the target grayscale.
  • illustratively, the target grayscale range is 128 ± 5; if the grayscale of the image is within 128 ± 5, its grayscale is called the target grayscale.
  • by setting the target grayscale range, the average grayscale of the ultraviolet image on which ultraviolet detection is based is kept within a certain range, which avoids inaccurate detection caused by an overexposed or underexposed ultraviolet image.
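As a trivial sketch, the target-grayscale test then reduces to a range check (Gray = 128 and Q = 5 are the illustrative values above):

```python
def is_target_gray(mean_gray, gray=128.0, q=5.0):
    """True if the image's average grayscale lies within Gray +/- Q."""
    return abs(mean_gray - gray) <= q
```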
  • S103a. The electronic device obtains the ultraviolet camera parameters corresponding to the first image.
  • the parameters of the ultraviolet camera include but are not limited to exposure time and sensitivity.
  • illustratively, assuming the first image is the ultraviolet image 601 shown in (a) of FIG. 12 and its calculated average grayscale is within 128 ± 5, the ultraviolet camera parameters corresponding to the ultraviolet image 601 are obtained.
  • optionally, as shown in (a) of FIG. 12, after obtaining the ultraviolet camera parameters corresponding to the ultraviolet image 601 (for example, ISO = 125 and exposure time T = 1/100), the mobile phone may display these parameters on the interface, and may also display the average grayscale of the ultraviolet image 601.
  • S104a. Adjust the ultraviolet camera parameters so that the average grayscale of the third image obtained under the adjusted parameters is the target grayscale. The ultraviolet camera parameters corresponding to the third image are different from those corresponding to the first image.
  • illustratively, taking the first image as the ultraviolet image 601 shown in (a) of FIG. 12, whose average grayscale is the target grayscale, after obtaining the parameters corresponding to the image 601 the mobile phone adjusts the ultraviolet camera parameters according to a preset algorithm so that, under the adjusted parameters, the average grayscale of the obtained third image 603 is the target grayscale.
  • after obtaining the first image 601 and the third image 603 shown in FIG. 12, the mobile phone may execute step S105a.
  • the mobile phone may display the collected ultraviolet image 603 in a preview interface.
  • the mobile phone may display the ultraviolet camera parameters (an example of the third information) corresponding to the ultraviolet image 603 on the interface.
  • UV camera parameters include but not limited to sensitivity, exposure time.
  • the mobile phone may display the average grayscale of the ultraviolet image 603 on the interface.
  • as another example, taking the first image as the image 601 shown in (a) of FIG. 13, whose average grayscale is not the target grayscale: after collecting the image 601 and determining that its average grayscale is not within the target range, the mobile phone can adjust the UV camera parameters so that the average grayscale of the UV images collected under the various parameter conditions is always within the target grayscale range, preventing inaccurate detection caused by overexposure or underexposure.
  • the adjusted parameters of the ultraviolet camera are shown in (b) of FIG. 13 .
  • the mobile phone may display the fourth image 604 obtained under the parameter conditions of the ultraviolet camera in a preview interface as shown in (b) of FIG. 13 . It can be seen that the gray scale of the image 604 is different from that of the image 601 .
  • the mobile phone can perform one or more adjustments of the parameters of the ultraviolet camera referring to the solutions of the above-mentioned embodiments.
  • the parameters of the ultraviolet camera are adjusted to those shown in (c) of FIG. 13 .
  • the mobile phone may display the ultraviolet image 603 obtained under the conditions of the ultraviolet camera parameters in a preview interface as shown in (c) of FIG. 13 . It can be seen that the image 604 has the same gray scale as the image 603 .
  • after obtaining the fourth image 604 and the third image 603 shown in FIG. 13, the mobile phone may execute step S105b.
  • S105a. Determine the ultraviolet index according to the ultraviolet camera parameters corresponding to the first image, the ultraviolet camera parameters corresponding to the third image, the target grayscale, and the model.
  • illustratively, when the average grayscales of the ultraviolet image 601 and the ultraviolet image 603 shown in FIG. 12 are both the target grayscale, the ultraviolet camera parameters corresponding to the two images are substituted into the functions shown in FIG. 10 to determine the UV index.
  • for example, assuming G1 = 125 and G2 = 200 in FIG. 10, the parameters of the ultraviolet image 601 (ISO = 125, T = 1/100) determine the ultraviolet intensity V1 on the G1 curve, and the parameters of the ultraviolet image 603 (ISO = 200, T = 1/125) determine the ultraviolet intensity V2 on the G2 curve. For calculation accuracy, the final ultraviolet intensity can be determined according to V1 and V2.
  • the average value of V1 and V2 may be calculated, and the average value may be an arithmetic average value or a weighted average value, which is not limited in this embodiment of the present application.
  • the maximum or minimum value of multiple results can be used as the final UV index.
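A sketch of step S105a under the assumptions above, reusing the `uv_intensity` helper sketched after Formula 1: one fitted model per ISO, two (ISO, T) readings taken at the target grayscale, and the results averaged (an arithmetic mean is shown; a weighted mean, maximum, or minimum would work the same way).

```python
def uv_index_from_readings(readings, models):
    """readings: list of (iso, exposure_time) pairs whose images hit
    the target grayscale; models: dict iso -> (a, b, c, d) of Formula 1."""
    estimates = [uv_intensity(t, models[iso]) for iso, t in readings]
    return sum(estimates) / len(estimates)   # arithmetic mean of V1, V2, ...

# e.g. uv_index_from_readings([(125, 1/100), (200, 1/125)], models)
```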
  • S103b. Adjust the ultraviolet camera parameters so that the average grayscale of the fourth image obtained under the adjusted parameters is the target grayscale. As shown in (a) of FIG. 13, after the ultraviolet detection function is turned on, for example after the user clicks the ultraviolet detection control 403, the mobile phone calls the ultraviolet camera to collect the first image 601; after calculation, the brightness of the ultraviolet image 601 is too high, producing overexposure. The mobile phone can then adjust the parameters of the ultraviolet camera so that the brightness of the ultraviolet images collected after adjustment is within a preset range. Illustratively, after the parameters are adjusted, the ultraviolet image captured by the ultraviolet camera is shown as the ultraviolet image 604 in (b) of FIG. 13, whose average grayscale is the target grayscale.
  • in some embodiments, the mobile phone adjusts the ultraviolet camera parameters in fixed steps; under each adjusted parameter condition it calls the ultraviolet camera to collect an ultraviolet image and calculates its average grayscale. If the average grayscale falls within the target grayscale range, that parameter setting is determined to be the setting to adjust to; if not, the mobile phone continues adjusting the parameters, collecting ultraviolet images, and calculating the average grayscale until the average grayscale of the image obtained after adjustment is within the preset range, and that setting is determined to be the setting to adjust to. Or,
  • in other embodiments, the mobile phone invokes the ultraviolet camera to collect an ultraviolet image and calculates, from that image and an algorithm, the parameter values the ultraviolet camera should be adjusted to.
  • optionally, after calculating the parameter values, the mobile phone may collect one more ultraviolet image and compute its average grayscale to verify the accuracy of the calculated parameter values.
  • alternatively, other adjustment methods may be used; the embodiments of the present application do not limit how the ultraviolet camera parameters are adjusted.
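A minimal sketch of the step-wise variant, with an illustrative multiplicative step; the camera object and its methods are assumptions standing in for the real camera driver interface.

```python
def adjust_exposure_to_target(camera, gray=128.0, q=5.0, step=1.25, max_iters=20):
    """Step the exposure time until the UV image's mean gray is in Gray +/- Q.
    Returns the exposure time that reached the target range, or None."""
    t = camera.get_exposure()                  # assumed driver call
    for _ in range(max_iters):
        img = camera.capture(exposure=t)       # assumed driver call
        mean = float(img.mean())
        if abs(mean - gray) <= q:
            return t
        t = t * step if mean < gray else t / step  # too dark vs. too bright
    return None
```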
  • optionally, as shown in (b) of FIG. 13, after adjusting the ultraviolet camera parameters, the mobile phone may display the adjusted parameters on the interface.
  • illustratively, when adjusting the ultraviolet camera parameters, the mobile phone may fix the ISO (for example at G1) and adjust the exposure time through the algorithm to T1, stopping when the average gray value of the ultraviolet image reaches the target gray value under ISO = G1 and T = T1. The mobile phone may then set the ISO to G2 and repeat the same adjustment to obtain the exposure time T2 corresponding to G2, stopping when the average gray value again reaches the target under ISO = G2 and T = T2.
  • alternatively, the mobile phone may fix the exposure time (for example at T1) and adjust the ISO through the algorithm to G1, stopping when the average gray value of the ultraviolet image reaches the target; it may then set the exposure time to T2 and repeat the adjustment to obtain the sensitivity G2 corresponding to T2.
  • S104b. Adjust the ultraviolet camera parameters so that the average grayscale of the fifth image obtained under the adjusted parameters is the target grayscale.
  • S105b. Determine the ultraviolet index according to the ultraviolet camera parameters corresponding to the fourth image, the ultraviolet camera parameters corresponding to the fifth image, the target grayscale, and the model.
  • illustratively, when the average grayscales of the ultraviolet image 604 and the ultraviolet image 603 shown in FIG. 13 are both within the target grayscale range, the ultraviolet camera parameters corresponding to the two images are substituted into the functions shown in FIG. 10 to determine the UV index.
  • FIG. 12 and FIG. 13 are described using two parameter conditions that bring the average grayscale of the UV image into the target range (the UV camera parameters are adjusted once in FIG. 12 and twice in FIG. 13) as examples. In actual implementation, more or fewer parameter adjustments can be made to obtain the corresponding ultraviolet index under more or fewer parameter conditions, and the final comprehensive ultraviolet index is determined from the indexes under the different parameter conditions. Because different parameter conditions are taken into account, deviations of the index under some conditions can be corrected, making the final output more accurate.
  • Scheme 2. In this scheme, as shown in FIG. 14, multiple image features can be extracted from the ultraviolet image, the feature vector formed by these features is input into a classifier, and the classifier outputs the ultraviolet index.
  • the image features include but are not limited to one or more of the following features: the average gray value of the image area, the maximum gray value, the minimum gray value, the difference between the gray values of adjacent pixels in the image area, the image The gray value of each pixel in the area, the contrast of the image area, histogram, histogram of oriented gradients (HOG), standard deviation, color scale, mean square deviation, and variance.
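A sketch of a feature extractor covering a few of the listed features (mean, max, min, contrast, standard deviation, variance, histogram); which features to use, and how many histogram bins, are illustrative choices rather than anything fixed by this application.

```python
import numpy as np

def uv_feature_vector(img, hist_bins=16):
    """Build a feature vector from a UV image region (2-D uint8 array)."""
    g = img.astype(np.float64)
    hist, _ = np.histogram(g, bins=hist_bins, range=(0, 255), density=True)
    features = [
        g.mean(),            # average gray value of the region
        g.max(),             # maximum gray value
        g.min(),             # minimum gray value
        g.max() - g.min(),   # contrast (simple max-min definition)
        g.std(),             # standard deviation
        g.var(),             # variance
    ]
    return np.concatenate([features, hist])
```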
  • first, the training process of the classifier involved in the embodiments of the present application is introduced. As shown in FIG. 15, training a classifier for recognizing ultraviolet indicators requires N (N is a positive integer) samples, where each sample is the feature vector of an ultraviolet image whose ultraviolet index is known.
  • optionally, the training samples may also include a label for each ultraviolet image (representing the ultraviolet index corresponding to that image); the classifier is obtained by training on the multiple samples.
  • optionally, before the classifier is trained, data such as the training feature vectors may be processed, for example smoothed and normalized.
  • normalization can reduce the complexity of the algorithm; smoothing can further include operations such as noise reduction and fitting to reduce the impact of statistical errors.
  • optionally, to improve its recognition accuracy, the classifier may be evaluated and tested.
  • if the recognition rate of the classifier reaches a certain threshold, the classifier has been trained.
  • if the recognition rate of the classifier is low, training can continue until the recognition accuracy of the classifier reaches a certain threshold.
  • the training process of the classifier can be performed on the device side (such as a terminal such as a mobile phone) or on the cloud side (such as a server). Training can be offline training or online training. The embodiment of the present application does not limit the specific training method of the classifier. Subsequently, the trained classifier can output the ultraviolet index corresponding to the corresponding ultraviolet image according to the input feature vector of the ultraviolet image whose ultraviolet index is unknown.
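A minimal end-to-end sketch of Scheme 2 using scikit-learn; the random-forest choice, the discrete UV-index labels, and the 0.9 accuracy threshold are assumptions for illustration, since the application does not prescribe a particular classifier.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def train_uv_classifier(feature_vectors, labels, threshold=0.9):
    """feature_vectors: N x D array built with uv_feature_vector();
    labels: N known UV indexes. Returns the classifier if it passes
    the accuracy threshold on a held-out test split, else None."""
    x_tr, x_te, y_tr, y_te = train_test_split(
        feature_vectors, labels, test_size=0.2, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(x_tr, y_tr)
    acc = accuracy_score(y_te, clf.predict(x_te))
    return clf if acc >= threshold else None  # keep training otherwise

# Inference on a UV image with unknown index:
# uv_index = clf.predict([uv_feature_vector(img)])[0]
```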
  • in the above embodiments, when the mathematical model is built, the ultraviolet image is obtained by capturing ultraviolet light emitted directly by the ultraviolet light source. In other implementations, the model can also be built by capturing reflected light: an ultraviolet light source irradiates a target object, the electronic device captures the ultraviolet light reflected by the target object to form an ultraviolet image, and the mathematical model for detecting ultraviolet indicators is built according to the method above.
  • Target objects include but are not limited to greenery, sand, etc.
  • after the model is built in this way, detecting the ultraviolet index likewise requires pointing the electronic device at the target object for framing; that is, the electronic device is aimed at the target object, the ultraviolet camera is called to collect an ultraviolet image, and the ultraviolet index is determined from the collected image.
  • for some target objects, certain parts absorb more ultraviolet light and reflect little or none; for these parts little or no ultraviolet light passes into the ultraviolet camera, so they are usually not visible in the UV image. Other parts absorb less ultraviolet light and reflect more; for these parts more ultraviolet light passes into the camera, so they are usually visible in the UV image.
  • Scheme 1 and Scheme 2 above are only two exemplary algorithms; other algorithms, such as regression algorithms like logistic regression, or other classification algorithms, can also be used to detect the ultraviolet index in the embodiments of the present application. As long as an algorithm determines the ultraviolet indicator from the grayscale of the collected image, it falls within the scope of the technical solutions of the embodiments of the present application.
  • in addition, the embodiments of the present application do not limit the type and number of the ultraviolet camera parameters used to detect ultraviolet indicators: as long as a parameter is associated with the grayscale of the ultraviolet image, or the grayscale can be adjusted by adjusting the parameter, or a change of the parameter causes a corresponding change in the grayscale of the ultraviolet image, the parameter can be used in the technical solutions of the embodiments of the present application.
  • for example, in other embodiments, the aperture value of the ultraviolet camera can be used in the technical solution for detecting the ultraviolet index in the embodiments of the present application.
  • in some embodiments, the mobile phone can also give the user personalized ultraviolet sun-protection suggestions according to the ultraviolet index; for example, when the UV index is high, the user is prompted to apply sunscreen.
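A sketch of such a suggestion lookup; the index bands and the wording of the advice are illustrative assumptions (the application itself only gives the sunscreen example).

```python
def sun_protection_advice(uv_index):
    """Map a UV index to a personalized prompt (illustrative bands)."""
    if uv_index >= 8:
        return "Very strong UV: apply sunscreen and avoid midday sun."
    if uv_index >= 5:
        return "Strong UV: apply sunscreen before going outdoors."
    if uv_index >= 3:
        return "Moderate UV: sunscreen is recommended for long exposure."
    return "Low UV: no special protection needed."
```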
  • other embodiments of the present application provide an apparatus, which may be the above-mentioned electronic device (such as a folding-screen mobile phone).
  • the apparatus may include: a display screen, a memory, and one or more processors, with the display screen and the memory coupled to the processor.
  • the memory is used to store computer program code comprising computer instructions.
  • when the processor executes the computer instructions, the electronic device can execute the functions or steps performed by the mobile phone in the foregoing method embodiments.
  • for the structure of the electronic device, reference may be made to the electronic device shown in FIG. 2A-FIG. 2D and FIG. 3.
  • the core structure of the electronic device may be represented as the structure shown in FIG. 16 , and the core structure may include: a processing module 1301 , an input module 1302 , a storage module 1303 , and a display module 1304 .
  • the processing module 1301 may include at least one of a central processing unit (CPU), an application processor (Application Processor, AP) or a communication processor (Communication Processor, CP).
  • the processing module 1301 may perform operations or data processing related to control and/or communication of at least one of other elements of the user electronic device.
  • specifically, the processing module 1301 can be configured to control the content displayed on the main screen according to a certain trigger condition, or to determine what is displayed on the screen according to preset rules.
  • the processing module 1301 is also used to process input instructions or data and determine the display style according to the processed data.
  • the input module 1302 is configured to obtain instructions or data input by the user, and transmit the obtained instructions or data to other modules of the electronic device.
  • the input mode of the input module 1302 may include touch, gesture, screen approach, etc., and may also be voice input.
  • for example, the input module may be the screen of the electronic device, which acquires the user's input operations, generates input signals according to the acquired input operations, and transmits the input signals to the processing module 1301.
  • the input module may be used to receive the first instruction input by the user, and/or perform other steps.
  • the collection module 1306 is configured to collect data and transmit the collected data to other modules of the electronic device.
  • the collection module 1306 may be a camera of the electronic device, and the camera may transmit the collected images to the processing module 1301, and/or perform other steps.
  • Cameras include but are not limited to color cameras and ultraviolet cameras.
  • the storage module 1303 may include a volatile memory and/or a nonvolatile memory.
  • the storage module is used to store at least one instruction or data related to other modules of the user terminal device. Specifically, the storage module can record images collected by the camera.
  • the display module 1304 may include, for example, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, an Organic Light Emitting Diode (OLED) display, a Micro Electro Mechanical System (MEMS) display or an electronic paper display. Used to display user-viewable content (eg, text, images, videos, icons, symbols, etc.). In the embodiment of the present application, the display module may be implemented as a display screen.
  • optionally, the structure shown in FIG. 16 may further include a communication module 1305, which is used to support the electronic device in communicating with other electronic devices.
  • the communication module can be connected to a network via wireless communication or wired communication to communicate with other personal terminals or a network server.
  • the wireless communication may employ at least one of cellular communication protocols, such as Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Universal Mobile Communications System (UMTS), Wireless Broadband (WiBro), or Global System for Mobile Communications (GSM).
  • Wireless communications may include, for example, short-range communications.
  • the short-range communication may include at least one of wireless fidelity (Wi-Fi), Bluetooth, near field communication (NFC), magnetic stripe transmission (MST), or GNSS.
  • the embodiment of the present application also provides a chip system, as shown in FIG. 17 , the chip system includes at least one processor 1401 and at least one interface circuit 1402 .
  • the processor 1401 and the interface circuit 1402 may be interconnected through wires.
  • interface circuit 1402 may be used to receive signals from other devices, such as memory of an electronic device.
  • the interface circuit 1402 may be used to send signals to other devices (such as the processor 1401).
  • the interface circuit 1402 can read instructions stored in the memory, and send the instructions to the processor 1401 .
  • when the instructions are executed by the processor 1401, the electronic device may be made to execute the various steps in the foregoing embodiments.
  • the chip system may also include other discrete devices, which is not specifically limited in this embodiment of the present application.
  • the embodiment of the present application also provides a computer storage medium, the computer storage medium includes computer instructions, and when the computer instructions are run on the above-mentioned electronic device, the electronic device is made to perform the various functions or steps performed by the mobile phone in the above-mentioned method embodiment .
  • the embodiment of the present application also provides a computer program product, which, when the computer program product is run on a computer, causes the computer to execute each function or step performed by the mobile phone in the method embodiment above.
  • the disclosed devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division into modules or units is only a division by logical function; in actual implementation there may be other division methods.
  • for example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • a unit described as a separate component may or may not be physically separate, and a component displayed as a unit may be one physical unit or multiple physical units; that is, it may be located in one place or distributed across multiple different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solutions of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium.
  • based on this understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, can be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions to make a device (which may be a single-chip microcomputer, a chip, etc.) or a processor execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes: various media that can store program codes such as U disk, mobile hard disk, read only memory (ROM), random access memory (random access memory, RAM), magnetic disk or optical disk.

Abstract

An ultraviolet detection method and an electronic device (100), relating to the field of terminal technology and capable of improving the detection accuracy of ultraviolet indicators. The detection method is applied to the electronic device (100), which includes a first camera (193), the first camera (193) including an ultraviolet camera. The method includes: detecting a first instruction, and after the first instruction is detected, collecting a first image (601) through the first camera (193); determining an ultraviolet index according to the first image (601), and displaying first information. The first instruction is used to instruct detection of the ultraviolet index; the first information includes information about the ultraviolet index.

Description

Ultraviolet detection method and electronic device

This application claims priority to the Chinese patent application No. 202111229520.6, filed with the China National Intellectual Property Administration on October 21, 2021 and entitled "Ultraviolet detection method and electronic device", which is incorporated herein by reference in its entirety.

Technical Field

This application relates to the field of terminal technology, and in particular to an ultraviolet detection method and an electronic device.

Background

At present, more and more users pay attention to sun protection for the skin. Sun protection includes methods and measures against ultraviolet rays, such as applying sunscreen to resist ultraviolet light. In some cases, a user can learn the ultraviolet index of the environment from the weather forecast and use that index to guide sun protection.

However, the ultraviolet index reported by a weather forecast usually gives only a rough picture of the weather conditions within a large area and cannot give the accurate ultraviolet conditions at the user's location. A solution that can accurately measure ultraviolet indicators is therefore urgently needed.

Summary of the Invention

This application provides an ultraviolet detection method and an electronic device, which can improve the detection accuracy of ultraviolet indicators.

To achieve the above purpose, the embodiments of this application provide the following technical solutions:

In a first aspect, an ultraviolet detection method is provided. The method is applied to an electronic device or a component capable of realizing the functions of an electronic device (such as a chip system). The electronic device includes a first camera, and the first camera includes an ultraviolet camera. The method includes: detecting a first instruction, and after the first instruction is detected, collecting a first image through the first camera; determining an ultraviolet index according to the first image, and displaying first information. The first instruction is used to instruct detection of the ultraviolet index; the first information includes information about the ultraviolet index.

Compared with a weather forecast, which can only roughly give the ultraviolet index within a large geographical area, with the technical solutions of the embodiments of this application the electronic device can collect an ultraviolet image through the ultraviolet camera, determine from it the real-time ultraviolet index at the user's location, and prompt the user with that index. Since the ultraviolet index is obtained from an ultraviolet image collected in real time at the user's location, the detection result is more accurate.
In a possible design, before the ultraviolet index is determined according to the first image, second information is displayed to prompt the user to point the first camera at a target object, the target object including the sky.

Because a camera captures different light at different shooting angles and directions, the imaged content differs accordingly. In the embodiments of this application, the user can adjust the shooting direction and angle of the ultraviolet camera so that it faces, as far as possible, a position with little ultraviolet reflection and scattering, thereby collecting ultraviolet light more effectively and improving detection accuracy.

In a possible design, the electronic device further includes a second camera, and the method further includes: collecting a second image through the second camera; the second camera includes a color camera;

after the first image is collected through the first camera, the method further includes:

displaying a user interface, the user interface including part of the content of the first image and/or part of the content of the second image.

In a possible design, after the first image is collected through the first camera, third information is displayed, the third information including parameters of the first camera; the parameters include sensitivity and exposure time.

In a possible design, collecting the second image through the second camera includes: collecting the second image through the second camera while the first image is collected through the first camera;

after the second image is collected through the second camera, the method further includes: judging whether the second image satisfies a first condition; when it is determined that the second image satisfies the first condition, determining the ultraviolet index according to the first image. The first condition includes: a target object exists in the second image; or, the first condition includes: the target object exists in the second image, and the proportion of the area of the target object in the second image is greater than or equal to a threshold.

Since a color image usually has more imaging detail and higher quality, the target object (such as the sky) can be detected more accurately from the color image collected by the color camera.

In a possible design, collecting the second image through the second camera includes: collecting the second image after the first instruction is detected and before the first image is collected through the first camera;

after the second image is collected through the second camera, the method includes: judging whether the second image satisfies the first condition;

collecting the first image through the first camera includes: collecting the first image through the first camera when it is determined that the second image satisfies the first condition.
In a possible design, determining the ultraviolet index according to the first image includes: determining the ultraviolet index according to the grayscale values of the pixels included in the first image.

In a possible design, after the first image is collected through the first camera and before the ultraviolet index is determined according to it, the method further includes:

if the grayscale value of the first image is the target grayscale, adjusting the parameters of the first camera and collecting a third image through the first camera, where the grayscale of the third image is the target grayscale;

determining the ultraviolet index according to the first image includes: determining the ultraviolet index according to the first-camera parameters corresponding to the first image, the first-camera parameters corresponding to the third image, and the target grayscale.

In a possible design, after the first image is collected through the first camera and before the ultraviolet index is determined according to it, the method further includes:

if the grayscale value of the first image is not the target grayscale, adjusting the parameters of the first camera and collecting a fourth image, the grayscale of the fourth image being the target grayscale;

adjusting the parameters of the first camera and collecting a fifth image, the grayscale of the fifth image being the target grayscale;

determining the ultraviolet index according to the first image includes: determining the ultraviolet index according to the first-camera parameters corresponding to the fourth image, the first-camera parameters corresponding to the fifth image, and the target grayscale.

By adjusting the parameters of the ultraviolet camera during image collection, the grayscale value of the collected ultraviolet image can be adjusted, and the ultraviolet index at the location of the electronic device can then be determined according to the grayscale value.

In a possible design, the parameters of the first camera include exposure time and sensitivity.

In a possible design, since different users are in different locations, the detected ultraviolet indexes may also differ. Based on the ultraviolet index prompted by the electronic device, personalized sun-protection guidance can be given to different users to reduce the risk of sunburn and skin cancer. Optionally, the electronic device may further detect the user's sunscreen application according to the ultraviolet index and prompt the user with the detection result, so that the user can guide daily sun protection accordingly.
In a second aspect, an electronic device is provided. The electronic device includes a first camera, the first camera includes an ultraviolet camera, and the electronic device includes:

a processing module, configured to detect a first instruction input by a user through an input module;

the first camera, configured to collect a first image after the processing module detects the first instruction;

the processing module, configured to determine an ultraviolet index according to the first image;

a display module, configured to display first information; the first instruction is used to instruct detection of the ultraviolet index; the first information includes information about the ultraviolet index.

In a possible design, the display module is further configured to display second information before the processing module determines the ultraviolet index according to the first image, the second information being used to prompt pointing the first camera at a target object, the target object including the sky.

In a possible design, the electronic device further includes a second camera, configured to collect a second image; the second camera includes a color camera;

the display module is further configured to display, after the first image is collected through the first camera, a user interface including part of the content of the first image and/or part of the content of the second image.

In a possible design, the display module is further configured to display third information after the first image is collected through the first camera, the third information including parameters of the first camera; the parameters include sensitivity and exposure time.

In a possible design, the second camera being configured to collect the second image includes: collecting the second image through the second camera while the first image is collected through the first camera;

the processing module is further configured to judge, after the second image is collected through the second camera, whether the second image satisfies a first condition, and to determine the ultraviolet index according to the first image when the second image satisfies the first condition; the first condition includes: a target object exists in the second image; or, the first condition includes: the target object exists in the second image and the proportion of its area in the second image is greater than or equal to a threshold.

In a possible design, the second camera being configured to collect the second image includes: collecting the second image after the processing module detects the first instruction and before the first camera collects the first image;

the processing module is further configured to judge, after the second camera collects the second image, whether the second image satisfies the first condition;

the first camera being configured to collect the first image includes: collecting the first image when it is determined that the second image satisfies the first condition.
In a possible design, the processing module being configured to determine the ultraviolet index according to the first image includes: determining the ultraviolet index according to the grayscale values of the pixels included in the first image.

In a possible design, the processing module is further configured to, if the grayscale value of the first image is the target grayscale, adjust the parameters of the first camera after the first image is collected and before the ultraviolet index is determined, and to call the first camera under the adjusted parameters to collect a third image, where the grayscale of the third image is the target grayscale;

the processing module being configured to determine the ultraviolet index according to the first image includes: determining the ultraviolet index according to the first-camera parameters corresponding to the first image, the first-camera parameters corresponding to the third image, and the target grayscale.

In a possible design, the processing module is further configured to, after the first image is collected through the first camera, if the grayscale value of the first image is not the target grayscale, adjust the parameters of the first camera and call the first camera under the adjusted parameters to collect a fourth image, the grayscale of the fourth image being the target grayscale;

the processing module is further configured to adjust the parameters of the first camera and collect a fifth image, the grayscale of the fifth image being the target grayscale;

the processing module being configured to determine the ultraviolet index according to the first image includes: determining the ultraviolet index according to the first-camera parameters corresponding to the fourth image, the first-camera parameters corresponding to the fifth image, and the target grayscale.

In a possible design, the parameters of the first camera include exposure time and sensitivity.

In a possible design, the electronic device may further detect the user's sunscreen application according to the ultraviolet index and prompt the user with the detection result, so that the user can guide daily sun protection accordingly.

In a third aspect, this application provides an electronic device having the function of implementing the detection method of any of the above aspects and any of their possible implementations. The function may be implemented by hardware, or by hardware executing corresponding software; the hardware or software includes one or more modules corresponding to the above function.

In a fourth aspect, this application provides a computer-readable storage medium including computer instructions which, when run on an electronic device, cause the electronic device to perform the detection method of any of the aspects and any of their possible implementations.

In a fifth aspect, this application provides a computer program product which, when run on an electronic device, causes the electronic device to perform the detection method of any of the aspects and any of their possible implementations.

In a sixth aspect, a circuit system is provided, the circuit system including a processing circuit configured to perform the detection method of any of the above aspects and any of their possible implementations.

In a seventh aspect, an embodiment of this application provides a chip system including at least one processor and at least one interface circuit; the at least one interface circuit is used to perform transceiver functions and send instructions to the at least one processor, and when the at least one processor executes the instructions, it performs the detection method of any of the above aspects and any of their possible implementations.
Brief Description of the Drawings

FIG. 1 is a schematic diagram of the ultraviolet imaging principle provided by an embodiment of this application;

FIG. 2A-FIG. 2D are schematic diagrams of forms of the electronic device provided by embodiments of this application;

FIG. 3 is a schematic structural diagram of the electronic device provided by an embodiment of this application;

FIG. 4 is a schematic diagram of the software architecture of the electronic device provided by an embodiment of this application;

FIG. 5 is an interface diagram provided by an embodiment of this application;

FIG. 6A and FIG. 6B are interface diagrams provided by embodiments of this application;

FIG. 7-FIG. 9 are interface diagrams provided by embodiments of this application;

FIG. 10 is a schematic diagram of the relationship between camera parameters and ultraviolet indicators provided by an embodiment of this application;

FIG. 11 is a schematic flowchart of a method provided by an embodiment of this application;

FIG. 12 and FIG. 13 are schematic interface diagrams provided by embodiments of this application;

FIG. 14 is a schematic flowchart of a detection method provided by an embodiment of this application;

FIG. 15 is a schematic diagram of the training and use of a classifier provided by an embodiment of this application;

FIG. 16 is a schematic diagram of an apparatus provided by an embodiment of this application;

FIG. 17 is a schematic diagram of a chip system provided by an embodiment of this application.
Detailed Description of Embodiments

First, some technical terms involved in the embodiments of this application are introduced.

Ultraviolet imaging: the wavelength of ultraviolet light lies outside the visible band, so ultraviolet light is invisible to the human eye. In some solutions, ultraviolet light can be used as the light source and an ultraviolet imaging system used for shooting; this technique of capturing ultraviolet light and forming an image with an ultraviolet imaging system may be called ultraviolet imaging. Through ultraviolet imaging, the user can observe the ultraviolet image formed by an object under ultraviolet illumination.

Unlike a red-green-blue (RGB) image, which usually has three color channels, an ultraviolet image is usually a grayscale image with one color channel, namely the gray channel.

An ultraviolet imaging system includes an ultraviolet camera module, which may be, for example, a camera capable of capturing ultraviolet light, called an ultraviolet camera. As shown in FIG. 1, an ultraviolet camera may include components such as a lens and an image sensor. The lens may be made of a material, such as but not limited to quartz glass, that transmits ultraviolet light. The image sensor records the information of the ultraviolet light passing through the lens.

In the embodiments of this application, considering the characteristics of an ultraviolet camera, the ultraviolet camera can be applied to ultraviolet detection (including but not limited to ultraviolet intensity detection): an ultraviolet image is collected by the ultraviolet camera, and the ultraviolet index corresponding to that image is determined according to the features of the image.

For example, under otherwise equal conditions, the stronger the ultraviolet light, the more light enters through the ultraviolet lens and, correspondingly, the lighter the color of the ultraviolet image formed on the image sensor. Therefore, in some embodiments of this application, the strength of ultraviolet light can be judged from the color depth of the ultraviolet image. Illustratively, the amount of incident ultraviolet light shown in (1) of FIG. 1 is larger than that shown in (2) of FIG. 1; therefore, in (1) of FIG. 1, the ultraviolet image formed by the ultraviolet camera is brighter and lighter in color.
An embodiment of this application provides a detection method that can be applied to an electronic device on which an ultraviolet camera with a structure such as that shown in FIG. 1 is installed. The electronic device in the embodiments of this application may be a mobile phone, a tablet computer, a wearable device, a vehicle-mounted device, a notebook computer, etc.; the mobile phone may be a folding-screen phone or a non-folding-screen phone. The embodiments of this application place no restriction on the specific type of the electronic device.

Illustratively, the layout of the cameras on the electronic device 100 can be seen in FIG. 2A, where the front of the electronic device 100 is the plane where the display screen 194 is located. As shown in (a) of FIG. 2A, camera 1931 is located on the front of the electronic device 100 and is thus a front-facing camera. As shown in (b) of FIG. 2A, camera 1932 is located on the back of the electronic device 100 and is thus a rear-facing camera.

In some embodiments of this application, the cameras may include an ultraviolet camera and a color camera. The color camera includes but is not limited to an RGB camera.

Optionally, the solutions of the embodiments of this application can be applied to an electronic device 100 with a folding screen (that is, a display that can be folded) having multiple displays.

In some embodiments, the folding screen may be a flexible folding screen that includes a folding axis made of flexible material. Part or all of the flexible folding screen is made of flexible material; for example, only the foldable part (such as the folding axis) is flexible while the other parts are rigid, or the whole screen is flexible. The folding screen can be folded along the folding axis to form at least two sub-screens.

Illustratively, (c) of FIG. 2A shows a folding-screen electronic device 100. In response to a user operation, as shown in (d) of FIG. 2A, the foldable screen is folded inward (or outward) along the folding edge so that it forms at least two sub-screens (for example sub-screen A and sub-screen B). Optionally, as shown in (e) of FIG. 2A, there is a display (for example screen C) on the outside of the fold. If the electronic device 100 has a camera on the surface where screen C is located, then in the unfolded scenario shown in (c) of FIG. 2A the camera on screen C is on the back of the device and can be regarded as a rear camera, while in the folded scenario shown in (e) of FIG. 2A it is on the front and can be regarded as a front camera. That is, "front camera" and "rear camera" in this application do not restrict the nature of the camera itself, and are only a description of a positional relationship.

Taking the folding-screen phone in FIG. 2A as an example, the ultraviolet camera may be arranged on, for example, screen C (as a hole-punch or under-display camera).

Optionally, the folding screen of a folding-screen electronic device may form multiple (for example 2 or 3) sub-screens. Illustratively, the flexible folding screen shown in (1) of FIG. 2B may include folding line 030 and folding line 031; after being folded longitudinally along folding line 030, it may form sub-screen 032, sub-screen 033, and sub-screen 034 as shown in (2) of FIG. 2B.

Optionally, the screen arrangement of a folding-screen electronic device may be an up-down arrangement such as that shown in (d) of FIG. 2A, or a left-right arrangement such as that shown in (1) or (2) of FIG. 2C; the embodiments of this application do not limit the screen arrangement. The flexible folding screen shown in (1) of FIG. 2C, after being folded laterally along folding line 040, may form sub-screen 041 and sub-screen 042 as shown in (2) of FIG. 2C. For example, the folding-screen device may be a mobile phone in the folded state (see the state shown in (1) of FIG. 2D) and a tablet computer in the unfolded state.

In the embodiments of this application, the folding screen may be a single-sided screen (only one side can display a user interface) or a double-sided screen (both opposite sides can display a user interface).

For a single-sided folding screen, folding toward the side that can display the user interface (the front of the screen) may be called forward folding, and folding toward the opposite side (the back of the screen) may be called reverse folding. For example, (2) of FIG. 2C shows forward folding and (2) of FIG. 2D shows reverse folding. The folding-screen device can determine whether the current fold is forward or reverse.
Illustratively, FIG. 3 shows a schematic structural diagram of the electronic device 100. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, antenna 1, antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.

The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). Different processing units may be independent devices or may be integrated in one or more processors.

The controller may be the nerve center and command center of the electronic device 100. The controller can generate operation control signals according to instruction opcodes and timing signals to control instruction fetching and execution.

A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache, which can hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instruction or data again, it can be called directly from this memory, avoiding repeated accesses, reducing the waiting time of the processor 110, and thus improving the efficiency of the system.

In some embodiments of this application, the processor 110 is used to call the ultraviolet camera to collect ultraviolet images and determine the ultraviolet index according to the ultraviolet images.

The charging management module 140 is used to receive charging input from a charger.

The power management module 141 is used to connect the battery 142 and the charging management module 140 with the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110, the display screen 194, the camera 193, and the like.

The wireless communication function of the electronic device 100 can be implemented through antenna 1, antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and so on.

The mobile communication module 150 can provide wireless communication solutions, including 2G/3G/4G/5G, applied on the electronic device 100. The wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network) and Bluetooth (BT).

The electronic device 100 implements the display function through the GPU, the display screen 194, the application processor, and so on. The GPU is a microprocessor for image processing, connecting the display screen 194 and the application processor; it is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.

The display screen 194 is used to display images, videos, and the like, and includes a display panel. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.

The electronic device 100 can implement the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and so on.

The ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, light is transmitted through the lens to the camera's photosensitive element, the light signal is converted into an electrical signal, and the photosensitive element passes the electrical signal to the ISP, which converts it into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and skin tone of the image through algorithms, as well as parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193; for example, the ISP may control the photosensitive element to expose and shoot according to the shooting parameters.

The camera 193 is used to capture still images or videos. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the light signal into an electrical signal and passes it to the ISP, which converts it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts it into an image signal in a standard format such as RGB or YUV.

In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1. The camera 193 may be located in an edge area of the electronic device and may be an under-display camera, a liftable camera, or a hole-punch camera. The camera 193 may include a rear camera and may also include a front camera. The embodiments of this application do not limit the specific position and form of the camera 193.
In the embodiments of this application, the camera 193 includes an ordinary camera and an ultraviolet camera.

The ordinary camera includes a color camera, which may be, for example but not limited to, an RGB camera.

In the embodiments of this application, the color camera can be used to collect images, and the collected images can be used to determine whether objects such as the sky exist; the collected color images can also be used to mark ultraviolet indicators.

The ultraviolet camera can be used to collect ultraviolet images, and the electronic device 100 can detect ultraviolet indicators according to the collected ultraviolet images.

The digital signal processor is used to process digital signals; besides digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform and the like on the frequency point energy.

The video codec is used to compress or decompress digital video. The electronic device 100 can support one or more video codecs, so that it can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.

The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons of the human brain, it processes input information quickly and can also learn continuously. Applications such as intelligent cognition of the electronic device 100, for example image recognition, face recognition, speech recognition, and text understanding, can be implemented through the NPU.

In some embodiments, the NPU uses image recognition technology to identify whether the image collected by the camera 193 contains an image of the sky.

The external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example saving files such as music and videos in the external memory card.

The internal memory 121 can be used to store computer-executable program code, which includes instructions. The processor 110 executes the various functional applications and data processing of the electronic device 100 by running the instructions stored in the internal memory 121 and/or the instructions stored in the memory provided in the processor.

The electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and so on.

The audio module 170 is used to convert digital audio data into an analog audio electrical signal for output, and also to convert an analog audio electrical signal input into digital audio data; it may include an analog-to-digital converter and a digital-to-analog converter. For example, the audio module 170 converts the analog audio electrical signal output by the microphone 170C into digital audio data. The audio module 170 can also be used to encode and decode audio data. In some embodiments, the audio module 170 may be provided in the processor 110, or some functional modules of the audio module 170 may be provided in the processor 110.

The speaker 170A, also called a "horn", is used to convert an analog audio electrical signal into a sound signal. The electronic device 100 can play music or hands-free calls through the speaker 170A.

The receiver 170B, also called an "earpiece", is used to convert an analog audio electrical signal into a sound signal. When the electronic device 100 answers a call or a voice message, the voice can be heard by bringing the receiver 170B close to the ear.

The microphone 170C, also called a "mic" or "mouthpiece", is used to convert a sound signal into an analog audio electrical signal. When making a call or sending a voice message, the user can speak close to the microphone 170C to input the sound signal. The microphone 170C may be a built-in component of the electronic device 100 or an external accessory.

In some embodiments, the electronic device 100 may include one or more microphones 170C, where each microphone, or several microphones working together, can collect sound signals from various directions and convert them into analog audio electrical signals, and can also implement noise reduction, sound source identification, directional recording, and other functions.

The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and so on.

It can be understood that the structure illustrated in the embodiments of this application does not constitute a specific limitation on the electronic device 100. In other embodiments, the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of the present invention take the layered architecture as an example to illustrate the software structure of the electronic device 100.

FIG. 4 is a block diagram of the software structure of the electronic device 100 according to an embodiment of the present invention.

The layered architecture divides the software into several layers, each with a clear role and division of labor, and the layers communicate through software interfaces. In some embodiments, the operating system of the electronic device (for example, the Android system) is divided into four layers: from bottom to top, the kernel layer, the hardware abstraction layer (HAL), the application framework layer, and the application layer.

The kernel layer is the layer between hardware and software. It contains at least a camera driver, an audio driver, a display driver, and sensor drivers; the sensor drivers include but are not limited to an image sensor driver and a sound sensor driver.

In some embodiments, for example in a sky recognition scenario, a kernel-layer driver such as the camera driver is called to turn on the camera. As another example, the image sensor is called through the image sensor driver to complete image collection.

The hardware abstraction layer (HAL) is located between the kernel layer and the application framework layer and is used to define the interfaces through which applications drive the hardware, converting the values of the hardware implementation into a software program language. For example, it recognizes the values of the camera driver, converts them into a software program language, and uploads them to the application framework layer so that the corresponding functions can be called.

In some embodiments, the HAL can upload the sky image collected by the camera 193 to the application framework layer for further processing.

The application framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer. The application framework layer obtains the original input events from the kernel layer via the HAL and identifies the controls corresponding to the input events. The application framework layer includes some predefined functions.

As shown in FIG. 4, the application framework layer may include a view system, a telephony manager, a resource manager, a notification manager, a window manager, and so on.

In some embodiments, the application framework layer includes a first module, which is used to call the ultraviolet camera through the camera driver to collect ultraviolet images and determine the ultraviolet index according to the ultraviolet images.

Optionally, the first module may also be arranged in other layers, and it may be divided into more sub-modules, each sub-module being used to execute a corresponding function.
The view system includes visual controls, such as controls for displaying text and controls for displaying pictures, and can be used to build applications. A display interface may consist of one or more views; for example, a display interface that includes a short-message notification icon may include a view displaying text and a view displaying pictures.

The telephony manager is used to provide the communication functions of the electronic device 100, for example managing the call state (including connecting, hanging up, and so on).

The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files.

The notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages that can disappear automatically after a short stay without user interaction, for example to notify that a download is complete or to give message reminders. The notification manager may also present notifications in the form of charts or scroll-bar text in the status bar at the top of the system, for example notifications of applications running in the background, or notifications that appear on the screen in the form of dialog windows, for example text prompts in the status bar, prompt tones, vibration of the electronic device, and blinking of the indicator light.

The window manager is used to manage window programs. It can obtain the display size, judge whether there is a status bar, lock the screen, capture the screen, and so on.

The application layer may include a series of application packages.

As shown in FIG. 4, the application packages may include applications such as camera, video, call, WLAN, music, short message, Bluetooth, map, calendar, gallery, and navigation.

The application layer and the application framework layer run in a virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files and performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
In the following, the electronic device is taken to be a mobile phone with a structure such as that shown in FIG. 4. In some embodiments, the mobile phone may start the ultraviolet detection method of the embodiments of this application when it detects that a preset condition is satisfied. Optionally, the preset condition may include but is not limited to any one or more of the following: a preset application is opened, where the preset application may include but is not limited to an application dedicated to ultraviolet detection. Alternatively, the mobile phone may start the ultraviolet detection method of the embodiments of this application after detecting that the user has input a preset instruction to the mobile phone. The embodiments of this application do not limit the way in which the ultraviolet detection function is turned on.

In the following, the technical solutions of the embodiments of this application are introduced by taking as an example turning on the ultraviolet detection method through a switch for ultraviolet detection in the camera application.

Optionally, the user can instruct the mobile phone to start the camera application and display the shooting preview interface through a touch operation, a button operation, an air-gesture operation, a voice operation, or the like. Illustratively, the mobile phone displays the interface shown in (a) of FIG. 5, which includes the icon 401 of the camera application. When detecting a user operation such as clicking the camera application icon, the mobile phone starts the camera application, calls an ordinary camera (such as the RGB camera) to collect images, and displays the RGB images collected by the RGB camera on the shooting preview interface 402 shown in (b) of FIG. 5. Alternatively, when detecting the user's voice instruction to open the camera application, the mobile phone starts the camera application and calls the RGB camera to capture images, obtaining the shooting preview interface 402 shown in (b) of FIG. 5. The shooting preview interface may include one or more controls for implementing different functions, for example an ultraviolet detection control 403, which can be used to turn on the ultraviolet detection function of the embodiments of this application, or to instruct detection of the ultraviolet index.

Optionally, the shooting preview interface 402 may also include other controls, for example a control for turning the filter function on or off, a control for controlling photographing, a control for adjusting the focal length, and a control for adjusting the screen brightness.

In some embodiments, as shown in (b) of FIG. 5, if a user operation such as clicking the ultraviolet detection control 403 is detected, the mobile phone can call the color camera to collect color images and display them in the preview interface 402 shown in (b) of FIG. 5. The mobile phone can also call the ultraviolet camera to collect ultraviolet images while the color camera collects color images, and determine the ultraviolet index according to the collected ultraviolet images. Optionally, the ultraviolet index includes but is not limited to ultraviolet intensity, UV index, and ultraviolet level. Optionally, the mobile phone may also directly display the ultraviolet images collected by the ultraviolet camera.

Optionally, during the determination of the ultraviolet index, the mobile phone may display prompt information 404 as shown in (b) of FIG. 5, used to prompt the user that the mobile phone is detecting the ultraviolet index (such as the ultraviolet intensity).

After determining the ultraviolet index, the mobile phone can prompt the user with the detected ultraviolet index. Illustratively, as shown in (c) of FIG. 5, the mobile phone displays prompt information 405 indicating the detected UV index of 8 and the corresponding ultraviolet intensity.

With the technical solutions of the embodiments of this application, the electronic device can conveniently and quickly obtain the ultraviolet index (such as the ultraviolet intensity) at the current location by collecting images.
In other embodiments, the mobile phone may also display the ultraviolet image and the color image in the preview interface at the same time. Illustratively, after detecting that the user clicks the camera application icon 401 shown in FIG. 5, the mobile phone displays the preview interface shown in (a) of FIG. 6A. As shown in (a) of FIG. 6A, if a user operation such as clicking the ultraviolet detection control 403 is detected, the mobile phone can call the color camera to collect color images and call the ultraviolet camera to collect ultraviolet images, and display both in the preview interface shown in (a) of FIG. 6A. The color image can be used to identify the shooting object, and the ultraviolet image can be used to identify the ultraviolet index.

Optionally, as shown in (b) of FIG. 6A, when the ultraviolet image and the color image are displayed in the same preview interface, controls 605 and 606 may also be displayed: control 605 is used to control the display size of the ultraviolet image 603, and control 606 is used to control the display size of the color image 602.

In other embodiments, the mobile phone may display the ultraviolet image and the color image in different preview interfaces, and the user can switch the image displayed in the current preview interface. Illustratively, as shown in (a) of FIG. 6B, after detecting that the user clicks the ultraviolet detection control 403, the mobile phone calls the color camera to collect color images and the ultraviolet camera to collect ultraviolet images, and displays the color image 402 in the preview interface. After detecting that the user clicks the view-switching control 406, the mobile phone can display the ultraviolet image 407 shown in (b) of FIG. 6B in the preview interface. For the meaning of (c) of FIG. 6B, refer to the related description of (c) of FIG. 5, which will not be repeated here.

Optionally, after detecting that the user clicks the ultraviolet detection control 403 shown in (a) of FIG. 6B, the mobile phone collects color images and ultraviolet images and displays the color image in the preview interface by default. Alternatively, the mobile phone collects color images and ultraviolet images and displays in the preview interface the image of the view the user had selected when the camera application was last exited. For example, if the user chose to display the ultraviolet image in the preview interface the last time the camera was used, then after detecting the click on the ultraviolet detection control 403 shown in (a) of FIG. 6B this time, the mobile phone collects color and ultraviolet images and displays the ultraviolet image in the preview interface.
It can be understood that a camera captures different light at different shooting angles and directions, so the imaged content differs. In the embodiments of this application, the user can adjust the shooting direction and angle of the ultraviolet camera so that it faces, as far as possible, a position with little ultraviolet reflection and scattering, so as to collect ultraviolet light more effectively and improve detection accuracy.

Optionally, positions with little ultraviolet reflection and scattering include but are not limited to the sky.

In some embodiments, after the user clicks the ultraviolet detection control 403 shown in (a) of FIG. 7, the mobile phone can call the color camera to collect a color image and judge whether the color image satisfies a first condition; when it is determined that the second image satisfies the first condition, the ultraviolet index is determined according to the first image. The first condition includes: a target object exists in the second image; or, the first condition includes: the target object exists in the second image, and the proportion of the area of the target object in the second image is greater than or equal to a threshold.

Taking the sky as the target object as an example, after the user clicks the ultraviolet detection control 403 shown in (a) of FIG. 7, the mobile phone can call the color camera to collect a color image and determine whether the sky exists in it. Since a color image usually has more imaging detail and higher quality, the target object (such as the sky) can be detected more accurately from the color image collected by the color camera.

In some embodiments, after detecting that the user clicks the ultraviolet detection control 403 shown in (a) of FIG. 7, the mobile phone may display prompt information 701 (an example of the second information) prompting the user to point the camera (the first camera) at the sky for framing. The mobile phone first collects a color image through the color camera; when it determines that the sky exists in the collected color image (that is, the first condition is satisfied), the mobile phone can call the ultraviolet camera to collect an ultraviolet image and determine the ultraviolet index according to it. Alternatively, after detecting the click on the ultraviolet detection control 403 shown in (a) of FIG. 7 and displaying the prompt information 701, the mobile phone collects a color image through the color camera while collecting an ultraviolet image through the ultraviolet camera; since the two cameras work synchronously, when the sky is determined to exist in the collected color image (the first condition is satisfied), the ultraviolet image was likewise collected aimed at the sky, and the mobile phone can determine the ultraviolet index according to it.

Optionally, detecting the sky in a color image can be implemented by setting a sky color segmentation range [R1-R2, G1-G2, B1-B2] and segmenting the pixels of the color image by color according to this range. Specifically, for a pixel in the color image, if its R (red) component is within the R1-R2 range, its G component within the G1-G2 range, and its B component within the B1-B2 range, the pixel is considered an imaging pixel of the sky; by analogy, the imaging pixels of the sky contained in the color image are detected, as shown in the sketch below.
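A sketch of this color-segmentation test; the specific RGB ranges and the area-ratio threshold are illustrative assumptions, not ranges fixed by this application.

```python
import numpy as np

def detect_sky(rgb, r=(60, 180), g=(120, 220), b=(180, 255), min_ratio=0.9):
    """Return True if the sky-colored area ratio of an RGB image
    (H x W x 3 uint8 array) reaches `min_ratio`. A pixel counts as sky
    if each channel falls in its [low, high] segmentation range."""
    mask = (
        (rgb[..., 0] >= r[0]) & (rgb[..., 0] <= r[1]) &
        (rgb[..., 1] >= g[0]) & (rgb[..., 1] <= g[1]) &
        (rgb[..., 2] >= b[0]) & (rgb[..., 2] <= b[1])
    )
    return mask.mean() >= min_ratio
```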
Alternatively, the mobile phone can detect the sky from the color image in other ways; for example, a machine learning method can be used to train a classifier for recognizing the sky, and the sky is recognized from the color image by this classifier. The embodiments of this application do not limit the specific implementation of sky detection.

In some cases of other embodiments, if, within a period of time from when the user clicks the ultraviolet detection control 403, the sky does not exist in the color images collected by the mobile phone, or the proportion of the sky's area in the color image is less than a certain ratio (for example 90%), the mobile phone can prompt the user to point the camera at the sky. For example, the preview interface 402 shown in (a) of FIG. 7 does not include the sky, indicating that the camera of the mobile phone is not framing the sky; the mobile phone can then display prompt information 701 prompting the user to point the camera at the sky for shooting, or play a voice prompt to the same effect. The embodiments of this application do not limit the way in which the mobile phone prompts the user to adjust the shooting direction and/or angle. The user can adjust the shooting direction and/or angle based on the prompt so as to point the camera at the sky.

Next, after the user points the camera at the sky, the mobile phone can call the ultraviolet camera to collect an ultraviolet image and determine the ultraviolet index according to the ultraviolet image and the constructed model. Optionally, as shown in (b) of FIG. 7, the mobile phone may display prompt information 702 indicating that the ultraviolet index is being detected. Illustratively, after determining the ultraviolet index, the mobile phone may display the interface 405 shown in (c) of FIG. 7 to present the detected index.

In other cases, after the user clicks the ultraviolet detection control 403 shown in (b) of FIG. 5, the user may actively point the camera at the sky. In this way, when the mobile phone calls the color camera to collect a color image (an example of the second image), the sky can be detected in the color image, which means the camera is currently framing the sky. When it is detected that the camera is aimed at the sky, the mobile phone can call the ultraviolet camera to collect an ultraviolet image and determine the ultraviolet intensity according to it.

It should be noted that the embodiments of this application do not limit the timing of turning on the ultraviolet camera. As a possible implementation, the ultraviolet camera and the color camera collect images synchronously. Illustratively, when it is detected that the user clicks the ultraviolet detection control 403 shown in (b) of FIG. 5, the mobile phone calls the ultraviolet camera to collect ultraviolet images while calling the color camera to collect color images, with the color camera and the ultraviolet camera collecting synchronously. As another example, as shown in (a) of FIG. 5, after detecting that the user clicks the camera application icon 401, the mobile phone opens the camera application and displays the interface shown in (a) of FIG. 8; after detecting that the user clicks the ultraviolet detection control 403, the mobile phone can call the color camera to collect color images (for identifying objects) and synchronously call the ultraviolet camera to collect ultraviolet images (for determining the ultraviolet index). Optionally, as shown in (a) of FIG. 8, the mobile phone may display the collected color image 802 and ultraviolet image 801 in the preview interface, and may display prompt information 701 prompting the user to point the camera at the sky for framing.

While the user adjusts the shooting direction, the mobile phone continuously collects color images and synchronously collects ultraviolet images. If the sky is detected in a collected color image, the mobile phone can calculate the ultraviolet index according to the ultraviolet image collected synchronously with that color image. Optionally, during the calculation, the mobile phone may display prompt information 702 as shown in (b) of FIG. 8 prompting the user that the ultraviolet index is being calculated, and may display the collected color image and ultraviolet image 804 in the preview interface. After determining the ultraviolet index (such as the ultraviolet intensity), the mobile phone may display an interface such as that shown in (c) of FIG. 8 to present the index.

As another possible implementation, after the ultraviolet detection function is turned on, for example after the user clicks the ultraviolet detection control 403 shown in (a) of FIG. 7, the mobile phone first calls the color camera to collect a color image and detects whether the sky exists in it. If so, the user has aimed the camera at the sky, and the mobile phone starts calling the ultraviolet camera to collect ultraviolet images and detects the ultraviolet index according to them. As another example, as shown in (a) of FIG. 9, after detecting that the user clicks the ultraviolet detection control 403, the mobile phone first calls the color camera to collect a color image and recognizes from it whether the user has pointed the camera at the sky; during this process it may display the collected color image and prompt information 701 in the preview interface. If the sky is detected in the collected color image, the mobile phone can call the ultraviolet camera to capture the ultraviolet light from the direction of the sky, form an ultraviolet image, and determine the ultraviolet index from it. Optionally, after the ultraviolet camera is turned on, as shown in (b) of FIG. 9, the mobile phone can display the ultraviolet image collected by the ultraviolet camera (part of the first image) and the color image collected by the color camera (part of the second image) in the preview interface at the same time. For the description of (c) of FIG. 9, refer to the related description of (c) of FIG. 8, which will not be repeated here.
The specific implementation of determining the ultraviolet index according to the ultraviolet image is introduced below.

As mentioned in the ultraviolet imaging principle above, the ultraviolet index can be judged from the color depth of the ultraviolet image. In the embodiments of this application, the color depth of the ultraviolet image can be represented by grayscale values, and the grayscale value of the ultraviolet image can refer to its average grayscale value. Optionally, the average grayscale value includes but is not limited to the arithmetic average grayscale value and the weighted average grayscale value; in the arithmetic average, each pixel has the same weight.

In the weighted average, different pixels may have different weights. Illustratively, in some scenarios the brightness of the middle part of the ultraviolet image may be greater than that of the edge part, possibly because more ultraviolet light irradiates the middle of the image sensor while little or none reaches its edges. In that case, since the ultraviolet irradiation is concentrated in the middle-part pixels of the image, those pixels are particularly important for detection, so a greater weight can be given to them, with a correspondingly smaller weight for the edge-part pixels.

Optionally, the edge and middle parts of the image can be predefined; for example, pixels more than L (a positive integer) pixels from the image center form the edge part and the remaining pixels the middle part. Optionally, L can be determined flexibly according to the scenario, for example from empirical values, which is not limited here. Alternatively, other definitions can be used; for example, assuming the image size is M pixels * N pixels, the pixels inside a square whose geometric center is the image center and whose side length is P (a positive integer) pixels can be regarded as the middle part and the pixels outside it as the edge part, where P is smaller than min(M, N), that is, smaller than the minimum of M and N. The embodiments of this application do not limit the specific definitions of the edge and middle parts.

Optionally, two weight values can be set: the weight of the middle-part pixels of the ultraviolet image is a first weight and that of the edge-part pixels a second weight, with the first weight greater than the second.

Alternatively, multiple weight values can be set, with pixels closer to the image center given larger weights and pixels farther away smaller ones. Illustratively, pixels whose distance from the image center is within a first range are given a first weight, those within a second range a second weight, and those within a third range a third weight.

It can be understood that the average grayscale value of the ultraviolet image is related not only to the ultraviolet index (such as ultraviolet intensity) but also to the parameters of the ultraviolet camera. Optionally, the camera parameters associated with the average grayscale of the ultraviolet image include but are not limited to any one or more of the following: exposure time, ISO.

The exposure time can be understood as the time the shutter is open. Usually, in an ultraviolet camera, ultraviolet light can be projected through the lens onto the image sensor only while the shutter is open; when the shutter is closed, the light is blocked and cannot reach the image sensor. Under otherwise equal conditions, the longer the exposure time (that is, the longer the shutter is open), the more ultraviolet light enters and the higher the average grayscale of the formed ultraviolet image (the brighter the image); conversely, the shorter the exposure time, the lower the average grayscale and the darker the image.

The higher the ISO, the more sensitive the ultraviolet camera is to ultraviolet light and, correspondingly, the brighter the formed image and the higher its average grayscale; conversely, the lower the ISO, the less sensitive the camera and the darker the image, with a lower average grayscale.

It can be seen that by adjusting one or more of the above parameters of the ultraviolet camera, the average grayscale of the collected ultraviolet image can be adjusted; and as described above, the average grayscale is related to the detection and identification of ultraviolet indicators, so these camera parameters are too.

Considering that the ultraviolet index is related to one or more parameters of the ultraviolet camera and to the average grayscale of the ultraviolet image, in the embodiments of this application the index can be determined from the camera parameters and the average grayscale. As a possible implementation, a model is built from the ultraviolet camera parameters, the ultraviolet index (such as ultraviolet intensity), and the average grayscale of the ultraviolet image; subsequently, the mobile phone can determine the index from the known camera parameters, the average grayscale of the ultraviolet image, and the built model.

Several schemes for building a model for detecting ultraviolet indicators, and for determining the index using the corresponding model, are introduced below.
Scheme 1

In this scheme, the method for building the model may include the following steps:

a) Set the ISO to G1, G1 ∈ [G_min, G_max], where G_min is the minimum ISO of the ultraviolet camera and G_max is the maximum ISO of the ultraviolet camera.

In the laboratory test stage, the ultraviolet camera is aimed at an ultraviolet light source so that it can capture the emitted ultraviolet light and form ultraviolet images. During this process, the ultraviolet intensity V of the source is gradually adjusted from low to high, and for each intensity the exposure time T of the camera is adjusted so that the average grayscale of the collected ultraviolet image reaches the target grayscale value.

Optionally, the target grayscale range is Gray ± Q, where Gray is a grayscale reference set to avoid overexposure or underexposure and Q is a deviation amount. Optionally, the values of Gray and Q can be set flexibly according to the actual application scenario without limitation; illustratively, Gray = 128, Q = 5.

Through the above steps, each ultraviolet intensity has a corresponding ultraviolet image whose grayscale is within the target range (for example, within Gray ± Q) and a corresponding exposure time T. Next, the exposure time T corresponding to each ultraviolet intensity V at ISO G1 is recorded, and the functional relationship f1 between V and T under ISO G1 is obtained. Illustratively, FIG. 10 shows an exemplary functional relationship between V and T when the ISO is G1.

The ultraviolet light source may be a dedicated ultraviolet lamp or another light source; the embodiments of this application do not limit its type.

b) Set different ISO values and, following step a), obtain the functional relationship V = fn(T) between exposure time T and ultraviolet intensity V under the different ISO conditions, where n (a positive integer) indexes the n ISO conditions and fn is the function that maps T to V.

Illustratively, V = a*T^3 + b*T^2 + c*T + d (Formula 1), where a, b, c, and d represent function parameter coefficients and T^3 denotes T to the third power. Illustratively, the parameters a/b/c/d are estimated from the polynomial curve corresponding to Formula 1.

It should be noted that Formula 1 only exemplarily shows one possible mapping between the ultraviolet intensity V and the exposure time T; other mappings are also possible, and the embodiments of this application do not limit the specific relationship between the two.

Illustratively, FIG. 10 also shows the functional relationship between V and T when ISO = G2 and when ISO = G3.

After the function between the ultraviolet intensity V and the exposure time shown in FIG. 10 is obtained, this function can be used as the model for detecting ultraviolet light.
The method for determining the environmental ultraviolet index based on a model such as that shown in FIG. 10 and the ultraviolet images collected by the ultraviolet camera is introduced below. As shown in FIG. 11, the method includes:

S101. The electronic device collects a first image.

Taking synchronous collection by the ultraviolet camera (an example of the first camera) and the color camera (an example of the second camera) as an example, after the ultraviolet detection function is turned on, for example after the user clicks the ultraviolet detection control 403 shown in (a) of FIG. 12 (an example of the first instruction), the mobile phone calls the color camera to collect color images and the ultraviolet camera to collect ultraviolet images. Optionally, as shown in (a) of FIG. 12, the mobile phone may display the collected color image 602 and ultraviolet image 601 in the preview interface.

After collecting the first image, the electronic device can determine the ultraviolet index according to it. As a possible implementation, the electronic device determines the index according to the grayscale values of the pixels included in the first image, through the following steps S102-S105a (or S105b).

S102. The electronic device judges whether the average grayscale of the first image is the target grayscale; if yes, step S103a is executed, and if not, step S103b is executed.

Optionally, the target grayscale can be obtained from statistics or from empirical values; the embodiments of this application do not limit how it is set. When the average grayscale of the ultraviolet image is the target grayscale, the image usually does not suffer from overexposure or underexposure.

Considering that calculating the average grayscale may introduce errors, a numerical range can be set when judging whether an image's grayscale is the target grayscale: if the grayscale falls within the range, it is called the target grayscale. Illustratively, the target grayscale range is 128 ± 5; if the grayscale of the image is within 128 ± 5, its grayscale is called the target grayscale.

By setting the target grayscale range, the average grayscale of the ultraviolet image on which detection is based is kept within a certain range, avoiding inaccurate detection caused by an overexposed or underexposed ultraviolet image.

S103a. The electronic device obtains the ultraviolet camera parameters corresponding to the first image.

Optionally, the ultraviolet camera parameters include but are not limited to exposure time and sensitivity.

Illustratively, assuming the first image is the ultraviolet image 601 shown in (a) of FIG. 12 and its calculated average grayscale is within 128 ± 5, the camera parameters corresponding to the ultraviolet image 601 are obtained.

Optionally, as shown in (a) of FIG. 12, after obtaining the parameters corresponding to the ultraviolet image 601 (for example ISO = 125, exposure time T = 1/100), the mobile phone may display the ultraviolet camera parameters on the interface.

Optionally, as shown in (a) of FIG. 12, the mobile phone may display the average grayscale of the ultraviolet image 601 on the interface.
S104a. Adjust the ultraviolet camera parameters so that the average grayscale of the third image obtained under the adjusted parameters is the target grayscale.

The ultraviolet camera parameters corresponding to the third image are different from those corresponding to the first image.

Illustratively, taking the first image as the ultraviolet image 601 shown in (a) of FIG. 12 whose average grayscale is the target grayscale, after obtaining the corresponding parameters the mobile phone adjusts the camera parameters according to a preset algorithm so that the third image 603 obtained under the adjusted parameters has the target average grayscale. After obtaining the first image 601 and the third image 603 shown in FIG. 12, the mobile phone can execute step S105a.

Optionally, as shown in (b) of FIG. 12, the mobile phone may display the collected ultraviolet image 603 in the preview interface.

Optionally, as shown in (b) of FIG. 12, the mobile phone may display on the interface the ultraviolet camera parameters corresponding to the image 603 (an example of the third information), including but not limited to sensitivity and exposure time.

Optionally, as shown in (b) of FIG. 12, the mobile phone may display the average grayscale of the ultraviolet image 603 on the interface.

As another example, taking the first image as the image 601 shown in (a) of FIG. 13 whose average grayscale is not the target grayscale: after collecting the image 601 and determining that its average grayscale is not within the target range, the mobile phone can adjust the camera parameters so that the average grayscale of the ultraviolet images collected under the various parameter conditions stays within the target range, preventing inaccurate detection caused by overexposure or underexposure. Optionally, the adjusted camera parameters are shown in (b) of FIG. 13; the mobile phone can display the fourth image 604 obtained under those parameters in the preview interface shown in (b) of FIG. 13. It can be seen that the grayscale of the image 604 differs from that of the image 601.

Afterwards, the mobile phone can adjust the camera parameters one or more times with reference to the solutions of the above embodiments, for example to the parameters shown in (c) of FIG. 13, and display the ultraviolet image 603 obtained under those parameters in the preview interface shown in (c) of FIG. 13; it can be seen that the grayscale of the image 604 is the same as that of the image 603. After obtaining the fourth image 604 and the third image 603 shown in FIG. 13, the mobile phone can execute step S105b.

S105a. Determine the ultraviolet index according to the camera parameters corresponding to the first image, the camera parameters corresponding to the third image, the target grayscale, and the model.

Illustratively, when the average grayscales of the ultraviolet images 601 and 603 shown in FIG. 12 are both the target grayscale, the camera parameters corresponding to the ultraviolet images 601 and 603 are substituted into the functions shown in FIG. 10 to determine the ultraviolet index.

For example, assuming G1 = 125 and G2 = 200 in FIG. 10, the ultraviolet intensity V1 under the parameter condition of the ultraviolet image 601 (ISO = 125, T = 1/100) can be determined, and the ultraviolet intensity V2 under the parameter condition of the ultraviolet image 603 (ISO = 200, T = 1/125) can be determined. For the calculation accuracy of the ultraviolet intensity, the final ultraviolet intensity can be determined from V1 and V2.

Optionally, the average of V1 and V2 can be calculated, which may be an arithmetic or weighted average, without limitation in the embodiments of this application; alternatively, the maximum or minimum of the multiple results can be used as the final ultraviolet index.
S103b. Adjust the ultraviolet camera parameters so that the average grayscale of the fourth image obtained under the adjusted parameters is the target grayscale.

As shown in (a) of FIG. 13, after the ultraviolet detection function is turned on, for example after the user clicks the ultraviolet detection control 403, the mobile phone calls the ultraviolet camera to collect the first image 601; by calculation, the brightness of the ultraviolet image 601 is too high and overexposure occurs. The mobile phone can then adjust the camera parameters so that the brightness of the ultraviolet images collected after adjustment is within a preset range. Illustratively, after the adjustment, the image collected by the ultraviolet camera is shown as the ultraviolet image 604 in (b) of FIG. 13, whose average grayscale is the target grayscale.

In some embodiments, the mobile phone adjusts the camera parameters in fixed steps; under each adjusted setting it calls the ultraviolet camera to collect an ultraviolet image and calculates its average grayscale. If the average grayscale meets the target range, that setting is determined to be the one to adjust to; if not, the phone continues adjusting the parameters, collecting ultraviolet images, and calculating their average grayscale until the average grayscale of the image obtained after adjustment is within the preset range, and that setting is determined to be the one to adjust to. Or,

in other embodiments, the mobile phone calls the ultraviolet camera to collect an ultraviolet image and calculates, from the image and an algorithm, the camera parameter values to adjust to. Optionally, after calculating the parameter values, the phone may collect one ultraviolet image and compute its average grayscale to verify the accuracy of the parameter values. Alternatively, other adjustment methods can be used; the embodiments of this application do not limit the method of adjusting the ultraviolet camera parameters.

Optionally, as shown in (b) of FIG. 13, after adjusting the camera parameters the mobile phone may display the adjusted parameters on the interface.

Illustratively, when adjusting the camera parameters, the mobile phone may fix the ISO (for example G1) and adjust the exposure time through the algorithm to T1, stopping when the average grayscale of the ultraviolet image reaches the target grayscale value under ISO = G1 and T = T1. The phone may then set the ISO to G2 and repeat the same adjustment to obtain the exposure time T2 corresponding to G2, stopping when the average grayscale again reaches the target under ISO = G2 and T = T2.

Alternatively, the mobile phone may fix the exposure time (for example T1) and adjust the ISO through the algorithm to G1, stopping when the average grayscale of the ultraviolet image reaches the target; it may then set the exposure time to T2 and repeat the adjustment to obtain the sensitivity G2 corresponding to T2, stopping when the average grayscale reaches the target under ISO = G2 and T = T2.

S104b. Adjust the ultraviolet camera parameters so that the average grayscale of the fifth image obtained under the adjusted parameters is the target grayscale.

S105b. Determine the ultraviolet index according to the camera parameters corresponding to the fourth image, the camera parameters corresponding to the fifth image, the target grayscale, and the model.

Illustratively, when the average grayscales of the ultraviolet images 604 and 603 shown in FIG. 13 are both within the target range, the camera parameters corresponding to the two images are substituted into the functions shown in FIG. 10 to determine the ultraviolet index.

FIG. 13 and FIG. 12 are described using two parameter conditions that bring the average grayscale of the ultraviolet image into the target range (one parameter adjustment in FIG. 12, two in FIG. 13) as examples; in actual implementation, more or fewer adjustments can be made to obtain the index under more or fewer parameter conditions, and the final comprehensive index is determined from the indexes under the different conditions. Since different parameter conditions are taken into account, deviations of the index under some conditions can be corrected, making the final output more accurate.
Scheme 2

In this scheme, as shown in FIG. 14, multiple image features can be extracted from the ultraviolet image, the feature vector formed by them is input into a classifier, and the classifier outputs the ultraviolet index.

Optionally, the image features include but are not limited to one or more of the following: the average grayscale of the image region, the maximum grayscale, the minimum grayscale, the difference between the grayscales of adjacent pixels in the region, the grayscale of each pixel in the region, the contrast of the region, the histogram, the histogram of oriented gradients (HOG), the standard deviation, the color scale, the mean square deviation, and the variance.

First, the training process of the classifier involved in the embodiments of this application is introduced. As shown in FIG. 15, training a classifier for recognizing ultraviolet indicators requires N (N is a positive integer) samples, each sample being the feature vector of an ultraviolet image with a known ultraviolet index. Optionally, the training samples may also include a label for each ultraviolet image (representing its corresponding ultraviolet index); the classifier is obtained by training on the multiple samples.

Optionally, before the classifier is trained, data such as the training feature vectors may be processed, for example smoothed and normalized. Normalization can reduce the complexity of the algorithm; smoothing may further include noise reduction, fitting, and similar operations to reduce the impact of statistical errors.

Optionally, to improve its recognition accuracy the classifier may be evaluated and tested. When its recognition rate reaches a certain threshold, the classifier has been trained; when the recognition rate is low, training can continue until the recognition accuracy reaches the threshold.

Optionally, the classifier may be trained on the device side (such as a terminal like a mobile phone) or on the cloud side (such as a server), offline or online; the embodiments of this application do not limit the specific training method. Subsequently, the trained classifier can output, from the input feature vector of an ultraviolet image with unknown index, the ultraviolet index corresponding to that image.
In the above embodiments, when the mathematical model is built, the ultraviolet image is obtained by capturing ultraviolet light emitted directly by the ultraviolet light source. In other implementations, the model can also be built by capturing reflected light: an ultraviolet light source irradiates a target object, the electronic device captures the ultraviolet light reflected by the target object to form an ultraviolet image, and the mathematical model for detecting ultraviolet indicators is built by the method above.

Target objects include but are not limited to green plants, sand, and the like.

After the model is built, detecting the ultraviolet index likewise requires pointing the electronic device at the target object for framing; that is, the electronic device is aimed at the target object, the ultraviolet camera is called to collect an ultraviolet image, and the index is determined from the collected image.

For some target objects, certain parts absorb more ultraviolet light and reflect little or none; for those parts little or no ultraviolet light passes into the ultraviolet camera, so they are usually invisible in the ultraviolet image. Other parts absorb less ultraviolet light and reflect more; more ultraviolet light from those parts passes into the camera, so they are usually visible in the ultraviolet image.

Scheme 1 and Scheme 2 above are only two exemplary algorithms; other algorithms, such as regression algorithms like logistic regression or other classification algorithms, can also be used to detect the ultraviolet index in the embodiments of this application. As long as an algorithm determines the ultraviolet indicator from the grayscale of the collected image, it falls within the scope of the technical solutions of the embodiments of this application. Moreover, the type and number of ultraviolet camera parameters used for detection are not limited: any parameter that is associated with the grayscale of the ultraviolet image, or by whose adjustment the grayscale can be adjusted, or whose change causes the grayscale to change accordingly, can be used in the technical solutions of the embodiments of this application. For example, in other embodiments the aperture value of the ultraviolet camera can be used in the technical solution for detecting the ultraviolet index.

In some embodiments, the mobile phone can also give the user personalized ultraviolet sun-protection suggestions according to the ultraviolet index; for example, when the UV index is high, the user is prompted to apply sunscreen.
Other embodiments of this application provide an apparatus, which may be the above electronic device (such as a folding-screen mobile phone). The apparatus may include a display screen, a memory, and one or more processors, the display screen and memory being coupled to the processor. The memory is used to store computer program code, which includes computer instructions; when the processor executes them, the electronic device can perform the functions or steps performed by the mobile phone in the above method embodiments. For the structure of the electronic device, refer to the electronic device shown in FIG. 2A-FIG. 2D and FIG. 3.

The core structure of the electronic device can be represented as the structure shown in FIG. 16, which may include a processing module 1301, an input module 1302, a storage module 1303, and a display module 1304.

The processing module 1301 may include at least one of a central processing unit (CPU), an application processor (AP), or a communication processor (CP). It can perform operations or data processing related to control and/or communication of at least one of the other elements of the user electronic device. Specifically, it can control the content displayed on the main screen according to a certain trigger condition, or determine what is displayed on the screen according to preset rules; it is also used to process input instructions or data and determine the display style according to the processed data.

The input module 1302 is used to obtain instructions or data input by the user and transmit them to other modules of the electronic device. Optionally, its input modes may include touch, gesture, and screen proximity, and may also be voice input. For example, the input module may be the screen of the electronic device, which acquires the user's input operations, generates input signals according to the acquired operations, and transmits them to the processing module 1301. In the embodiments of this application, the input module may be used to receive the first instruction input by the user and/or perform other steps.

The collection module 1306 is used to collect data and transmit the collected data to other modules of the electronic device. Optionally, it may be the camera of the electronic device, and the camera may transmit the collected images to the processing module 1301 and/or perform other steps. The cameras include but are not limited to a color camera and an ultraviolet camera.

The storage module 1303 may include volatile memory and/or nonvolatile memory. It is used to store instructions or data related to at least one of the other modules of the user terminal device; specifically, it can record the images collected by the camera.

The display module 1304 may include, for example, a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, a micro-electro-mechanical systems (MEMS) display, or an electronic paper display, and is used to display content viewable by the user (for example text, images, videos, icons, symbols, etc.). In the embodiments of this application, the display module may be implemented as a display screen.

Optionally, the structure shown in FIG. 16 may further include a communication module 1305, used to support communication between the electronic device and other electronic devices. For example, the communication module may connect to a network via wireless or wired communication to communicate with other personal terminals or a network server. The wireless communication may employ at least one of the cellular communication protocols, such as Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Universal Mobile Telecommunications System (UMTS), Wireless Broadband (WiBro), or Global System for Mobile Communications (GSM). The wireless communication may include, for example, short-range communication, which may include at least one of wireless fidelity (Wi-Fi), Bluetooth, near field communication (NFC), magnetic stripe transmission (MST), or GNSS.
An embodiment of this application further provides a chip system. As shown in FIG. 17, the chip system includes at least one processor 1401 and at least one interface circuit 1402, which may be interconnected by wires. For example, the interface circuit 1402 may be used to receive signals from other apparatuses (such as the memory of the electronic device); as another example, the interface circuit 1402 may be used to send signals to other apparatuses (such as the processor 1401). Illustratively, the interface circuit 1402 can read the instructions stored in the memory and send them to the processor 1401; when the instructions are executed by the processor 1401, the electronic device can be made to perform the steps in the above embodiments. Of course, the chip system may also include other discrete devices, which is not specifically limited in the embodiments of this application.

An embodiment of this application further provides a computer storage medium including computer instructions which, when run on the above electronic device, cause the electronic device to perform the functions or steps performed by the mobile phone in the above method embodiments.

An embodiment of this application further provides a computer program product which, when run on a computer, causes the computer to perform the functions or steps performed by the mobile phone in the above method embodiments.

From the description of the above embodiments, those skilled in the art can clearly understand that, for convenience and brevity of description, only the division into the above functional modules is used as an example; in practical applications, the above functions can be allocated to different functional modules as needed, that is, the internal structure of the apparatus is divided into different functional modules to complete all or part of the functions described above.

In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are only illustrative: the division into modules or units is only a division by logical function, and in actual implementation there may be other division methods; for example, multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not implemented. Furthermore, the mutual couplings or direct couplings or communication connections shown or discussed may be implemented through some interfaces, and the indirect couplings or communication connections between apparatuses or units may be electrical, mechanical, or of other forms.

A unit described as a separate component may or may not be physically separate, and a component displayed as a unit may be one physical unit or multiple physical units, that is, located in one place or distributed across multiple different places. Some or all of the units can be selected according to actual needs to achieve the purposes of the solutions of the embodiments.

In addition, the functional units in the embodiments of this application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.

If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium. Based on this understanding, the technical solutions of the embodiments of this application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product stored in a storage medium, including several instructions to make a device (which may be a single-chip microcomputer, a chip, etc.) or a processor execute all or part of the steps of the methods described in the embodiments of this application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

The above is only the specific implementation of this application, but the protection scope of this application is not limited thereto; any change or replacement within the technical scope disclosed in this application shall be covered by the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (13)

  1. An ultraviolet detection method, characterized in that the method is applied to an electronic device, the electronic device comprises a first camera, the first camera comprises an ultraviolet camera, and the method comprises:
    detecting a first instruction, the first instruction being used to instruct detection of an ultraviolet index;
    after the first instruction is detected, collecting a first image through the first camera;
    determining the ultraviolet index according to the first image;
    displaying first information, the first information comprising information about the ultraviolet index.
  2. The method according to claim 1, characterized in that the method further comprises: before the ultraviolet index is determined according to the first image, displaying second information, the second information being used to prompt pointing the first camera at a target object, the target object comprising the sky.
  3. The method according to claim 1 or 2, characterized in that the electronic device further comprises a second camera, and the method further comprises: collecting a second image through the second camera; the second camera comprises a color camera;
    after the first image is collected through the first camera, the method further comprises:
    displaying a user interface, the user interface comprising part of the content of the first image and/or part of the content of the second image.
  4. The method according to any one of claims 1-3, characterized in that after the first image is collected through the first camera, the method further comprises: displaying third information, the third information comprising parameters of the first camera; the parameters comprise sensitivity and exposure time.
  5. The method according to any one of claims 1-4, characterized in that collecting the second image through the second camera comprises: collecting the second image through the second camera while the first image is collected through the first camera;
    after the second image is collected through the second camera, the method further comprises: judging whether the second image satisfies a first condition; when it is determined that the second image satisfies the first condition, determining the ultraviolet index according to the first image; the first condition comprises: a target object exists in the second image; or, the first condition comprises: the target object exists in the second image, and the proportion of the area of the target object in the second image is greater than or equal to a threshold.
  6. The method according to any one of claims 1-4, characterized in that collecting the second image through a second camera comprises: collecting the second image through the second camera after the first instruction is detected and before the first image is collected through the first camera;
    after the second image is collected through the second camera, the method comprises: judging whether the second image satisfies a first condition;
    collecting the first image through the first camera comprises: collecting the first image through the first camera when it is determined that the second image satisfies the first condition.
  7. The method according to any one of claims 1-6, characterized in that determining the ultraviolet index according to the first image comprises: determining the ultraviolet index according to grayscale values of pixels comprised in the first image.
  8. 根据权利要求1-7中任一项所述的方法,其特征在于,在通过第一摄像头采集第一图像之后,根据所述第一图像确定紫外线指标之前,所述方法还包括:
    若所述第一图像的灰度值为目标灰度,则调整所述第一摄像头的参数,并通 过所述第一摄像头采集第三图像,其中,所述第三图像的灰度为所述目标灰度;
    根据所述第一图像确定紫外线指标,包括:根据所述第一图像对应的第一摄像头参数、所述第三图像对应的第一摄像头参数以及所述目标灰度,确定紫外线指标。
  9. 根据权利要求1-8中任一项所述的方法,其特征在于,在通过第一摄像头采集第一图像之后,根据所述第一图像确定紫外线指标之前,所述方法还包括:
    若所述第一图像的灰度值不是目标灰度,则调整所述第一摄像头的参数,并采集第四图像,所述第四图像的灰度为目标灰度;
    调整所述第一摄像头的参数,并采集第五图像,所述第五图像的灰度为所述目标灰度;
    根据所述第一图像确定紫外线指标,包括:根据第四图像对应的第一摄像头参数、第五图像对应的第一摄像头参数以及所述目标灰度,确定紫外线指标。
  10. 根据权利要求1-9中任一项所述的方法,其特征在于,所述第一摄像头的参数包括曝光时间、感光度。
  11. 一种电子设备,其特征在于,包括:处理器,存储器,所述存储器与所述处理器耦合,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,当所述处理器从所述存储器中读取所述计算机指令,使得所述电子设备执行权利要求1-10任一项所述的方法。
  12. A computer-readable storage medium storing instructions which, when run on an electronic device, cause the electronic device to perform the method according to any one of claims 1-10.
  13. A computer program product comprising instructions which, when run on an electronic device, cause the electronic device to perform the method according to any one of claims 1-10.
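
For illustration only: claims 7-10 above tie the ultraviolet index to the grayscale of the ultraviolet camera's frames and to the camera parameters (exposure time, light sensitivity) with which a target grayscale is reached. The minimal Python sketch below shows one plausible way such a computation could be organized, assuming a linear sensor response; the Capture structure, the constants TARGET_GRAY and CALIBRATION_K, the sky-area helper, and the averaging step are assumptions of this sketch, not details taken from the claims.

# Illustrative sketch only: the linear sensor model, the constants, and all
# names below are assumptions, not details specified by the claims.
from dataclasses import dataclass

@dataclass
class Capture:
    mean_gray: float   # mean grayscale of a UV-camera frame (0-255)
    exposure_s: float  # exposure time, in seconds
    iso: float         # light sensitivity (ISO gain)

TARGET_GRAY = 128.0    # assumed target grayscale (the claims leave it unspecified)
CALIBRATION_K = 100.0  # assumed factor mapping relative irradiance to a UV index

def sky_area_ok(sky_pixels: int, total_pixels: int, threshold: float = 0.5) -> bool:
    """Claim 5's second condition: the target object (sky) occupies at least a
    threshold fraction of the color image; the 0.5 threshold is assumed."""
    return total_pixels > 0 and sky_pixels / total_pixels >= threshold

def relative_uv_irradiance(c: Capture) -> float:
    """Under a linear model, gray ~ k * irradiance * exposure * ISO, so the
    irradiance that produced a given gray scales as gray / (exposure * ISO)."""
    return c.mean_gray / (c.exposure_s * c.iso)

def uv_index(captures: list[Capture]) -> float:
    """Average the irradiance estimates of the captures that reached the target
    grayscale (claims 8-9 use two such captures) and rescale to a UV index."""
    at_target = [c for c in captures if abs(c.mean_gray - TARGET_GRAY) <= 2.0]
    if not at_target:
        raise ValueError("no capture reached the target grayscale")
    mean_irr = sum(relative_uv_irradiance(c) for c in at_target) / len(at_target)
    return mean_irr / CALIBRATION_K

# Two captures that hit the target grayscale with different parameters; brighter
# UV needs a shorter exposure (or lower ISO) for the same grayscale.
caps = [Capture(128.0, 1 / 500, 100.0), Capture(128.5, 1 / 1000, 200.0)]
print(f"estimated UV index: {uv_index(caps):.1f}")  # ~6.4 under these constants

The point the parameter-based claims exploit is that, at a fixed target grayscale, the product of exposure time and sensitivity is inversely proportional to the incident ultraviolet irradiance, so the camera parameters of the target-grayscale captures alone carry the irradiance information.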
PCT/CN2022/121907 2021-10-21 2022-09-27 Ultraviolet detection method and electronic device WO2023065994A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111229520.6A CN116007744A (zh) 2021-10-21 2021-10-21 Ultraviolet detection method and electronic device
CN202111229520.6 2021-10-21

Publications (1)

Publication Number Publication Date
WO2023065994A1 (zh)

Family

ID=86034121

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/121907 WO2023065994A1 (zh) 2021-10-21 2022-09-27 Ultraviolet detection method and electronic device

Country Status (2)

Country Link
CN (1) CN116007744A (zh)
WO (1) WO2023065994A1 (zh)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103167229A (zh) * 2011-12-15 2013-06-19 富泰华工业(深圳)有限公司 Electronic device with ultraviolet detection function and method
US20140092292A1 (en) * 2012-10-02 2014-04-03 Lg Electronics Inc. Automatic recognition and capture of an object
CN104422514A (zh) * 2013-09-11 2015-03-18 西安凯倍耐特智能工程有限公司 Digital video camera capable of detecting outdoor ultraviolet intensity
CN104715393A (zh) * 2013-12-16 2015-06-17 国际商业机器公司 Method and system for recommending correct sunscreen usage
CN104853095A (zh) * 2015-04-30 2015-08-19 广东欧珀移动通信有限公司 Photographing method and user terminal
CN108337445A (zh) * 2018-03-26 2018-07-27 华为技术有限公司 Photographing method, related device, and computer storage medium
CN110166691A (zh) * 2019-05-31 2019-08-23 维沃移动通信(杭州)有限公司 Shooting method and terminal device
CN113358217A (zh) * 2021-05-18 2021-09-07 北京优彩科技有限公司 Ultraviolet intensity detection method and apparatus

Also Published As

Publication number Publication date
CN116007744A (zh) 2023-04-25

Similar Documents

Publication Publication Date Title
US11800221B2 (en) Time-lapse shooting method and device
CN113132620B (zh) Image shooting method and related apparatus
EP3893491A1 (en) Method for photographing the moon and electronic device
US11759143B2 (en) Skin detection method and electronic device
WO2020077511A1 (zh) Image display method in a shooting scenario and electronic device
WO2021258321A1 (zh) Image acquisition method and apparatus
WO2021078001A1 (zh) Image enhancement method and apparatus
WO2021258814A1 (zh) Video synthesis method and apparatus, electronic device, and storage medium
CN113170037B (zh) Method for capturing long-exposure image and electronic device
CN113810603B (zh) Point light source image detection method and electronic device
WO2020015144A1 (zh) Photographing method and electronic device
US20230276125A1 (en) Photographing method and electronic device
WO2023273323A9 (zh) Focusing method and electronic device
WO2023241209A9 (zh) Desktop wallpaper configuration method and apparatus, electronic device, and readable storage medium
WO2020015149A1 (zh) Wrinkle detection method and electronic device
EP4131063A1 (en) Eye bag detection method and device
WO2023011348A1 (zh) Detection method and electronic device
WO2023065994A1 (zh) Ultraviolet detection method and electronic device
EP4366289A1 (en) Photographing method and related apparatus
CN115631250B (zh) Image processing method and electronic device
CN114115772B (zh) Off-screen display method and apparatus
WO2022206783A1 (zh) Shooting method and apparatus, electronic device, and readable storage medium
CN116055872B (zh) Image acquisition method, electronic device, and computer-readable storage medium
US20240137659A1 (en) Point light source image detection method and electronic device
WO2024041180A1 (zh) Path planning method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22882605

Country of ref document: EP

Kind code of ref document: A1