WO2021052436A1 - Skin roughness measurement method and electronic device - Google Patents

Skin roughness measurement method and electronic device Download PDF

Info

Publication number
WO2021052436A1
Authority
WO
WIPO (PCT)
Prior art keywords
texture
connected domain
image
image blocks
skin
Prior art date
Application number
PCT/CN2020/115973
Other languages
French (fr)
Chinese (zh)
Inventor
Yu Ye (于野)
Hu Hongwei (胡宏伟)
Dong Chen (董辰)
Gao Wenmei (郜文美)
Yao Ye (姚烨)
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2021052436A1

Links

Images

Classifications

    • A HUMAN NECESSITIES
        • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
            • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
                • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
                    • A61B 5/0059 using light, e.g. diagnosis by transillumination, diascopy, fluorescence
                        • A61B 5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
                    • A61B 5/44 Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
                        • A61B 5/441 Skin evaluation, e.g. for skin disorder diagnosis
                            • A61B 5/442 Evaluating skin mechanical properties, e.g. elasticity, hardness, texture, wrinkle assessment
    • G PHYSICS
        • G01 MEASURING; TESTING
            • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
                • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
                    • G01B 11/30 for measuring roughness or irregularity of surfaces
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 5/00 Image enhancement or restoration
                    • G06T 5/40 using histogram techniques
                    • G06T 5/70 Denoising; Smoothing
                • G06T 7/00 Image analysis
                    • G06T 7/0002 Inspection of images, e.g. flaw detection
                        • G06T 7/0012 Biomedical image inspection
                    • G06T 7/10 Segmentation; Edge detection
                        • G06T 7/11 Region-based segmentation
                        • G06T 7/136 involving thresholding
                    • G06T 7/40 Analysis of texture
                • G06T 2207/00 Indexing scheme for image analysis or image enhancement
                    • G06T 2207/10 Image acquisition modality
                        • G06T 2207/10016 Video; Image sequence
                        • G06T 2207/10024 Color image
                    • G06T 2207/20 Special algorithmic details
                        • G06T 2207/20024 Filtering details
                        • G06T 2207/20036 Morphological image processing
                    • G06T 2207/30 Subject of image; Context of image processing
                        • G06T 2207/30004 Biomedical image processing
                            • G06T 2207/30088 Skin; Dermal
                        • G06T 2207/30196 Human being; Person

Definitions

  • the embodiments of the present application relate to the field of image processing technology, and in particular, to a skin roughness detection method and electronic equipment.
  • Skin roughness, as an important indicator in skin texture analysis, can reflect the health of body functions to a certain extent.
  • the embodiments of the present application provide a skin roughness detection method and electronic device, which are used to detect the roughness of the skin through the electronic device.
  • an embodiment of the present application provides a method for detecting skin roughness, which is applied to an electronic device or a chip in an electronic device.
  • the method includes: acquiring a grayscale image of a skin image to be processed;
  • the texture feature is extracted from the grayscale image, and the texture feature includes at least one of texture depth, texture width, wide texture density, and texture density;
  • the texture depth is used to characterize the depth of the lines on the skin, and the texture width is used to characterize the width of the lines on the skin;
  • the wide texture density is used to characterize the density of the lines in the skin whose line width reaches a preset threshold, and the texture density is used to characterize the density of the lines in the skin; the roughness of the skin in the skin image to be processed is then determined according to the texture feature.
  • in the above design, the electronic device determines the roughness of the skin from texture features rather than from a single parameter threshold, which makes the method easy to use, stable, and accurate.
  • the extraction of texture features from the grayscale image can be achieved by dividing the grayscale image into K image blocks, obtaining from the K image blocks the N image blocks whose grayscale average value falls within a preset grayscale range, and extracting the texture features from the N image blocks, where K and N are both positive integers and N is less than or equal to K; or, the grayscale image is divided into K image blocks, the K image blocks are sorted by the size of their grayscale average values, the N image blocks ranked within a preset ranking range are obtained, and the texture feature is extracted from the N image blocks.
  • several image blocks are selected from multiple image blocks to extract texture features, which reduces the influence of ambient light on the extraction of texture features and improves accuracy.
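The block-selection step above can be sketched as follows in NumPy. The grid size, the number of blocks kept, and the middle-of-the-ranking selection rule are illustrative assumptions, not values fixed by the patent:

```python
import numpy as np

def select_blocks(gray, grid=4, n_keep=8):
    """Split a grayscale image into grid*grid blocks and keep the n_keep
    blocks whose mean gray value ranks in the middle of the sorted order,
    which limits the influence of over- or under-exposed regions."""
    h, w = gray.shape
    bh, bw = h // grid, w // grid
    blocks = [gray[r*bh:(r+1)*bh, c*bw:(c+1)*bw]
              for r in range(grid) for c in range(grid)]
    # Sort block indices by mean gray value and keep the middle of the ranking.
    order = sorted(range(len(blocks)), key=lambda i: blocks[i].mean())
    start = (len(blocks) - n_keep) // 2
    return [blocks[i] for i in order[start:start + n_keep]]
```

Selecting by a preset grayscale range instead of a ranking range would only change the filter applied to `order`.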
  • the extraction of texture features from the N image blocks can be achieved by performing binarization processing on the N image blocks respectively to obtain N binarized images; connected domain analysis is performed on the N binarized images to obtain at least one first connected domain, where the at least one first connected domain is used to indicate the position of the skin texture area in the N image blocks; the texture feature is then extracted from the area where the at least one first connected domain in the N image blocks is located.
  • performing binarization processing on the N image blocks to obtain N binarized images respectively includes: filtering the N image blocks respectively, and performing binarization processing on the filtered N image blocks to obtain the N binarized images.
  • filtering is performed before binarization, which can smooth and denoise the image.
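A minimal sketch of the filter-then-binarize step. Both the mean (box) filter and the mean-minus-offset threshold are illustrative choices; the patent does not fix them in this abstract:

```python
import numpy as np

def box_filter(img, k=3):
    """Simple k*k mean (box) filter for smoothing/denoising before thresholding."""
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode='edge')
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def binarize(block, offset=5.0):
    """Threshold a filtered block: pixels darker than (mean - offset)
    are marked 1 (candidate texture lines), the rest 0."""
    smoothed = box_filter(block)
    return (smoothed < smoothed.mean() - offset).astype(np.uint8)
```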
  • the performing connected domain analysis on the N binarized images to obtain at least one first connected domain includes: performing erosion and/or dilation processing on the N binarized images, and performing the connected domain analysis on the N binarized images after the erosion and/or dilation processing to obtain the at least one first connected domain.
  • by performing erosion and/or dilation processing on the binarized images, the above design makes the determined skin texture clearer and more accurate.
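The morphology and connected-domain steps might be sketched in plain NumPy/Python as follows. The 3x3 structuring element and 8-connectivity are assumptions; note that this erosion treats pixels outside the image border as foreground:

```python
import numpy as np
from collections import deque

def dilate(bin_img):
    """3x3 binary dilation: a pixel becomes 1 if it or any 8-neighbour is 1."""
    p = np.pad(bin_img, 1)
    out = np.zeros_like(bin_img)
    for dy in range(3):
        for dx in range(3):
            out |= p[dy:dy + bin_img.shape[0], dx:dx + bin_img.shape[1]]
    return out

def erode(bin_img):
    """3x3 binary erosion by duality (pixels outside the border count as 1)."""
    return 1 - dilate(1 - bin_img)

def connected_domains(bin_img):
    """Label 8-connected foreground regions; returns a list of pixel lists."""
    h, w = bin_img.shape
    seen = np.zeros((h, w), dtype=bool)
    domains = []
    for y in range(h):
        for x in range(w):
            if bin_img[y, x] and not seen[y, x]:
                q, comp = deque([(y, x)]), []
                seen[y, x] = True
                while q:  # breadth-first flood fill of one domain
                    cy, cx = q.popleft()
                    comp.append((cy, cx))
                    for ny in range(cy - 1, cy + 2):
                        for nx in range(cx - 1, cx + 2):
                            if 0 <= ny < h and 0 <= nx < w and \
                               bin_img[ny, nx] and not seen[ny, nx]:
                                seen[ny, nx] = True
                                q.append((ny, nx))
                domains.append(comp)
    return domains
```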
  • the texture depth may be determined according to the average gray value of the area where at least one first connected domain in the N image blocks is located and the average gray value of the N image blocks.
  • the texture depth can meet the requirements of the following formula:
  • F1 represents the texture depth
  • M1 represents the average gray value of the area where the at least one first connected domain in the N image blocks is located
  • M represents the average gray value of the N image blocks.
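The abstract names the variables of the texture-depth formula but does not reproduce the formula itself. One plausible instantiation consistent with those variables (darker texture regions relative to the blocks overall implying deeper lines) is F1 = (M - M1) / M; this is an assumption, not the patent's formula:

```python
def texture_depth(m1_texture_mean, m_block_mean):
    """Hypothetical texture-depth score F1: relative darkness of the texture
    regions (mean M1) versus the whole blocks (mean M). The patent's exact
    formula is not given in this abstract."""
    return (m_block_mean - m1_texture_mean) / m_block_mean
```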
  • extracting the texture width from the area where the at least one first connected domain in the N image blocks is located includes: determining the texture width according to the length of the outer contour and the area of the second connected domain in the at least one first connected domain; wherein the second connected domain is the first connected domain with the longest outer contour length or the largest area among the at least one first connected domain.
  • determining the texture width according to the length and area of the outer contour of the second connected domain in the at least one first connected domain can be implemented in the following manner:
  • the texture width is determined by the following formula:
  • F2 represents the texture width
  • F1 represents the texture depth
  • S1 represents the area of the second connected domain
  • L1 represents the length of the outer contour of the second connected domain
  • L0 represents the sum of the lengths of the inner contours of the second connected domain
  • the texture width is determined by the following formula:
  • F2 represents the texture width
  • F1 represents the texture depth
  • S1 represents the area of the second connected domain
  • L1 represents the outer contour length of the second connected domain.
  • the above design provides a simple and effective way to extract the texture width.
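The width formulas themselves are not reproduced in this abstract. For a ribbon-shaped region, area divided by half the total contour length approximates the ribbon's width, so one plausible stand-in is F2 = 2*S1 / (L1 + L0); note that the patent's formula also involves the texture depth F1, which is omitted in this sketch:

```python
def texture_width(s1_area, l1_outer, l0_inner=0.0):
    """Hypothetical texture-width estimate F2 from the area S1, the outer
    contour length L1, and (optionally) the summed inner contour lengths L0
    of the second connected domain. Not the patent's exact formula."""
    return 2.0 * s1_area / (l1_outer + l0_inner)
```

For a 10x2 ribbon (area 20, outer contour about 24), this gives a width of roughly 1.7, close to the true value of 2.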
  • extracting the wide texture density from the area where the at least one first connected domain in the N image blocks is located can be implemented in the following manner:
  • extracting the texture density from the area where the at least one first connected domain in the N image blocks is located may be implemented in the following manner: determining the second ratio between the sum of the areas of the first connected domains included in the N image blocks and the sum of the areas of the N image blocks; the second ratio is multiplied by the texture depth to determine the texture density, or the second ratio is determined as the texture density.
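The texture-density computation described above reduces to a small helper (naming is illustrative):

```python
def texture_density(domain_areas, block_area_total, depth=None):
    """Texture density: ratio of the total connected-domain area to the
    total block area, optionally weighted by the texture depth as the
    design describes."""
    ratio = sum(domain_areas) / block_area_total
    return ratio * depth if depth is not None else ratio
```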
  • the extraction of the texture features from the grayscale image may be achieved in the following manner: performing histogram equalization processing on the grayscale image to obtain an equalized image, and extracting the texture feature from the equalized image, which can prevent uneven illumination from affecting the detection result.
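A minimal NumPy sketch of histogram equalization for an 8-bit grayscale image (the standard cumulative-histogram remapping; not taken verbatim from the patent):

```python
import numpy as np

def equalize_hist(gray):
    """Histogram equalization for an 8-bit grayscale image: remap
    intensities through the normalized cumulative histogram so the
    output spreads more evenly over the 0-255 range."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    denom = cdf[-1] - cdf_min
    if denom == 0:
        # Constant image: nothing to equalize.
        return gray.copy()
    lut = np.clip(np.round((cdf - cdf_min) / denom * 255), 0, 255).astype(np.uint8)
    return lut[gray]
```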
  • the determination of the roughness of the skin in the skin image to be processed according to the texture feature can be implemented in the following manner: an ensemble learning algorithm model is used to determine, according to the texture feature, the roughness of the skin in the skin image to be processed.
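The abstract does not specify which ensemble learning algorithm is used. As a stand-in, a minimal bagged ensemble of one-feature threshold regressors over the four texture features could look like this (the training scheme and all names are illustrative, not the patent's model):

```python
import random

def fit_stump(X, y):
    """Fit a one-feature, one-threshold regressor minimizing squared error."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [yi for row, yi in zip(X, y) if row[f] <= t]
            right = [yi for row, yi in zip(X, y) if row[f] > t]
            if not left or not right:
                continue
            lm, rm = sum(left) / len(left), sum(right) / len(right)
            err = sum((yi - lm) ** 2 for yi in left) + \
                  sum((yi - rm) ** 2 for yi in right)
            if best is None or err < best[0]:
                best = (err, f, t, lm, rm)
    if best is None:
        # Degenerate sample (no valid split): predict the mean.
        mean = sum(y) / len(y)
        return lambda row: mean
    _, f, t, lm, rm = best
    return lambda row: lm if row[f] <= t else rm

def fit_ensemble(X, y, n_models=10, seed=0):
    """Bagging: train each stump on a bootstrap sample, average predictions."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        models.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return lambda row: sum(m(row) for m in models) / len(models)
```

Each input row would hold the four texture features (depth, width, wide texture density, texture density), and the target a roughness score.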
  • an embodiment of the present application provides a skin roughness detection device, which includes units respectively used to execute the method according to the first aspect or any one of the designs of the first aspect.
  • an embodiment of the present application provides an electronic device including a processor and a memory, where the processor is coupled with the memory; the memory is used to store program instructions, and the processor is used to read the program instructions stored in the memory to implement the method of the first aspect or any of its possible designs.
  • a computer storage medium provided by an embodiment of the present application stores program instructions which, when run on an electronic device, cause the electronic device to execute the method of the first aspect or any of its possible designs.
  • a computer program product provided by an embodiment of the present application, when run on an electronic device, causes the electronic device to execute the method of the first aspect or any of its possible designs.
  • the sixth aspect is a chip provided by an embodiment of the present application; the chip is coupled with a memory in an electronic device and executes the method of the first aspect or any of its possible designs.
  • Coupled in the embodiments of the present application means that two components are directly or indirectly combined with each other.
  • FIG. 1 is a schematic structural diagram of a terminal device 100 in an embodiment of the application
  • FIG. 2 is a schematic flow chart of a method for detecting the roughness of the back of the hand in an embodiment of the application
  • FIG. 3 is a schematic diagram of a preview user interface in an embodiment of the application.
  • FIG. 4 is a schematic diagram of image blocks used for extracting texture features in an image of the back of the hand in an embodiment of the application;
  • FIG. 5 is a schematic diagram of binarization processing in an embodiment of the application.
  • FIG. 6 is a schematic diagram of connected domains in an embodiment of the application.
  • FIG. 7 and FIG. 8 are schematic diagrams of detection results of the method provided in the embodiments of the application.
  • FIG. 9 is a schematic diagram of a skin roughness detection report in an embodiment of the application.
  • FIG. 10 is a schematic diagram of an electronic device 1000 provided by an embodiment of the application.
  • the embodiment of the present application proposes a skin roughness detection solution, which is suitable for electronic equipment, and the electronic equipment may be a terminal device.
  • the skin roughness detection function provided in the embodiments of the present application may be integrated in one or more applications of the terminal device, for example, integrated in a camera application.
  • the terminal device starts the camera application and displays a viewfinder interface.
  • the viewfinder interface may include a control. When the control is activated, the terminal device can start the skin roughness detection function provided by the embodiment of the present application.
  • the skin roughness detection function provided by the embodiments of the present application can also be integrated in an application dedicated to skin detection in a terminal device.
  • the application for skin detection can not only realize the skin roughness detection function, but also can realize the detection of wrinkles, pores, blackheads, etc. of the facial skin.
  • skin roughness can be neck roughness, facial roughness, or roughness of the back of the hand.
  • the skin test application can also provide the user with a test result report.
  • the detection result report may include, but is not limited to, scores for various features of the back of the hand, a comprehensive analysis of the back of the hand, etc., and corresponding care or treatment suggestions may also be given according to the scores. It is understandable that the detection result report can be presented to the user through a user interface.
  • the terminal device may be a portable terminal device that also contains functions such as a personal digital assistant and/or a music player, for example a mobile phone, a tablet computer, a wearable device with a wireless communication function (such as a smart watch), or a vehicle-mounted device.
  • examples of portable terminal devices include, but are not limited to, portable terminal devices running various operating systems.
  • the aforementioned portable terminal device may also be, for example, a laptop computer with a touch-sensitive surface (such as a touch panel). It should also be understood that in some other embodiments of the present application, the aforementioned terminal device may also be a desktop computer with a touch-sensitive surface (such as a touch panel).
  • the terminal device may also have algorithmic computing capabilities (capable of running the skin roughness detection algorithm provided in the embodiments of the present application) and communication functions, without the need for image acquisition functions.
  • a terminal device receives an image sent by another device, and then runs the skin roughness detection algorithm provided in this embodiment of the application to detect the roughness of the skin in the image.
  • the following description takes as an example a terminal device that itself has both an image collection function and an arithmetic operation function.
  • FIG. 1 shows a schematic structural diagram of a possible terminal device 100.
  • the terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 2, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, and so on.
  • the terminal device 100 in the embodiment of the present application may further include an antenna 1, a mobile communication module 150, a subscriber identification module (SIM) card interface 195, and so on.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in the processor 110 may be a cache memory.
  • the memory can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
  • the processor 110 may run the skin roughness detection algorithm provided in the embodiment of the present application to detect the roughness of the skin on the image.
  • the processor 110 may further include one or more interfaces.
  • the interface may be a universal serial bus (USB) interface 130.
  • the interface can also be an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, etc.
  • in the embodiments of the present application, different modules of the terminal device 100 may be connected through interfaces, so that the terminal device 100 can implement different functions, such as taking and processing pictures. It should be noted that the embodiments of the present application do not limit the connection mode of the interfaces in the terminal device 100.
  • the USB interface 130 is an interface that complies with the USB standard specification.
  • the USB interface 130 may include a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the terminal device 100, and can also be used to transfer data between the terminal device 100 and peripheral devices. It can also be used to connect earphones and play audio through earphones. This interface can also be used to connect to other terminal devices, such as AR devices.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive the wireless charging input through the wireless charging coil of the terminal device 100. While the charging management module 140 charges the battery 142, it can also supply power to the terminal device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the terminal device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the terminal device 100.
  • the mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like.
  • the mobile communication module 150 can receive electromagnetic waves by the antenna 1, filter, amplify, etc. the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic wave radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After the low-frequency baseband signal is processed by the baseband processor, it is passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the terminal device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic wave signals via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 may also receive the signal to be sent from the processor 110, perform frequency modulation, amplify it, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the antenna 1 of the terminal device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the terminal device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a Beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
  • the terminal device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, connected to the display 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, and the like.
  • the display screen 194 includes a display panel.
  • the display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the terminal device 100 may include one or N display screens 194, and N is a positive integer greater than one.
  • the terminal device 100 can implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing and is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the camera 193 may include photosensitive elements such as a lens group and an image sensor, where the lens group includes a plurality of lenses (convex or concave) for collecting the light signal reflected by the object to be photographed (such as the back of the hand) and passing the collected light signal to the image sensor.
  • the image sensor generates an image of the object to be photographed (such as an image of the back of the hand) according to the light signal.
  • the camera 193 can send the image of the back of the hand to the processor 110, and the processor 110 runs the skin roughness detection algorithm provided by the embodiment of the present application to detect the roughness of the back of the hand in the image of the back of the hand.
  • the display screen 194 may display the detection report of the roughness of the back of the hand.
  • the camera 193 shown in FIG. 1 may include 1-N cameras, and the number of cameras is not limited in this application.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the terminal device 100 selects the frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the terminal device 100 may support one or more video codecs. In this way, the terminal device 100 can play or record videos in multiple encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • NPU is a neural-network (NN) computing processor.
  • applications such as intelligent cognition of the terminal device 100 can be realized, such as image recognition, face recognition, voice recognition, text understanding, and so on.
  • the external memory interface 120 may be used to connect an external memory card (for example, a Micro SD card) to expand the storage capacity of the terminal device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the terminal device 100 by running instructions stored in the internal memory 121.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, at least one application program (such as a camera application, a skin detection application, etc.) required by a function, and the like.
  • the data storage area can store data created during the use of the terminal device 100 (for example, images captured by a camera, etc.) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the internal memory 121 may also store the code of the skin roughness detection algorithm provided in the embodiment of the present application.
  • the code of the skin roughness detection algorithm stored in the internal memory 121 is executed by the processor 110, the skin roughness detection function is realized.
  • the code of the skin roughness detection algorithm provided in the embodiment of the present application can also be stored in an external memory.
  • the processor 110 may run the code of the skin roughness detection algorithm stored in the external memory through the external memory interface 120 to implement the corresponding wrinkle detection function.
  • the terminal device 100 can implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, an application processor, and the like. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • the speaker 170A also called “speaker” is used to convert audio electrical signals into sound signals.
  • the terminal device 100 can listen to music or listen to a hands-free call through the speaker 170A.
  • the receiver 170B also called “earpiece” is used to convert audio electrical signals into sound signals.
  • the terminal device 100 answers a call or voice message, it can receive the voice by bringing the receiver 170B close to the human ear.
• the microphone 170C, also called "mic", is used to convert sound signals into electrical signals.
• when making a sound, the user can speak close to the microphone 170C to input the sound signal into the microphone 170C.
  • the terminal device 100 may be provided with at least one microphone 170C.
  • the terminal device 100 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals.
  • the terminal device 100 may also be provided with three, four or more microphones 170C to realize sound signal collection, noise reduction, sound source identification, and directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
• the earphone interface 170D can be a USB interface 130, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface, etc.
  • the sensor module 180 includes an ambient light sensor 180L.
  • the sensor module 180 may also include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, Bone conduction sensor 180M and so on.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be provided on the display screen 194.
  • the capacitive pressure sensor may include at least two parallel plates with conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the terminal device 100 determines the strength of the pressure according to the change in capacitance.
  • the terminal device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the terminal device 100 may also calculate the touched position based on the detection signal of the pressure sensor 180A.
  • touch operations that act on the same touch position but have different touch operation strengths may correspond to different operation instructions. For example, when a touch operation whose intensity of the touch operation is less than the first pressure threshold is applied to the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
  • the gyro sensor 180B may be used to determine the movement posture of the terminal device 100.
  • the air pressure sensor 180C is used to measure air pressure.
  • the magnetic sensor 180D includes a Hall sensor.
  • the terminal device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip holster.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the terminal device 100 in various directions (generally three axes).
  • Distance sensor 180F used to measure distance.
  • the terminal device 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the terminal device 100 may use the distance sensor 180F to measure the distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode.
  • the ambient light sensor 180L is used to sense the brightness of the ambient light.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the terminal device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
  • the temperature sensor 180J is used to detect temperature.
  • Touch sensor 180K also called "touch panel”.
  • the touch sensor 180K may be disposed on the display screen 194, and the touch screen is composed of the touch sensor 180K and the display screen 194, which is also called a “touch screen”.
  • the touch sensor 180K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the terminal device 100, which is different from the position of the display screen 194.
  • the button 190 may include a power-on button, a volume button, and the like.
  • the button 190 may be a mechanical button. It can also be a touch button.
  • the terminal device 100 may receive key input, and generate key signal input related to user settings and function control of the terminal device 100.
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations that act on different applications can correspond to different vibration feedback effects.
• touch operations acting on different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191.
• different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.) may also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used to connect to the SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact and separation with the terminal device 100.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the terminal device 100.
  • the terminal device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
• "at least one" refers to one or more, and "multiple" refers to two or more.
• "and/or" describes the association relationship of associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist at the same time, or B exists alone, where A and B may be singular or plural.
  • the character “/” generally indicates that the associated objects before and after are in an “or” relationship.
• "at least one of the following items (a)" or similar expressions refers to any combination of these items, including any combination of a single item (a) or a plurality of items (a).
• "at least one of a, b, or c" can mean: a, b, c, a and b, a and c, b and c, or a, b and c, where a, b, and c can each be single or multiple.
• words such as "first" and "second" are only used for the purpose of distinguishing description, and cannot be understood as indicating or implying relative importance, nor as indicating or implying order.
• the single connected domain involved in the embodiments of this application refers to a region D in which the interior of any simple closed curve belonging to D also belongs to D. A single connected domain can also be described as follows: the area enclosed by any closed curve in D contains only points in D. More plainly, a single connected domain is a region without "holes".
• a multi-connected domain refers to a region B on the complex plane in which a simple closed curve can be drawn whose interior does not entirely belong to B.
  • the following takes the roughness detection of the back of the hand as an example for description.
  • the method for detecting the roughness of the back of the hand may be executed by the terminal device 100, for example, by the processor 110 in the terminal device 100.
  • detecting the roughness of the back of the hand through the terminal device 100 shown in FIG. 1 may be the following process:
• S201: The terminal device 100 acquires a grayscale image of the hand-back image to be processed.
  • S201 can be implemented through the following two steps:
• A1: Acquire the image of the back of the hand to be processed.
  • the function of detecting the roughness of the back of the hand is integrated into the application dedicated to skin detection.
  • the application dedicated to skin detection is called "skin measurement application".
• the skin measurement application may integrate only the skin roughness detection function.
• it may also integrate facial skin detection, such as blackhead detection, pore detection, erythema detection, and so on.
  • the image of the back of the hand is acquired through the skin measurement application, and the skin detection method provided in the embodiment of the present application is executed based on the image of the back of the hand. As shown in FIG. 3, the display screen 194 of the terminal device 100 displays an icon 300 of the skin test application.
  • the terminal device 100 detects the operation on the icon 300, and in response to the operation on the icon 300, displays the user interface 310 of the skin test application on the display screen 194.
  • the user interface 310 of the skin test application includes a detection button 311.
  • the terminal device 100 detects the operation of the detection button 311.
  • the camera 193 is turned on, and the display screen 194 displays the photographing preview interface 320.
  • the photo preview interface 320 is used to display the images collected by the camera 193.
  • the photo preview interface 320 may include a preview area 321, and the preview area 321 is used to display the image collected by the camera 193. It should be understood that the image collected by the camera 193 may be an image of the back of the user's hand.
  • the camera 193 may be a front camera of the terminal device 100 or a rear camera of the terminal device 100.
• in order to improve the picture quality, when the pixel count of the front camera is lower than that of the rear camera, the camera 193 is the rear camera of the terminal device 100.
  • the terminal device 100 automatically photographs the image collected by the camera 193 to obtain the back image of the hand when the ambient light meets the photographing conditions.
  • the detection button 311 in the embodiment of the present application may also be referred to as a photographing button, or other names, and the embodiment of the present application does not limit the name of the detection button 311.
  • the image of the back of the hand may also be an image already stored in the terminal device 100, for example, stored in the internal memory 121, so that the terminal device 100 obtains the image of the back of the hand from the internal memory 121.
  • it is stored in an external memory, so that the terminal device 100 obtains the image of the back of the hand from the external memory through the external memory interface 120.
• A2: After acquiring the image of the back of the hand, the terminal device 100 converts it into a grayscale image.
• the image of the back of the hand collected by the terminal device 100 through the camera is a color image, which is converted into a grayscale image in A2.
• S202: The terminal device 100 extracts texture features from the grayscale image, where the texture features include at least one of texture depth, texture width, wide texture density, and texture density.
• the texture depth is used to characterize the depth of the lines on the back of the hand
• the texture width is used to characterize the width of the lines on the back of the hand
• the wide texture density is used to characterize the density of the lines on the back of the hand whose line width reaches a preset threshold
• the texture density is used to characterize the density of the lines in the back of the hand.
• S203: The terminal device 100 determines the roughness of the back of the hand in the hand-back image to be processed according to the texture features.
  • the terminal device 100 when it extracts texture features in a grayscale image, it may first preprocess the grayscale image, and then extract the texture feature from the preprocessed image. Preprocessing, for example, can be histogram equalization processing, which can prevent uneven illumination from affecting the extraction of texture features.
  • the grayscale image can be enlarged, reduced, or segmented.
• the grayscale image can be divided into multiple image blocks, and then several image blocks with more obvious texture features are selected from the multiple image blocks for the subsequent texture feature extraction.
  • the process of preprocessing the grayscale image is as follows:
• B1: The terminal device 100 performs histogram equalization processing on the grayscale image to obtain an equalized image.
• B2: The terminal device 100 divides the equalized image into K image blocks, for example, into 10×10 image blocks.
• B3: The terminal device 100 sorts the K image blocks by average gray value and obtains the N image blocks whose ranks fall within a preset ranking range, for example, the 9 image blocks ranked 20th to 28th by average gray value, as shown in Fig. 4.
• in addition to the method shown in B3, the terminal device 100 may obtain the N image blocks from the K image blocks by selecting the image blocks whose average gray value falls within a preset range. If there are fewer than N such image blocks, the actual number of image blocks can be used as the basis for the subsequent texture feature extraction. If there are more than N such image blocks, several image blocks can be removed at random so that the number reaches N; of course, the actual number of image blocks exceeding N can also be used as the basis for the subsequent texture feature extraction.
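For illustration, the block partition and ranking-based selection described above can be sketched in numpy as follows; the 10×10 grid and the 20th–28th ranking range follow the example in the text, while the function name and argument names are assumptions of this sketch:

```python
import numpy as np

def select_blocks(gray, grid=10, rank_start=19, rank_stop=28):
    # Divide the (equalized) grayscale image into grid x grid image blocks.
    h, w = gray.shape
    bh, bw = h // grid, w // grid
    blocks = [gray[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
              for r in range(grid) for c in range(grid)]
    # Sort block indices by average gray value (ascending) and keep the
    # blocks whose rank lies in the preset ranking range, e.g. the
    # 9 blocks ranked 20th to 28th (0-based indices 19..27).
    order = np.argsort([float(b.mean()) for b in blocks])
    return [blocks[i] for i in order[rank_start:rank_stop]]
```

With the defaults, N = 9 blocks of equal size are returned for the subsequent texture feature extraction.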
• the terminal device 100 extracts the texture feature from the N image blocks, where K and N are both positive integers and N is less than or equal to K.
  • the terminal device 100 performs binarization processing on the N image blocks respectively to obtain N binarized images.
• when the N image blocks are binarized to obtain the N binarized images, the N image blocks may first be filtered to smooth and denoise them.
• the filtering method can be mean filtering, median filtering, Gaussian filtering, bilateral filtering, or other methods. Taking mean filtering as an example, a 91×91 blur kernel can be used for the mean filtering.
• binarization is then performed on the filtered N image blocks to obtain N binarized images. After the binarization processing, the gray value of the texture area is set to 255 and the gray value of the other areas is set to 0. For example, see FIG. 5, which is a schematic diagram of an image block after binarization.
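The filtering-plus-binarization step can be sketched as follows in numpy. Note the embodiment only specifies the output convention (texture pixels at 255, the rest at 0); the naive box filter and the "darker than local mean" threshold rule below are assumptions of this sketch, and a 91×91 kernel would be used in practice rather than the small default:

```python
import numpy as np

def mean_filter(gray, k):
    # Naive mean (box) filter with edge padding; k is the odd kernel size.
    pad = k // 2
    padded = np.pad(gray.astype(float), pad, mode='edge')
    out = np.empty(gray.shape, dtype=float)
    for i in range(gray.shape[0]):
        for j in range(gray.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

def binarize(gray, k=9):
    # Assumed threshold rule: pixels darker than their local mean are
    # treated as texture (lines on the back of the hand are darker than
    # the surrounding skin) and set to 255; all other pixels are set to 0.
    blurred = mean_filter(gray, k)
    return np.where(gray < blurred, 255, 0).astype(np.uint8)
```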
• the terminal device 100 performs connected domain analysis on the N binarized images to obtain at least one first connected domain, where the at least one first connected domain is used to indicate the position of the skin texture area in the N image blocks.
• when performing connected domain analysis on the N binarized images to obtain the at least one first connected domain, the N binarized images may first be eroded and/or dilated, and the connected domain analysis is then performed on the eroded and/or dilated N binarized images to obtain the at least one first connected domain.
• a 5×5 erosion kernel may be used to perform erosion processing on the N binarized images respectively, and connected domain analysis is then performed on the N binarized images after the erosion processing to obtain the at least one first connected domain.
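As an illustrative sketch of the erosion and connected domain analysis above (pure numpy/stdlib, 4-connectivity assumed; in practice an image-processing library would be used):

```python
import numpy as np
from collections import deque

def erode(binary, k=3):
    # Morphological erosion with a k x k all-ones kernel: a pixel stays
    # foreground (255) only if every pixel under the kernel is foreground.
    pad = k // 2
    padded = np.pad(binary, pad, mode='constant')
    out = np.zeros_like(binary)
    for i in range(binary.shape[0]):
        for j in range(binary.shape[1]):
            if padded[i:i + k, j:j + k].min() == 255:
                out[i, j] = 255
    return out

def connected_domains(binary):
    # 4-connected flood fill; returns one list of pixel coordinates per
    # connected domain of the 255-valued texture area.
    seen = np.zeros(binary.shape, dtype=bool)
    domains = []
    for i in range(binary.shape[0]):
        for j in range(binary.shape[1]):
            if binary[i, j] == 255 and not seen[i, j]:
                q, domain = deque([(i, j)]), []
                seen[i, j] = True
                while q:
                    y, x = q.popleft()
                    domain.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < binary.shape[0] and 0 <= nx < binary.shape[1]
                                and binary[ny, nx] == 255 and not seen[ny, nx]):
                            seen[ny, nx] = True
                            q.append((ny, nx))
                domains.append(domain)
    return domains
```

Each returned domain corresponds to one first connected domain, whose pixel count gives its area directly.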
  • the terminal device 100 extracts the texture feature from the area where the at least one first connected domain is located in the N image blocks.
  • the following exemplarily describes the implementation of determining the texture depth, texture width, wide texture density, or texture density in the texture feature.
• the terminal device 100 may determine the texture depth according to the average gray value of the area where the at least one first connected domain in the N image blocks is located and the average gray value of the N image blocks.
• the texture depth can be determined by the following formula (1): F1 = M1 / M, where F1 represents the texture depth, M1 represents the average gray value of the area where the at least one first connected domain in the N image blocks is located, and M represents the average gray value of the N image blocks.
  • the terminal device 100 may extract the texture depth of each of the N image blocks, and then use the average value of the texture depths of the N image blocks as the texture depth of the grayscale image.
  • the first image block is any one of the N image blocks.
• the texture depth of the first image block can be determined by the following formula (2): E1 = X / X1, where E1 represents the texture depth of the first image block, X represents the average gray value of the area where the first connected domain in the first image block is located, and X1 represents the average gray value of the first image block.
• when the terminal device 100 extracts the texture depth of the grayscale image, it may also regard the N image blocks as a whole, directly determine the average gray value of the area where the first connected domains of the N image blocks are located and the average gray value of the N image blocks, and thereby determine the texture depth of the grayscale image based on the above formula (1).
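Reading formula (1) as the gray-value ratio M1 / M (consistent with the ratio-based description of texture depth given later for the unsegmented variant), the computation can be sketched as follows; the function and argument names are assumptions of this sketch:

```python
import numpy as np

def texture_depth(blocks, masks):
    # blocks: list of grayscale image blocks (numpy arrays); masks:
    # matching boolean arrays marking the first-connected-domain
    # (texture) pixels in each block.
    # F1 = M1 / M: M1 is the average gray value over the texture area,
    # M is the average gray value over all N image blocks.
    texture_pixels = np.concatenate([b[m] for b, m in zip(blocks, masks)])
    all_pixels = np.concatenate([b.ravel() for b in blocks])
    return float(texture_pixels.mean() / all_pixels.mean())
```

Because texture lines are darker than the surrounding skin, M1 < M, so deeper lines yield a smaller ratio.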
  • the terminal device 100 may determine the texture width according to the length and area of the outer contour of the second connected domain in the at least one first connected domain.
  • the second connected domain is the first connected domain with the longest outer contour length or the largest area in the at least one first connected domain.
• the texture width can be determined by the following formula (3) or formula (4): F2 = F1 × S1 / (L1 + L0) (3), or F2 = S1 / (L1 + L0) (4), where F2 represents the texture width, F1 represents the texture depth, S1 represents the area of the second connected domain, L1 represents the length of the outer contour of the second connected domain, and L0 represents the sum of the lengths of the inner contours of the second connected domain.
• the white area is the second connected domain
• L1 is the length of the outer ring of the white area
• L0 is equal to the length of the inner ring of the white area, that is, the length of the black ellipse surrounded by the white area.
• the area of the second connected domain is the area of the white area, that is, the area of the ring area.
  • the white area is the second connected domain
• L1 is the length of the outer ring of the white area
  • L0 is equal to the sum of the lengths of the two black ellipses surrounded by the white area, that is, the sum of L3 and L4.
• the texture width can be determined by the following formula (5) or formula (6): F2 = F1 × S1 / L1 (5), or F2 = S1 / L1 (6), where F2 represents the texture width, F1 represents the texture depth, S1 represents the area of the second connected domain, and L1 represents the length of the outer contour of the second connected domain.
• S1 is the area of the white region, and L1 is the outer contour perimeter of the white region.
• the terminal device 100 may determine, from the at least one first connected domain, at least one third connected domain whose outer contour length is greater than a preset threshold, and then determine the K image blocks containing the third connected domain among the N image blocks; it then separately determines the area sum of the third connected domains included in each of the K image blocks, and divides each area sum by the area sum of the N image blocks to obtain K ratios. Finally, the average value of the K ratios multiplied by the above texture depth is determined as the wide texture density, or the average of the K ratios is determined as the wide texture density.
• the wide texture density can be determined by the following formula (7) or formula (8): F3 = F1 × S2 / S (7), or F3 = S2 / S (8), where F3 represents the wide texture density, and S2 represents the area sum of the third connected domains included in the K image blocks.
• for example, suppose there are 3 image blocks including the third connected domain, which are image block 1, image block 2, and image block 3.
• image block 1 includes 2 third connected domains
• image block 2 includes 1 third connected domain
• image block 3 includes 3 third connected domains
• S2 is then the sum of the areas of the 2 third connected domains in image block 1, the 1 third connected domain in image block 2, and the 3 third connected domains in image block 3, that is, the sum of the areas of the 6 third connected domains.
  • S represents the area sum of N image blocks.
• the texture density can be determined by the following formula (9) or formula (10): F4 = F1 × S3 / S (9), or F4 = S3 / S (10), where F4 represents the texture density, S3 represents the sum of the areas of the first connected domains included in the N image blocks, and S is the sum of the areas of the N image blocks.
• for example, suppose N is 4, and the blocks are image block 1 to image block 4.
• image block 1 includes 2 first connected domains
• image block 2 includes 3 first connected domains
• image block 3 includes 3 first connected domains
• image block 4 includes 1 first connected domain
• S3 is then the sum of the areas of the 2 first connected domains in image block 1, the 3 first connected domains in image block 2, the 3 first connected domains in image block 3, and the 1 first connected domain in image block 4, that is, the sum of the areas of the 9 first connected domains.
• S represents the area sum of the N image blocks.
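Reading formulas (8) and (10) as the area ratios S2 / S and S3 / S (and formulas (7) and (9) as those ratios scaled by the texture depth F1), the two densities can be sketched over connected domains represented as pixel-coordinate lists; the function and argument names are assumptions of this sketch:

```python
def texture_density(first_domains, total_area, depth=None):
    # F4 = S3 / S (formula (10)): S3 is the summed pixel area of the
    # first connected domains, S the summed area of the N image blocks.
    # Passing depth multiplies the ratio by F1 (formula (9)).
    s3 = sum(len(d) for d in first_domains)
    ratio = s3 / total_area
    return ratio * depth if depth is not None else ratio

def wide_texture_density(first_domains, outer_lens, total_area, thresh, depth=None):
    # Keep only the "third connected domains": those whose outer contour
    # length exceeds the preset threshold, then take the area ratio
    # (formula (8)), optionally scaled by the texture depth (formula (7)).
    s2 = sum(len(d) for d, l in zip(first_domains, outer_lens) if l > thresh)
    ratio = s2 / total_area
    return ratio * depth if depth is not None else ratio
```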
• the texture feature of the grayscale image may also be determined in the following manner: perform mean filtering on the grayscale image, perform binarization processing on the mean-filtered grayscale image, and perform connected domain analysis on the binarized image to obtain at least one connected domain; then extract the texture feature from the area where the at least one connected domain is located. For example, when determining the texture depth, the ratio of the average gray value at the connected domain positions in the grayscale image to the average gray value of the grayscale image can be used as the texture depth. When determining the texture width, it can be determined according to the connected domain with the largest area or the longest outer contour length among the at least one connected domain; for a specific determination method, refer to formula (3) or formula (4).
• a connected domain whose outer contour length is greater than a preset threshold can be determined from the at least one connected domain, and the ratio between the area sum of the connected domains whose outer contour length is greater than the preset threshold and the area of the grayscale image is determined as the wide texture density; alternatively, that ratio multiplied by the above texture depth is determined as the wide texture density.
• the ratio between the area sum of the at least one connected domain and the area of the grayscale image may be determined as the texture density, or that ratio multiplied by the above texture depth may be determined as the texture density.
  • the terminal device 100 may use an integrated learning algorithm model to determine the roughness of the skin in the skin image to be processed according to the texture feature.
• the integrated learning algorithm can be, for example, AdaBoost, Boosting, or bagging (bootstrap aggregating).
  • the ensemble learning algorithm model can be trained in the following way:
• multiple "skin experts" respectively determine the roughness of the back of the hands of each person in a crowd, that is, score each back of the hand, using a 100-point, 10-point, or 1-point scale.
  • the average score of a person by multiple skin experts is used as a label for the roughness of the back of a person's hand.
  • the preset integrated learning algorithm model is trained, and the integrated learning algorithm model obtained through training can be used as a model for detecting the roughness of the skin in the skin image to be processed in the embodiment of the present application.
• using the skin expert scoring system and AdaBoost, a correlation coefficient of 0.88 was obtained on the test set through cross-validation, which preliminarily verifies the feasibility of the algorithm.
  • the test results are shown in Figure 7 and Figure 8.
• FIG. 7 shows the test result of the method that divides the image into image blocks
• FIG. 8 shows the test result of the method that does not divide the image into image blocks.
  • the x-axis in Figures 7 and 8 represents the scoring result of the test set data by the skin expert, and the y-axis is the model test result.
• the lines in Figure 7 and Figure 8 are the ideal fitting results; the closer the black point set is to the line, the higher the correlation.
• the correlation test result in Fig. 7 is 0.88
• the correlation test result in Fig. 8 is 0.73.
• the method of segmenting image blocks therefore performs better than the method without segmenting image blocks.
  • the terminal device 100 displays the roughness value (that is, the scoring result) on the display screen, and can also display care suggestions for the back of the hand. For example, see Figure 9.
  • the method provided in the embodiments of the present application is introduced from the perspective of an electronic device as an execution subject.
  • the electronic device may include a hardware structure and/or a software module, and realize the above functions in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether a certain function of the above-mentioned functions is executed by a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application and design constraint conditions of the technical solution.
  • FIG. 10 shows an electronic device 1000 provided in this application.
  • the electronic device 1000 includes at least one processor 1010 and a memory 1020, and may also include a display screen 1030 and a camera 1040.
  • the processor 1010 is coupled with the memory 1020, the display screen 1030, and the camera 1040.
• the coupling in the embodiment of the present application is an indirect coupling or communication connection between devices, units, or modules, which may be electrical, mechanical, or in other forms, and is used for information exchange between the devices, units, or modules.
  • the memory 1020 is used to store program instructions
  • the camera 1040 is used to capture images
  • the display screen 1030 is used to display a photo preview interface when the camera 1040 starts shooting.
  • the photo preview interface includes the images collected by the camera 1040.
  • the display screen 1030 may also be used to display the user interfaces involved in the above embodiments, such as the user interface shown in FIG. 3, the interface shown in FIG. 9, and so on.
  • The processor 1010 is configured to call and execute the program instructions stored in the memory 1020 to perform the steps of the skin roughness detection method described above.
  • The electronic device 1000 can be used to implement the skin roughness detection method shown in FIG. 2 in the embodiments of this application; for related features, reference may be made to the description above, and details are not repeated here.
  • The embodiments of this application may be implemented by hardware, firmware, or a combination of the two.
  • The above-mentioned functions may be stored on a computer-readable medium or transmitted as one or more instructions or code on a computer-readable medium.
  • the computer-readable medium includes a computer storage medium and a communication medium, where the communication medium includes any medium that facilitates the transfer of a computer program from one place to another.
  • the storage medium may be any available medium that can be accessed by a computer.
  • Computer-readable media may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • Any connection may properly be termed a computer-readable medium.
  • Disks and discs include compact discs (CDs), laser discs, optical discs, digital video discs (DVDs), floppy disks, and Blu-ray discs. Disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Quality & Reliability (AREA)
  • Radiology & Medical Imaging (AREA)
  • Dermatology (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

A skin roughness measurement method. A skin roughness measurement result is obtained by applying a machine learning model to the texture depth, texture width, wide-texture density, and texture density extracted from the skin to be measured, thereby yielding a more accurate and more intuitive skin roughness measurement result.

Description

Skin roughness detection method and electronic device
Cross-reference to related applications
This application claims priority to the Chinese patent application No. 201910882738.8, filed with the Chinese Patent Office on September 18, 2019 and entitled "Skin roughness detection method and electronic device", the entire content of which is incorporated herein by reference.
Technical field
The embodiments of this application relate to the field of image processing technologies, and in particular, to a skin roughness detection method and an electronic device.
Background
Under the influence of age, disease, the external environment, and other factors, the surface of human skin readily develops ridges (peaks) and furrows (valleys) of varying depth and orientation, forming a wide variety of skin textures. Skin roughness, as an important means of skin texture analysis, can reflect the health of bodily functions to a certain extent.
At present, professional detection devices are generally used to detect and evaluate skin conditions by analyzing skin images. Such devices are expensive and are typically used by professional institutions, making them unsuitable for the general public. With the improvement of the imaging capabilities of mobile terminals, skin detection based on a mobile terminal has become possible, so that the general public can use a mobile terminal to detect the condition of their skin. However, there is currently no effective way to detect skin roughness that is suitable for mobile terminals.
Summary
The embodiments of this application provide a skin roughness detection method and an electronic device, which are used to detect the roughness of skin by means of an electronic device.
According to a first aspect, an embodiment of this application provides a skin roughness detection method, applied to an electronic device or to a chip in an electronic device. The method includes: acquiring a grayscale image of a skin image to be processed; then extracting texture features from the grayscale image, the texture features including at least one of a texture depth, a texture width, a wide-texture density, and a texture density, where the texture depth characterizes the depth of the lines on the skin, the texture width characterizes the width of the lines on the skin, the wide-texture density characterizes the density in the skin of lines whose width reaches a preset threshold, and the texture density characterizes the density of lines in the skin; and further determining the roughness of the skin in the skin image to be processed according to the texture features. In the above solution, the electronic device determines the roughness of the skin based on texture features. The method is easy to use, does not rely on a simple parameter threshold to judge roughness, and offers strong stability and high accuracy.
In a possible design, extracting the texture features from the grayscale image may be implemented as follows: dividing the grayscale image into K image blocks, obtaining from the K image blocks N image blocks whose mean gray values fall within a preset grayscale range, and extracting the texture features from the N image blocks, where K and N are both positive integers and N is less than or equal to K; or, dividing the grayscale image into K image blocks, sorting the K image blocks by mean gray value, obtaining the N image blocks ranked within a preset ranking range, and extracting the texture features from the N image blocks. By selecting a few image blocks from the many available for feature extraction, this design reduces the influence of ambient light on the extracted texture features and improves accuracy.
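By way of illustration only (not part of the claimed embodiments), the block-selection step just described can be sketched in plain Python; the block size, gray range, and N below are illustrative assumptions, not values fixed by this application.

```python
def split_into_blocks(gray, size):
    """Split a 2D grayscale image (list of pixel rows) into size x size blocks."""
    h, w = len(gray), len(gray[0])
    return [
        [row[x:x + size] for row in gray[y:y + size]]
        for y in range(0, h - size + 1, size)
        for x in range(0, w - size + 1, size)
    ]

def block_mean(block):
    """Mean gray value of one block."""
    return sum(sum(row) for row in block) / (len(block) * len(block[0]))

def select_blocks(gray, size=2, lo=60, hi=200, n=3):
    """Keep at most n blocks whose mean gray value lies in [lo, hi],
    discarding over- and under-exposed blocks (ambient-light robustness)."""
    kept = [b for b in split_into_blocks(gray, size) if lo <= block_mean(b) <= hi]
    kept.sort(key=block_mean)
    return kept[:n]
```

Selecting by a preset gray range (first branch of the design) and selecting by rank (second branch) differ only in whether the filter or the sort decides membership.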
In a possible design, extracting the texture features from the N image blocks may be implemented as follows: binarizing the N image blocks respectively to obtain N binarized images; performing connected domain analysis on the N binarized images to obtain at least one first connected domain, the at least one first connected domain indicating the position of the skin's line regions within the N image blocks; and extracting the texture features from the regions of the N image blocks where the at least one first connected domain is located. This design provides a simple and effective way to extract texture features.
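A minimal sketch of the binarization and connected-domain steps follows; the fixed threshold and 4-connectivity are assumptions for illustration (the embodiments do not fix either choice).

```python
from collections import deque

def binarize(block, threshold=128):
    """Mark pixels darker than the threshold (candidate skin lines) as 1."""
    return [[1 if p < threshold else 0 for p in row] for row in block]

def connected_domains(binary):
    """Return the 4-connected foreground regions as lists of (y, x) pixels."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    domains = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                seen[y][x] = True
                queue, region = deque([(y, x)]), []
                while queue:
                    cy, cx = queue.popleft()
                    region.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                domains.append(region)
    return domains
```

Each returned region corresponds to one "first connected domain", i.e. one candidate line region in a block.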
In a possible design, binarizing the N image blocks respectively to obtain N binarized images includes: filtering the N image blocks respectively, and binarizing the filtered N image blocks to obtain the N binarized images. Filtering before binarization smooths and denoises the images.
In a possible design, performing connected domain analysis on the N binarized images to obtain the at least one first connected domain includes: performing erosion and/or dilation on the N binarized images, and performing connected domain analysis on the eroded and/or dilated N binarized images to obtain the at least one first connected domain.
Applying erosion and/or dilation to the binarized images makes the determined skin lines clearer and more accurate.
In a possible design, the texture depth may be determined according to the mean gray value of the region of the N image blocks where the at least one first connected domain is located and the mean gray value of the N image blocks.
In a possible design, the texture depth may satisfy the following formula:
F1 = abs(M - M1) / M;
where F1 represents the texture depth, M1 represents the mean gray value of the region of the N image blocks where the at least one first connected domain is located, and M represents the mean gray value of the N image blocks. This design provides a simple and effective way to extract the texture depth.
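The texture-depth formula above translates directly to code; the two gray means passed in below are hypothetical inputs.

```python
def texture_depth(m_blocks, m_domain):
    """F1 = abs(M - M1) / M, where M is the mean gray value of the N image
    blocks and M1 the mean gray value over the first-connected-domain pixels."""
    return abs(m_blocks - m_domain) / m_blocks
```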
In a possible design, extracting the texture width from the region of the N image blocks where the at least one first connected domain is located includes: determining the texture width according to the outer contour length and the area of a second connected domain among the at least one first connected domain, where the second connected domain is the first connected domain with the longest outer contour or the largest area. This design provides a simple and effective way to extract the texture width.
In a possible design, determining the texture width according to the outer contour length and the area of the second connected domain may be implemented as follows:
When the second connected domain is a multiply connected domain, the texture width is determined by the following formula:
F2 = F1 × S1 / (L1 + L0); or, F2 = S1 / (L1 + L0);
where F2 represents the texture width, F1 represents the texture depth, S1 represents the area of the second connected domain, L1 represents the outer contour length of the second connected domain, and L0 represents the sum of the lengths of the inner contours of the second connected domain.
When the second connected domain is a simply connected domain, the texture width is determined by the following formula:
F2 = F1 × S1 / L1; or, F2 = S1 / L1;
where F2 represents the texture width, F1 represents the texture depth, S1 represents the area of the second connected domain, and L1 represents the outer contour length of the second connected domain.
This design provides a simple and effective way to extract the texture width.
In a possible design, extracting the wide-texture density from the region of the N image blocks where the at least one first connected domain is located may be implemented as follows:
Determining, from the at least one first connected domain, at least one third connected domain whose outer contour length is greater than a preset threshold; determining, among the N image blocks, the K image blocks that contain the second connected domain; determining a first ratio between the sum of the areas of the third connected domains included in the K image blocks and the sum of the areas of the N image blocks; and determining the product of the first ratio and the texture depth as the wide-texture density, or determining the first ratio itself as the wide-texture density. This design provides a simple and effective way to extract the wide-texture density.
In a possible design, extracting the texture density from the region of the N image blocks where the at least one first connected domain is located may be implemented as follows: determining a second ratio between the sum of the areas of the first connected domains included in the N image blocks and the sum of the areas of the N image blocks; and determining the product of the second ratio and the texture depth as the texture density, or determining the second ratio itself as the texture density. This design provides a simple and effective way to extract the texture density.
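Both density features described above reduce to the same area ratio, optionally weighted by the texture depth F1; a single illustrative helper covers the wide-texture density (pass the third-connected-domain areas) and the texture density (pass all first-connected-domain areas).

```python
def area_ratio_density(domain_areas, block_areas, f1=None):
    """Ratio of summed connected-domain areas to summed image-block areas;
    multiply by the texture depth f1 for the weighted variant."""
    ratio = sum(domain_areas) / sum(block_areas)
    return f1 * ratio if f1 is not None else ratio
```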
In a possible design, extracting the texture features from the grayscale image may be implemented as follows: performing histogram equalization on the grayscale image to obtain an equalized image, and extracting the texture features from the equalized image. This prevents uneven illumination from affecting the detection result.
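A plain 8-bit histogram-equalization sketch of the preprocessing step named above, with no imaging library; mapping a uniform image to zero is a convention chosen here, not one fixed by the embodiments.

```python
def equalize(gray):
    """Histogram equalization for an 8-bit grayscale image (list of rows)."""
    flat = [p for row in gray for p in row]
    hist = [0] * 256
    for p in flat:
        hist[p] += 1
    cdf, total = [], 0
    for count in hist:          # cumulative distribution of gray levels
        total += count
        cdf.append(total)
    n = len(flat)
    cdf_min = next(c for c in cdf if c > 0)   # first non-zero CDF value
    span = n - cdf_min
    lut = [round((c - cdf_min) / span * 255) if span else 0 for c in cdf]
    return [[lut[p] for p in row] for row in gray]
```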
In a possible design, determining the roughness of the skin in the skin image to be processed according to the texture features may be implemented as follows: determining the roughness of the skin in the skin image to be processed from the texture features by using an ensemble learning algorithm model.
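The embodiments name an ensemble-learning model without fixing a particular one; as a minimal stand-in, the helper below averages several base regressors over the feature vector [texture depth, texture width, wide-texture density, texture density]. The lambda base models are placeholders, not trained regressors.

```python
def ensemble_score(features, base_models):
    """Average the roughness predictions of several base regressors."""
    return sum(model(features) for model in base_models) / len(base_models)

# Placeholder base models standing in for trained regressors.
models = [lambda f: sum(f), lambda f: 2 * sum(f)]
```

In practice the base models would be fitted to expert-scored data, e.g. with bagged or boosted trees.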
According to a second aspect, an embodiment of this application provides a skin roughness detection apparatus, including units respectively configured to perform the method according to the first aspect or any design of the first aspect.
According to a third aspect, an embodiment of this application provides an electronic device, including a processor and a memory, the processor being coupled with the memory, where the memory is configured to store program instructions, and the processor is configured to read the program instructions stored in the memory to implement the method of the first aspect and any of its possible designs.
According to a fourth aspect, an embodiment of this application provides a computer storage medium that stores program instructions which, when run on an electronic device, cause the electronic device to perform the method of the first aspect and any of its possible designs.
According to a fifth aspect, an embodiment of this application provides a computer program product which, when run on an electronic device, causes the electronic device to perform the method of the first aspect and any of its possible designs.
According to a sixth aspect, an embodiment of this application provides a chip, which is coupled with a memory in an electronic device and performs the method of the first aspect and any of its possible designs.
In addition, for the technical effects brought by the second to sixth aspects, reference may be made to the description of the first aspect above; details are not repeated here.
It should be noted that in the embodiments of this application, "coupling" means that two components are directly or indirectly combined with each other.
Description of the drawings
FIG. 1 is a schematic structural diagram of a terminal device 100 in an embodiment of this application;
FIG. 2 is a schematic flowchart of a method for detecting the roughness of the back of the hand in an embodiment of this application;
FIG. 3 is a schematic diagram of a preview user interface in an embodiment of this application;
FIG. 4 is a schematic diagram of image blocks used for extracting texture features from an image of the back of the hand in an embodiment of this application;
FIG. 5 is a schematic diagram of binarization processing in an embodiment of this application;
FIG. 6 is a schematic diagram of connected domains in an embodiment of this application;
FIG. 7 and FIG. 8 are schematic diagrams of detection results obtained with the method provided in this application;
FIG. 9 is a schematic diagram of a skin roughness detection report in an embodiment of this application;
FIG. 10 is a schematic diagram of an electronic device 1000 provided by an embodiment of this application.
Detailed description
The embodiments of this application propose a skin roughness detection solution suitable for an electronic device, which may be a terminal device. The skin roughness detection function provided in the embodiments of this application may be integrated into one or more applications of the terminal device, for example into a camera application. Taking the camera application as an example, the terminal device starts the camera application and displays a viewfinder interface, which may include a control; when the control is activated, the terminal device can start the skin roughness detection function provided in the embodiments of this application. The skin roughness detection function may also be integrated into an application on the terminal device dedicated to skin detection. As an example, an application for skin detection may implement not only the skin roughness detection function but also the detection of wrinkles, pores, blackheads, and so on in facial skin. The skin roughness may be the roughness of the neck, the face, or the back of the hand. After the skin detection is completed, the skin detection application may also provide the user with a detection result report. Taking detection of the roughness of the back of the hand as an example, the report may include, but is not limited to, scores for the various features of the back of the hand and a comprehensive analysis of the back of the hand, and may also give corresponding care or treatment suggestions according to the scores. It is understandable that the detection result report may be presented to the user through a user interface.
In some embodiments of this application, the terminal device may be a portable terminal device that also contains functions such as a personal digital assistant and/or a music player, for example a mobile phone, a tablet computer, a wearable device with a wireless communication function (such as a smart watch), or a vehicle-mounted device. Exemplary embodiments of portable terminal devices include, but are not limited to, devices running
Figure PCTCN2020115973-appb-000001
Figure PCTCN2020115973-appb-000002
or other operating systems. The portable terminal device may also be, for example, a laptop computer with a touch-sensitive surface (such as a touch panel). It should also be understood that in some other embodiments of this application, the terminal device may be a desktop computer with a touch-sensitive surface (such as a touch panel).
In other embodiments of this application, the terminal device may have computing capability (being able to run the skin roughness detection algorithm provided in the embodiments of this application) and a communication function, without having an image acquisition function. For example, the terminal device receives an image sent by another device and then runs the skin roughness detection algorithm provided in the embodiments of this application to detect the roughness of the skin in the image. In the following, it is assumed as an example that the terminal device itself has both the image acquisition function and the computing function.
FIG. 1 shows a schematic structural diagram of a possible terminal device 100. The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 2, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, and so on. In some other embodiments, the terminal device 100 in this embodiment of this application may further include an antenna 1, a mobile communication module 150, a subscriber identification module (SIM) card interface 195, and so on.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be independent devices or may be integrated into one or more processors.
In some embodiments, a memory may also be provided in the processor 110 for storing instructions and data. For example, the memory in the processor 110 may be a cache, which holds instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the cache. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system. The processor 110 may run the skin roughness detection algorithm provided in the embodiments of this application to detect the roughness of the skin in an image.
In some other embodiments, the processor 110 may further include one or more interfaces. For example, an interface may be the universal serial bus (USB) interface 130. For another example, an interface may be an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, etc. It can be understood that in the embodiments of this application, different modules of the terminal device 100 may be connected through interfaces, so that the terminal device 100 can implement different functions, for example taking and processing photos. It should be noted that the embodiments of this application do not limit the way the interfaces in the terminal device 100 are connected.
The USB interface 130 is an interface that complies with the USB standard specification. For example, the USB interface 130 may be a Mini USB interface, a Micro USB interface, a USB Type-C interface, and so on. The USB interface 130 can be used to connect a charger to charge the terminal device 100, to transfer data between the terminal device 100 and peripheral devices, or to connect earphones and play audio through them. The interface can also be used to connect other terminal devices, such as AR devices.
The charging management module 140 is configured to receive charging input from a charger, which may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive the charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive wireless charging input through the wireless charging coil of the terminal device 100. While charging the battery 142, the charging management module 140 can also supply power to the terminal device through the power management module 141.
The power management module 141 is configured to connect the battery 142 and the charging management module 140 to the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, the wireless communication module 160, and so on. The power management module 141 may also be used to monitor parameters such as the battery capacity, the number of battery cycles, and the battery health status (leakage, impedance). In some other embodiments, the power management module 141 may also be provided in the processor 110. In still other embodiments, the power management module 141 and the charging management module 140 may be provided in the same device.
The wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and so on.
The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the terminal device 100 can be used to cover a single communication frequency band or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization. For example, the antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In some other embodiments, an antenna can be used in combination with a tuning switch.
The mobile communication module 150 can provide wireless communication solutions applied to the terminal device 100, including 2G/3G/4G/5G. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 can also amplify a signal modulated by the modem processor and convert it into electromagnetic waves for radiation via the antenna 1. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 and at least some of the modules of the processor 110 may be provided in the same device.
The modem processor may include a modulator and a demodulator. The modulator is used to modulate a to-be-sent low-frequency baseband signal into a medium- or high-frequency signal. The demodulator is used to demodulate a received electromagnetic wave signal into a low-frequency baseband signal, and then transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, and the like), or displays an image or a video through the display screen 194. In some embodiments, the modem processor may be an independent device. In some other embodiments, the modem processor may be independent of the processor 110 and disposed in a same device as the mobile communication module 150 or another functional module.
The wireless communication module 160 may provide solutions for wireless communication applied to the terminal device 100, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), a global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives an electromagnetic wave signal via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signal, and sends the processed signal to the processor 110. The wireless communication module 160 may also receive a to-be-sent signal from the processor 110, perform frequency modulation and amplification on it, and convert it into an electromagnetic wave for radiation through the antenna 2.
In some embodiments, the antenna 1 of the terminal device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the terminal device 100 can communicate with a network and other devices through a wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The terminal device 100 implements a display function through the GPU, the display screen 194, the application processor, and the like. The GPU is a microprocessor for image processing and connects the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini-LED, a micro-LED, a micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the terminal device 100 may include one or N display screens 194, where N is a positive integer greater than 1.
The terminal device 100 can implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the photosensitive element of the camera through the lens, and the optical signal is converted into an electrical signal; the photosensitive element of the camera transmits the electrical signal to the ISP for processing, where it is converted into an image visible to the naked eye. The ISP can also perform algorithm optimization on the noise, brightness, and skin tone of the image, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be disposed in the camera 193.
The camera 193 is used to capture still images or videos. Generally, the camera 193 may include photosensitive elements such as a lens group and an image sensor. The lens group includes a plurality of lenses (convex or concave) for collecting the optical signal reflected by a to-be-photographed object (such as the back of a hand) and passing the collected optical signal to the image sensor. The image sensor generates an image of the to-be-photographed object (such as a back-of-hand image) based on the optical signal. Taking a back-of-hand image as an example, after the camera 193 collects the image, it can send the image to the processor 110, and the processor 110 runs the skin roughness detection algorithm provided in the embodiments of this application to detect the roughness of the back of the hand in the image. After the processor 110 determines the roughness, the display screen 194 may display a detection report of the back-of-hand roughness. The camera 193 shown in FIG. 1 may include 1 to N cameras; the number of cameras is not limited in this application.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the terminal device 100 performs frequency selection, the digital signal processor is used to perform a Fourier transform on the frequency-point energy, and so on.
The video codec is used to compress or decompress digital video. The terminal device 100 may support one or more video codecs. In this way, the terminal device 100 can play or record videos in multiple encoding formats, for example, moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example, the transfer mode between neurons in the human brain, it can quickly process input information and can also continuously self-learn. Through the NPU, applications such as intelligent cognition of the terminal device 100 can be implemented, for example, image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 120 may be used to connect an external memory card (for example, a Micro SD card) to expand the storage capacity of the terminal device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function, for example, saving files such as music and videos in the external memory card.
The internal memory 121 may be used to store computer-executable program code, where the executable program code includes instructions. The processor 110 executes various functional applications and data processing of the terminal device 100 by running the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, at least one application required by a function (such as a camera application or a skin detection application), and the like. The data storage area may store data created during use of the terminal device 100 (such as images captured by the camera), and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, for example, at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The internal memory 121 may also store the code of the skin roughness detection algorithm provided in the embodiments of this application. When the code of the skin roughness detection algorithm stored in the internal memory 121 is run by the processor 110, the skin roughness detection function is implemented. Of course, the code of the skin roughness detection algorithm provided in the embodiments of this application may also be stored in an external memory. In this case, the processor 110 may run the code of the skin roughness detection algorithm stored in the external memory through the external memory interface 120 to implement the corresponding skin roughness detection function.
The terminal device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and the like.
The audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal. The audio module 170 can also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "loudspeaker", is used to convert an audio electrical signal into a sound signal. The terminal device 100 can play music or a hands-free call through the speaker 170A.
The receiver 170B, also called an "earpiece", is used to convert an audio electrical signal into a sound signal. When the terminal device 100 answers a call or a voice message, the voice can be heard by bringing the receiver 170B close to the ear.
The microphone 170C, also called a "mic", is used to convert a sound signal into an electrical signal. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal. The terminal device 100 may be provided with at least one microphone 170C. In some other embodiments, the terminal device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In some other embodiments, the terminal device 100 may be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify the sound source, implement a directional recording function, and so on.
The headset jack 170D is used to connect a wired headset. The headset jack 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, a cellular telecommunications industry association of the USA (CTIA) standard interface, or the like.
The sensor module 180 includes an ambient light sensor 180L. In addition, the sensor module 180 may also include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, a bone conduction sensor 180M, and the like.
The pressure sensor 180A is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates with conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the terminal device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the terminal device 100 detects the intensity of the touch operation through the pressure sensor 180A, and may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
The gyroscope sensor 180B may be used to determine the motion posture of the terminal device 100. The barometric pressure sensor 180C is used to measure air pressure. The magnetic sensor 180D includes a Hall sensor; the terminal device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip cover. The acceleration sensor 180E can detect the magnitude of acceleration of the terminal device 100 in various directions (generally along three axes). The distance sensor 180F is used to measure distance; the terminal device 100 can measure distance by infrared or laser. In some embodiments, when shooting a scene, the terminal device 100 may use the distance sensor 180F to measure distance for fast focusing. The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The ambient light sensor 180L is used to sense ambient light brightness. The fingerprint sensor 180H is used to collect fingerprints; the terminal device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint photographing, fingerprint call answering, and so on. The temperature sensor 180J is used to detect temperature.
The touch sensor 180K is also called a "touch panel". The touch sensor 180K may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form a touchscreen, also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor can pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation can be provided through the display screen 194. In some other embodiments, the touch sensor 180K may also be disposed on the surface of the terminal device 100 at a position different from that of the display screen 194. The button 190 may include a power button, a volume button, and the like. The button 190 may be a mechanical button or a touch button. The terminal device 100 may receive a button input and generate a button signal input related to user settings and function control of the terminal device 100.
The motor 191 can generate a vibration prompt. The motor 191 can be used for an incoming-call vibration prompt and for touch vibration feedback. For example, touch operations acting on different applications (such as photographing and audio playback) may correspond to different vibration feedback effects; touch operations acting on different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) may also correspond to different vibration feedback effects. The touch vibration feedback effect can also be customized.
The indicator 192 may be an indicator light, which may be used to indicate the charging status and battery changes, or to indicate messages, missed calls, notifications, and the like.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be inserted into or pulled out of the SIM card interface 195 to make contact with or separate from the terminal device 100.
It can be understood that the structure illustrated in the embodiments of this application does not constitute a specific limitation on the terminal device 100. In other embodiments of this application, the terminal device 100 may include more or fewer components than shown, combine certain components, split certain components, or have a different component arrangement. The illustrated components can be implemented in hardware, software, or a combination of software and hardware.
In addition, it should be noted that in this application, "at least one" refers to one or more, and "multiple" refers to two or more. "And/or" describes the association relationship of the associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist at the same time, or B exists alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects before and after it. "At least one of the following" or similar expressions refers to any combination of these items, including any combination of a single item or a plurality of items. For example, at least one of a, b, or c may mean: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may each be single or multiple. It should also be understood that in the description of this application, words such as "first" and "second" are used only to distinguish the description, and cannot be understood as indicating or implying relative importance or order.
The simply connected domain involved in the embodiments of this application means the following: if the interior of every simple closed curve in a region D belongs to D, then D is called a simply connected region. A simply connected domain can also be described as follows: the area enclosed by any closed curve in D contains only points of D. More colloquially, a simply connected region is a region without "holes". A multiply connected domain is a region B on the complex plane in which some simple closed curve can be drawn whose interior does not entirely belong to B.
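As an illustration (not part of the patent's method), the "holes" criterion above can be checked computationally on a binary mask: a hole is a background component that does not touch the image border. The sketch below uses a 4-connected flood fill; the structuring element and border convention are choices of this example.

```python
import numpy as np
from collections import deque

def count_holes(mask):
    """Count 'holes' in a binary region: 4-connected background
    components that never reach the image border. Zero holes means the
    region is simply connected; one or more means multiply connected."""
    h, w = mask.shape
    visited = np.zeros((h, w), dtype=bool)
    holes = 0
    for r0 in range(h):
        for c0 in range(w):
            if mask[r0, c0] == 0 and not visited[r0, c0]:
                # Flood-fill this background component, recording
                # whether it ever touches the image border.
                touches_border = False
                q = deque([(r0, c0)])
                visited[r0, c0] = True
                while q:
                    r, c = q.popleft()
                    if r in (0, h - 1) or c in (0, w - 1):
                        touches_border = True
                    for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                        if 0 <= nr < h and 0 <= nc < w \
                                and not visited[nr, nc] and mask[nr, nc] == 0:
                            visited[nr, nc] = True
                            q.append((nr, nc))
                if not touches_border:
                    holes += 1
    return holes

ring = np.array([[1, 1, 1],
                 [1, 0, 1],
                 [1, 1, 1]])         # an annulus: one hole -> multiply connected
solid = np.ones((3, 3), dtype=int)   # no holes -> simply connected
print(count_holes(ring), count_holes(solid))  # → 1 0
```

In practice, a connected-component labeling routine (e.g. `scipy.ndimage.label`) would replace the hand-rolled flood fill, but the border test is the same.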
To facilitate the description of the skin roughness detection algorithm provided in the embodiments of this application, the following takes back-of-hand roughness detection as an example. The back-of-hand roughness detection method may be executed by the terminal device 100, for example, by the processor 110 in the terminal device 100.
In some embodiments of this application, referring to FIG. 2, detecting back-of-hand roughness through the terminal device 100 shown in FIG. 1 may proceed as follows:
S201: The terminal device 100 acquires a grayscale image of a to-be-processed back-of-hand image. S201 can be implemented through the following two steps:
A1: Acquire the to-be-processed back-of-hand image.
Consider the case where the back-of-hand roughness detection function is integrated into an application dedicated to skin detection; for convenience of description, this application is hereinafter called the "skin measurement application". The skin measurement application may integrate only the skin roughness detection function, or may additionally integrate facial skin detection, such as blackhead detection, pore detection, and erythema detection. The back-of-hand image is acquired through the skin measurement application, and the skin detection method provided in the embodiments of this application is executed based on it. As shown in FIG. 3, the display screen 194 of the terminal device 100 displays an icon 300 of the skin measurement application. When the terminal device 100 detects an operation on the icon 300, it displays, in response, the user interface 310 of the skin measurement application on the display screen 194. The user interface 310 includes a detection button 311. When the terminal device 100 detects an operation on the detection button 311, it turns on the camera 193 in response, and the display screen 194 displays a photo preview interface 320. The photo preview interface 320 is used to display images collected by the camera 193; for example, it may include a preview area 321 for this purpose. It should be understood that the image collected by the camera 193 may be an image of the back of the user's hand. In addition, the camera 193 may be a front camera or a rear camera of the terminal device 100. In some embodiments, to improve photo quality, when the front camera has fewer pixels than the rear camera, the camera 193 is the rear camera of the terminal device 100. To further improve photo quality, the terminal device 100 automatically photographs the image collected by the camera 193 to obtain the back-of-hand image when the ambient light meets the shooting conditions. It should be noted that the detection button 311 in the embodiments of this application may also be called a photographing button or another name; the name of the detection button 311 is not limited in the embodiments of this application.
In another possible example, the back-of-hand image may also be an image already stored in the terminal device 100, for example, in the internal memory 121, so that the terminal device 100 obtains the back-of-hand image from the internal memory 121. As another example, it may be stored in an external memory, so that the terminal device 100 obtains the back-of-hand image from the external memory through the external memory interface 120.
A2: After acquiring the back-of-hand image, the terminal device 100 converts the back-of-hand image into a grayscale image.
The back-of-hand image collected by the terminal device 100 through the camera is a color image; in A2, the color back-of-hand image is converted into a grayscale image.
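The patent does not prescribe a specific color-to-gray conversion for A2. One common choice is the ITU-R BT.601 luma weighting, sketched here with NumPy; the weights are an assumption of this example, not a requirement of the method.

```python
import numpy as np

def to_grayscale(rgb):
    """Convert an H x W x 3 RGB image (uint8) to an 8-bit grayscale
    image using the ITU-R BT.601 luma weights, one common convention
    (also used by many imaging libraries for RGB-to-gray)."""
    weights = np.array([0.299, 0.587, 0.114])
    gray = rgb.astype(np.float64) @ weights
    return np.clip(np.round(gray), 0, 255).astype(np.uint8)

# Pure red, green, and blue pixels map to their respective luma weights:
img = np.array([[[255, 0, 0], [0, 255, 0], [0, 0, 255]]], dtype=np.uint8)
print(to_grayscale(img))  # → [[ 76 150  29]]
```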
S202: The terminal device 100 extracts texture features from the grayscale image, the texture features including at least one of texture depth, texture width, wide-texture density, and texture density.
The texture depth is used to characterize the depth of the lines on the back of the hand, the texture width characterizes the width of the lines, the wide-texture density characterizes the density, within the back of the hand, of lines whose width reaches a preset threshold, and the texture density characterizes the overall density of the lines on the back of the hand.
S203: The terminal device 100 determines the roughness of the back of the hand in the to-be-processed back-of-hand image according to the texture features.
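This excerpt does not fix how the four texture features are mapped to a single roughness value in S203. A minimal sketch, assuming each feature has been normalized to [0, 1] and combined by a hypothetical weighted sum (both the weights and the combination rule are illustrative only), might look like:

```python
def roughness_score(depth, width, wide_density, density,
                    weights=(0.4, 0.2, 0.2, 0.2)):
    """Hypothetical roughness score: a weighted sum of the four texture
    features (texture depth, texture width, wide-texture density,
    texture density), each assumed normalized to [0, 1]. The weights
    are placeholders, not values from the patent."""
    features = (depth, width, wide_density, density)
    return sum(w * f for w, f in zip(weights, features))

# Deeper, wider, denser texture yields a higher (rougher) score:
print(roughness_score(0.8, 0.6, 0.5, 0.7))  # ≈ 0.68
```

In a real implementation the mapping could equally be a lookup table or a learned regressor; the point of the sketch is only that S203 reduces the feature vector to a scalar roughness.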
As an example, when extracting texture features from the grayscale image, the terminal device 100 may first preprocess the grayscale image and then extract the texture features from the preprocessed image. The preprocessing may be, for example, histogram equalization, which prevents uneven illumination from affecting the extraction of texture features, or operations such as enlarging, shrinking, or segmenting the grayscale image. For example, the grayscale image can be divided into multiple image blocks, and then several blocks with more obvious texture features can be selected from them for the subsequent texture feature extraction.
以终端设备100对灰度图像进行直方图均衡化处理以及分割处理为例,对灰度图像进行预处理的过程如下:Taking the terminal device 100 performing histogram equalization processing and segmentation processing on the grayscale image as an example, the process of preprocessing the grayscale image is as follows:
B1: The terminal device 100 performs histogram equalization on the grayscale image to obtain an equalized image.
B2: The terminal device 100 divides the equalized image into K image blocks, for example into 10×10 image blocks.
B3: The terminal device 100 sorts the K image blocks by their mean gray value and obtains the N blocks whose rank falls within a preset range, for example the 9 blocks ranked 20th to 28th by mean gray value, as shown in FIG. 4.
Illustratively, instead of the approach shown in B3, the terminal device 100 may obtain from the K image blocks the N blocks whose mean gray value lies within a preset range. If fewer than N blocks have a mean gray value within the preset range, the actual number of blocks may serve as the basis for the subsequent extraction of texture features. If more than N blocks qualify, several blocks may be removed at random so that N remain; of course, when more than N blocks qualify, the actual number of qualifying blocks may also be used as the basis for the subsequent extraction of texture features.
B4: The terminal device 100 extracts the texture features from the N image blocks, where K and N are both positive integers and N is less than or equal to K.
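Steps B1–B3 above can be sketched as follows, assuming NumPy; the 10×10 grid and the rank range 20–28 mirror the example values in the text rather than mandated parameters, and the function names are illustrative.

```python
import numpy as np

def equalize(img):
    """B1: histogram equalization of an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    lut = np.rint((cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1) * 255)
    return lut.astype(np.uint8)[img]

def select_blocks(img, grid=(10, 10), rank_range=(20, 29)):
    """B2/B3: split the image into grid blocks and keep the N blocks
    whose mean gray value ranks inside rank_range (half-open interval,
    ascending order of mean gray)."""
    gh, gw = grid
    bh, bw = img.shape[0] // gh, img.shape[1] // gw  # drop any remainder
    blocks = [img[r*bh:(r+1)*bh, c*bw:(c+1)*bw]
              for r in range(gh) for c in range(gw)]
    order = np.argsort([b.mean() for b in blocks], kind='stable')
    lo, hi = rank_range
    return [blocks[i] for i in order[lo:hi]]
```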
Extracting texture features from the N image blocks may proceed through the following steps:
C1: The terminal device 100 binarizes each of the N image blocks to obtain N binarized images.
Optionally, before binarization the N image blocks may first be filtered so as to smooth and denoise them. The filtering may be mean filtering, median filtering, Gaussian filtering, bilateral filtering, or the like; taking mean filtering as an example, a 91×91 blur kernel may be used. The filtered N image blocks are then binarized to obtain the N binarized images, with the gray value of the line (texture) regions set to 255 and the gray value of all other regions set to 0. For example, FIG. 5 shows a schematic diagram of an image block after binarization.
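Step C1 can be sketched as below. The mean filter is a box filter, and the 9×9 kernel only keeps the demonstration small, standing in for the 91×91 kernel of the example; marking pixels darker than their local mean as line pixels (value 255) is an assumed reading of the binarization rule, which the text fixes only in terms of its output values.

```python
import numpy as np

def binarize_block(block, k=9, offset=0):
    """C1 sketch: mean-filter a block, then binarize it so that pixels
    darker than the local mean (candidate texture lines) become 255 and
    all other pixels become 0."""
    block = np.asarray(block, dtype=np.float64)
    pad = k // 2
    padded = np.pad(block, pad, mode='edge')
    # box (mean) filter via a sliding-window view
    win = np.lib.stride_tricks.sliding_window_view(padded, (k, k))
    local_mean = win.mean(axis=(2, 3))
    return np.where(block < local_mean - offset, 255, 0).astype(np.uint8)
```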
C2: The terminal device 100 performs connected-domain analysis on the N binarized images to obtain at least one first connected domain, the at least one first connected domain indicating the position of the skin's line regions within the N image blocks.
Optionally, when performing connected-domain analysis on the N binarized images to obtain the at least one first connected domain, the N binarized images may first undergo erosion and/or dilation, with the connected-domain analysis then performed on the eroded and/or dilated images to obtain the at least one first connected domain. For example, a 5×5 erosion kernel may be applied to each of the N binarized images, followed by connected-domain analysis on the eroded images to obtain the at least one first connected domain.
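Step C2, including the optional erosion of this paragraph, can be sketched as follows; the 3×3 structuring element stands in for the 5×5 kernel of the example, and 4-connectivity is an assumption, since the text does not fix a connectivity.

```python
import numpy as np
from collections import deque

def erode(bw, k=3):
    """Binary erosion with a k x k square structuring element."""
    pad = k // 2
    padded = np.pad(bw > 0, pad, mode='constant')
    win = np.lib.stride_tricks.sliding_window_view(padded, (k, k))
    return win.all(axis=(2, 3)).astype(np.uint8) * 255

def connected_domains(bw):
    """4-connected component analysis: returns one list of (row, col)
    pixel coordinates per connected domain of the binarized image."""
    fg = bw > 0
    seen = np.zeros_like(fg, dtype=bool)
    domains = []
    h, w = fg.shape
    for sr in range(h):
        for sc in range(w):
            if fg[sr, sc] and not seen[sr, sc]:
                comp, q = [], deque([(sr, sc)])
                seen[sr, sc] = True
                while q:                      # breadth-first flood fill
                    r, c = q.popleft()
                    comp.append((r, c))
                    for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)):
                        if 0 <= nr < h and 0 <= nc < w and fg[nr, nc] and not seen[nr, nc]:
                            seen[nr, nc] = True
                            q.append((nr, nc))
                domains.append(comp)
    return domains
```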
C3: The terminal device 100 extracts the texture features from the regions of the N image blocks where the at least one first connected domain is located.
The following illustrates, by way of example, how the texture depth, texture width, wide texture density, and texture density among the texture features may each be determined.
1) Texture depth:
The terminal device 100 may determine the texture depth from the mean gray value of the regions where the at least one first connected domain is located in the N image blocks and the mean gray value of the N image blocks. For example, the texture depth may be determined by the following formula (1):
F1 = abs(M - M1)/M      Formula (1)
where F1 represents the texture depth, M1 represents the mean gray value of the regions where the at least one first connected domain is located in the N image blocks, and M represents the mean gray value of the N image blocks.
When extracting the texture depth of the grayscale image, the terminal device 100 may extract the texture depth of each of the N image blocks and take the average of the N per-block texture depths as the texture depth of the grayscale image. Take the first image block as an example, the first image block being any one of the N image blocks.
The texture depth of the first image block may be determined by the following formula (2):
E1 = abs(X - X1)/X      Formula (2)
where E1 represents the texture depth of the first image block, X represents the mean gray value of the first image block, and X1 represents the mean gray value of the region where the first connected domain is located in the first image block, consistent with formula (1). After the texture depth of each of the N image blocks has been determined by formula (2), the average of the N texture depths is determined.
When extracting the texture depth of the grayscale image, the terminal device 100 may instead treat the N image blocks as a whole: it directly determines the mean gray value of the regions where the first connected domains are located across the N blocks and the mean gray value of the N blocks, and then determines the texture depth of the grayscale image based on formula (1).
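Formulas (1)/(2) and the per-block averaging can be sketched as follows; the boolean `domain_mask` marking the first-connected-domain region is an assumed input format.

```python
import numpy as np

def texture_depth(block, domain_mask):
    """Formula (2): E1 = abs(X - X1)/X, with X the mean gray value of
    the block and X1 the mean gray value of its connected-domain
    (line) region."""
    X = float(block.mean())
    X1 = float(block[domain_mask].mean())
    return abs(X - X1) / X

def image_texture_depth(blocks, masks):
    """Image-level depth: average of the per-block depths over the N blocks."""
    return sum(texture_depth(b, m) for b, m in zip(blocks, masks)) / len(blocks)
```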
2) Texture width:
The terminal device 100 may determine the texture width from the outer-contour length and the area of a second connected domain among the at least one first connected domain, the second connected domain being the first connected domain with the longest outer contour or the largest area among the at least one first connected domain.
For example, when the second connected domain is a multiply connected domain, the texture width may be determined by the following formula (3) or formula (4):
F2 = F1 × S1/(L1 + L0)      Formula (3)
F2 = S1/(L1 + L0)      Formula (4)
where F2 represents the texture width, F1 the texture depth, S1 the area of the second connected domain, L1 the length of its outer contour, and L0 the sum of the lengths of its inner contours.
As shown in part (1) of FIG. 6, the white region is the second connected domain: L2 is the length of the outer ring of the white region, and L0 equals the length of its inner ring, i.e., the perimeter of the black ellipse enclosed by the white region. The area of the second connected domain is the area of the white region, i.e., of the annular region. As shown in part (2) of FIG. 6, the white region is the second connected domain: L2 is the length of its outer ring, and L0 equals the sum of the perimeters of the two black ellipses enclosed by the white region, i.e., the sum of L3 and L4.
When the second connected domain is simply connected, the texture width may be determined by the following formula (5) or formula (6):
F2 = F1 × S1/L1      Formula (5)
F2 = S1/L1      Formula (6)
where F2 represents the texture width, F1 the texture depth, S1 the area of the second connected domain, and L1 the length of its outer contour; for example, in part (3) of FIG. 6, S1 is the area of the white region and L1 is the perimeter of its outline.
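Formulas (3)–(6) collapse into one expression once L0 is taken as 0 for a simply connected second domain; the helper below is a sketch over scalar inputs, the contour lengths and area being assumed to come from the connected-domain analysis.

```python
def texture_width(F1, S1, L1, L0=0.0, weight_by_depth=True):
    """Formulas (3)-(6): F2 = F1 * S1/(L1 + L0), or S1/(L1 + L0) without
    the depth weighting; pass L0=0 for a simply connected second domain
    (formulas (5)/(6))."""
    width = S1 / (L1 + L0)
    return F1 * width if weight_by_depth else width
```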
3) Wide texture density:
The terminal device 100 may determine, from the at least one first connected domain, at least one third connected domain whose outer-contour length is greater than a preset threshold, and then determine the K image blocks among the N image blocks that contain a third connected domain. Further, for each of the K image blocks, the ratio of the sum of the areas of the third connected domains in that block to the total area of the N image blocks is determined, yielding K ratios; finally, the average of the K ratios multiplied by the texture depth is determined as the wide texture density, or the average of the K ratios is determined as the wide texture density.
For example, the wide texture density may be determined by the following formula (7) or formula (8):
F3 = F1 × S2/S      Formula (7)
F3 = S2/S      Formula (8)
where F3 represents the wide texture density and S2 the sum of the areas of the third connected domains contained in the K image blocks. For example, suppose 3 image blocks contain third connected domains, namely image block 1, image block 2, and image block 3, where image block 1 contains 2 third connected domains, image block 2 contains 1, and image block 3 contains 3; then S2 is the sum of the areas of the 2 third connected domains in image block 1, the 1 third connected domain in image block 2, and the 3 third connected domains in image block 3, i.e., the sum of the areas of the 6 third connected domains. S represents the total area of the N image blocks.
4) Texture density:
A second ratio is determined between the sum of the areas of the first connected domains comprised in the N image blocks and the total area of the N image blocks; the second ratio multiplied by the texture depth is then determined as the texture density, or the second ratio is determined as the texture density.
For example, the texture density may be determined by the following formula (9) or formula (10):
F4 = F1 × S3/S      Formula (9)
F4 = S3/S      Formula (10)
where F4 represents the texture density, S3 the sum of the areas of the first connected domains comprised in the N image blocks, and S the total area of the N image blocks. For example, if N is 4, with image blocks 1–4, where image block 1 contains 2 first connected domains, image block 2 contains 3, image block 3 contains 3, and image block 4 contains 1, then S3 is the sum of the areas of those first connected domains across the four blocks, i.e., the sum of the areas of the 9 first connected domains. S represents the total area of the N image blocks.
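The two densities of formulas (7)–(10) share the same ratio-of-areas shape and can be sketched together; the area lists are assumed to come from the connected-domain analysis above.

```python
def wide_texture_density(F1, third_domain_areas, S, weight_by_depth=True):
    """Formulas (7)/(8): F3 = F1 * S2/S or S2/S, with S2 the summed area
    of the third connected domains and S the total area of the N blocks."""
    ratio = sum(third_domain_areas) / S
    return F1 * ratio if weight_by_depth else ratio

def texture_density(F1, first_domain_areas, S, weight_by_depth=True):
    """Formulas (9)/(10): F4 = F1 * S3/S or S3/S, with S3 the summed
    area of all first connected domains."""
    ratio = sum(first_domain_areas) / S
    return F1 * ratio if weight_by_depth else ratio
```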
As an example, when the terminal device 100 does not segment the grayscale image, the texture features of the grayscale image may be determined as follows. Mean filtering is applied to the grayscale image, the filtered image is binarized, connected-domain analysis is performed on the binarized image to obtain at least one connected domain, and the texture features are then extracted from the regions where the at least one connected domain is located. For example, when determining the texture depth, the ratio of the mean gray value of the connected-domain positions in the grayscale image to the mean gray value of the grayscale image may be taken as the texture depth. When determining the texture width, it may be determined from the connected domain with the largest area or the longest outer contour among the at least one connected domain; for the specific method, see formula (3) or formula (4). When determining the wide texture density, connected domains whose outer-contour length is greater than a preset threshold may be identified from the at least one connected domain; the ratio of the sum of their areas to the area of the grayscale image is taken as the wide texture density, or that ratio multiplied by the texture depth is determined as the wide texture density. When determining the texture density, the ratio of the sum of the areas of the at least one connected domain to the area of the grayscale image may be determined as the texture density, or that ratio multiplied by the texture depth may be determined as the texture density.
As an example, when determining the roughness of the skin in the skin image to be processed according to the texture features, the terminal device 100 may use an ensemble learning algorithm model. The ensemble learning algorithm may be, for example, AdaBoost, Boosting, or bagging (bootstrap aggregating).
The ensemble learning algorithm model may be trained as follows:
Hand-back images are collected from a population spanning multiple age groups to form training samples. Multiple skin experts each determine the hand-back roughness of every person in the population, i.e., score the condition of the back of the hand, using for example a 100-point, 10-point, or 1-point scale. The average of the experts' scores for a given person serves as the label of that person's hand-back roughness value. The preset ensemble learning algorithm model is trained on the training samples and the corresponding hand-back roughness labels; the trained model can then serve, in the embodiments of this application, as the model for detecting the roughness of the skin in the skin image to be processed.
The applicant ran an experiment on a population of 150 people. Following the skin-expert scoring scheme, three human experts were selected to blind-score the hand-back roughness of the 150 people, and the average of the three experts' scores for each person was taken as that person's roughness score. Hand-back images of 150 people served as training samples, and 94 people served as validation samples for model verification. Using AdaBoost with cross-validation, the experiment obtained a test-set correlation coefficient of 0.88, preliminarily verifying the feasibility of the algorithm; the test results are shown in FIG. 7 and FIG. 8. FIG. 7 shows the test results with image-block segmentation and FIG. 8 the results without it. In FIG. 7 and FIG. 8, the x-axis represents the skin experts' scores on the test-set data and the y-axis the model's output. The lines in FIG. 7 and FIG. 8 are the ideal fits; the closer the black point set lies to the red line, the higher the correlation. The correlation test result of 0.88 for FIG. 7, versus 0.73 for FIG. 8, shows that segmenting into image blocks gives better test results than not segmenting.
Further, after determining the roughness of the back of the hand in the hand-back image, the terminal device 100 displays the roughness value (i.e., the score) on the display screen, and may also display care suggestions for the skin of the back of the hand and the like, as shown for example in FIG. 9.
The embodiments described above may be used in combination with one another or on their own.
In the embodiments provided above, the methods provided by the embodiments of this application are described from the perspective of the electronic device acting as the executing entity. To implement the functions of the methods provided by the above embodiments, the electronic device may include a hardware structure and/or software modules, implementing those functions in the form of a hardware structure, software modules, or a hardware structure plus software modules. Whether a given function is executed as a hardware structure, a software module, or a hardware structure plus a software module depends on the particular application and the design constraints of the technical solution.
Based on the same concept, FIG. 10 shows an electronic device 1000 provided by this application. Illustratively, the electronic device 1000 includes at least one processor 1010 and a memory 1020, and may further include a display screen 1030 and a camera 1040. The processor 1010 is coupled to the memory 1020, the display screen 1030, and the camera 1040; in the embodiments of this application, a coupling is an indirect coupling or communication connection between apparatuses, units, or modules, which may be electrical, mechanical, or of another form, and serves for information exchange between the apparatuses, units, or modules.
Specifically, the memory 1020 is configured to store program instructions, the camera 1040 is configured to capture images, and the display screen 1030 is configured to display a photographing preview interface when the camera 1040 starts shooting, the preview interface containing the images captured by the camera 1040. The display screen 1030 may also be used to display the user interfaces involved in the above embodiments, such as the user interface shown in FIG. 3, the interface shown in FIG. 9, and so on. The processor 1010 is configured to call and execute the program instructions stored in the memory 1020 to perform the steps of the skin roughness detection method shown in FIG. 2 above.
It should be understood that the electronic device 1000 may be used to implement the skin roughness detection method shown in FIG. 2 of the embodiments of this application; for the relevant features, reference may be made to the description above, which is not repeated here.
Those skilled in the art will clearly understand that the embodiments of this application may be implemented in hardware, in firmware, or in a combination thereof. When implemented in software, the functions above may be stored on a computer-readable medium or transmitted as one or more instructions or items of code on a computer-readable medium. Computer-readable media include computer storage media and communication media, where communication media include any medium that facilitates the transfer of a computer program from one place to another. A storage medium may be any available medium accessible to a computer. By way of example and not limitation, computer-readable media may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM) or other optical disc storage, magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. In addition, any connection may appropriately be termed a computer-readable medium. For example, if software is transmitted from a website, server, or other remote source using coaxial cable, optical fiber cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, optical fiber cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of the medium. As used in the embodiments of this application, disks and discs include compact discs (CDs), laser discs, optical discs, digital versatile discs (DVDs), floppy disks, and Blu-ray discs, where disks usually reproduce data magnetically and discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of protection of computer-readable media.
In short, the above is merely an embodiment of this application and is not intended to limit the scope of protection of this application. Any modification, equivalent replacement, improvement, or the like made in accordance with the disclosure of this application shall fall within the scope of protection of this application.

Claims (17)

  1. A skin roughness detection method, applied to an electronic device, comprising:
    acquiring a grayscale image of a skin image to be processed;
    extracting texture features from the grayscale image, the texture features comprising at least one of texture depth, texture width, wide texture density, and texture density;
    wherein the texture depth characterizes the depth of lines on the skin, the texture width characterizes the width of lines on the skin, the wide texture density characterizes the density in the skin of lines whose width reaches a preset threshold, and the texture density characterizes the density of lines in the skin; and
    determining the roughness of the skin in the skin image to be processed according to the texture features.
  2. The method according to claim 1, wherein extracting texture features from the grayscale image comprises:
    dividing the grayscale image into K image blocks, obtaining from the K image blocks N image blocks whose mean gray value lies within a preset gray range, and extracting the texture features from the N image blocks, K and N both being positive integers and N being less than or equal to K; or,
    dividing the grayscale image into K image blocks, sorting the K image blocks by their mean gray value, obtaining N image blocks whose rank lies within a preset range, and extracting the texture features from the N image blocks.
  3. The method according to claim 2, wherein extracting texture features from the N image blocks comprises:
    binarizing each of the N image blocks to obtain N binarized images;
    performing connected-domain analysis on the N binarized images to obtain at least one first connected domain, the at least one first connected domain indicating the position of line regions of the skin in the N image blocks; and
    extracting the texture features from the regions of the N image blocks where the at least one first connected domain is located.
  4. The method according to claim 3, wherein binarizing each of the N image blocks to obtain N binarized images comprises:
    filtering each of the N image blocks, and binarizing the filtered N image blocks to obtain the N binarized images.
  5. The method according to claim 3 or 4, wherein performing connected-domain analysis on the N binarized images to obtain at least one first connected domain comprises:
    performing erosion and/or dilation on the N binarized images, and performing connected-domain analysis on the eroded and/or dilated N binarized images to obtain the at least one first connected domain.
  6. The method according to any one of claims 3-5, wherein the texture depth is determined from the mean gray value of the regions where the at least one first connected domain is located in the N image blocks and the mean gray value of the N image blocks.
  7. The method according to claim 6, wherein the texture depth satisfies the following formula:
    F1 = abs(M - M1)/M;
    wherein F1 represents the texture depth, M1 represents the mean gray value of the regions where the at least one first connected domain is located in the N image blocks, and M represents the mean gray value of the N image blocks.
  8. The method according to any one of claims 3-7, wherein extracting the texture width from the regions of the N image blocks where the at least one first connected domain is located comprises:
    determining the texture width according to the outer-contour length and area of a second connected domain in the at least one first connected domain;
    wherein the second connected domain is the first connected domain having the longest outer-contour length or the largest area among the at least one first connected domain.
  9. The method according to claim 8, wherein determining the texture width according to the outer contour length and the area of the second connected domain among the at least one first connected domain comprises:
    when the second connected domain is a multiply connected domain, the texture width satisfies the following formula:
    F2 = F1 × S1/(L1 + L0); or F2 = S1/(L1 + L0);
    where F2 denotes the texture width, F1 denotes the texture depth, S1 denotes the area of the second connected domain, L1 denotes the outer contour length of the second connected domain, and L0 denotes the sum of the lengths of the inner contours of the second connected domain;
    when the second connected domain is a simply connected domain, the texture width satisfies the following formula:
    F2 = F1 × S1/L1; or F2 = S1/L1;
    where F2 denotes the texture width, F1 denotes the texture depth, S1 denotes the area of the second connected domain, and L1 denotes the outer contour length of the second connected domain.
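The texture-width formulas of claim 9 reduce to an area-over-perimeter ratio, optionally scaled by the texture depth. A minimal sketch, with made-up values for S1, L1, L0 and F1:

```python
def texture_width(S1, L1, L0=0.0, F1=None):
    """Sketch of F2 = S1/(L1 + L0), optionally scaled by the texture
    depth F1.  For a simply connected domain (no inner contours) L0
    is 0, which gives the F2 = S1/L1 variant of claim 9."""
    ratio = S1 / (L1 + L0)
    return ratio if F1 is None else F1 * ratio

# Simply connected domain: area 50, outer contour length 25.
w_simple = texture_width(S1=50.0, L1=25.0)               # -> 2.0
# Multiply connected domain with inner contours totalling length 5,
# scaled by a hypothetical texture depth F1 = 0.3.
w_multi = texture_width(S1=50.0, L1=25.0, L0=5.0, F1=0.3)  # -> 0.5
```

Intuitively, a wide texture line accumulates area faster than contour length, so the ratio grows with width.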
  10. The method according to any one of claims 3-9, wherein extracting the wide texture density from the region occupied by the at least one first connected domain in the N image blocks comprises:
    determining, from the at least one first connected domain, at least one third connected domain whose outer contour length is greater than a preset threshold;
    determining, among the N image blocks, K image blocks that contain the at least one third connected domain;
    determining a first ratio between the sum of the areas of the third connected domains included in the K image blocks and the sum of the areas of the N image blocks;
    determining the product of the first ratio and the texture depth as the wide texture density, or determining the first ratio as the wide texture density.
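As a rough illustration of the first ratio in claim 10 (all numbers are invented): the areas of the third connected domains found in the K blocks are summed and divided by the total area of the N blocks, with an optional texture-depth scaling:

```python
def wide_texture_density(third_domain_areas, total_blocks_area, F1=None):
    """Sketch of claim 10's wide texture density.

    third_domain_areas: areas of the third connected domains (those with
                        outer contour length above the preset threshold)
                        found in the K image blocks.
    total_blocks_area : summed area of all N image blocks.
    """
    ratio = sum(third_domain_areas) / total_blocks_area
    return ratio if F1 is None else F1 * ratio

# Two wide-texture domains of area 120 and 80 inside N blocks
# totalling 10,000 pixels.
d = wide_texture_density([120.0, 80.0], 10_000.0)            # -> 0.02
d_scaled = wide_texture_density([120.0, 80.0], 10_000.0, F1=0.5)  # -> 0.01
```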
  11. The method according to any one of claims 3-10, wherein extracting the texture density from the region occupied by the at least one first connected domain in the N image blocks comprises:
    determining a second ratio between the sum of the areas of the first connected domains included in the N image blocks and the sum of the areas of the N image blocks;
    determining the product of the second ratio and the texture depth as the texture density, or determining the second ratio as the texture density.
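The second ratio of claim 11 is simply the fraction of the blocks' total area covered by first connected domains. Sketched with a hypothetical boolean coverage mask over the N blocks:

```python
import numpy as np

def texture_density(first_domain_mask, F1=None):
    """Sketch of claim 11: the second ratio is the covered fraction of
    the N blocks' total area, optionally scaled by the texture depth F1."""
    ratio = first_domain_mask.mean()   # covered pixels / total pixels
    return ratio if F1 is None else F1 * ratio

# 20 of 100 pixels belong to first connected domains.
mask = np.zeros((10, 10), dtype=bool)
mask[:2, :] = True
d = texture_density(mask)              # -> 0.2
```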
  12. The method according to any one of claims 1-11, wherein extracting the texture features from the grayscale image comprises:
    performing histogram equalization on the grayscale image to obtain an equalized image, and extracting the texture features from the equalized image.
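Histogram equalization stretches a low-contrast grayscale image over the full intensity range, which makes faint skin texture easier to extract. A minimal NumPy sketch of the standard 8-bit algorithm (the patent does not specify an implementation; constant images are not handled):

```python
import numpy as np

def equalize_hist(gray):
    """Minimal histogram equalization for an 8-bit grayscale image:
    map each gray level through the normalized cumulative histogram."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()       # first non-empty bin
    lut = np.round(np.clip(cdf - cdf_min, 0, None)
                   / (cdf[-1] - cdf_min) * 255).astype(np.uint8)
    return lut[gray]

# A low-contrast ramp (values 100..109) gets stretched to 0..255.
img = np.tile(np.arange(100, 110, dtype=np.uint8), (8, 1))
eq = equalize_hist(img)
```

After equalization the darkest present level maps to 0 and the brightest to 255, so subsequent thresholding of texture lines works on a normalized contrast scale.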
  13. The method according to any one of claims 1-12, wherein determining the roughness of the skin in the skin image to be processed according to the texture features comprises:
    determining the roughness of the skin in the skin image to be processed from the texture features by using an ensemble learning algorithm model.
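The patent does not disclose which ensemble model is used. Purely as a toy stand-in, the idea of mapping the texture features (depth, width, densities) to a roughness score with an ensemble can be sketched as bagging of least-squares regressors; the feature layout and the synthetic target are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_bagged_linear(X, y, n_models=10):
    """Toy ensemble: fit n_models least-squares regressors on
    bootstrap resamples of (X, y)."""
    n = len(X)
    Xb = np.c_[X, np.ones(n)]              # append a bias column
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, n, n)        # bootstrap sample with replacement
        w, *_ = np.linalg.lstsq(Xb[idx], y[idx], rcond=None)
        models.append(w)
    return models

def predict(models, X):
    """Average the member predictions (the ensemble output)."""
    Xb = np.c_[X, np.ones(len(X))]
    return np.mean([Xb @ w for w in models], axis=0)

# Hypothetical features: (texture depth, texture width, wide density, density).
X = rng.random((40, 4))
y = X @ np.array([0.5, 0.2, 0.2, 0.1])     # synthetic "roughness" target
models = fit_bagged_linear(X, y)
pred = predict(models, X)
```

In practice a trained gradient-boosting or random-forest regressor would fill the same role: features in, a single roughness score out.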
  14. A terminal device, comprising a processor and a memory, wherein the processor is coupled with the memory;
    the memory is configured to store program instructions;
    the processor is configured to read the program instructions stored in the memory to implement the method according to any one of claims 1 to 13.
  15. A computer storage medium, wherein the computer storage medium stores program instructions which, when run on an electronic device, cause the electronic device to execute the method according to any one of claims 1 to 13.
  16. A computer program product which, when run on an electronic device, causes the electronic device to execute the method according to any one of claims 1 to 13.
  17. A chip, wherein the chip is coupled with a memory in an electronic device, and when the chip reads and executes the program instructions stored in the memory, the electronic device executes the method according to any one of claims 1 to 13.
PCT/CN2020/115973 2019-09-18 2020-09-17 Skin roughness measurement method and electronic device WO2021052436A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910882738.8A CN112603259B (en) 2019-09-18 2019-09-18 Skin roughness detection method and electronic equipment
CN201910882738.8 2019-09-18

Publications (1)

Publication Number Publication Date
WO2021052436A1 true WO2021052436A1 (en) 2021-03-25

Family ID: 74883387

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/115973 WO2021052436A1 (en) 2019-09-18 2020-09-17 Skin roughness measurement method and electronic device

Country Status (2)

Country Link
CN (1) CN112603259B (en)
WO (1) WO2021052436A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117392121B (en) * 2023-12-07 2024-03-08 西安鼎福十方网络智能科技有限公司 Percutaneous drug delivery treatment control method and system based on image recognition

Citations (6)

Publication number Priority date Publication date Assignee Title
JP2004230117A (en) * 2003-01-30 2004-08-19 Pola Chem Ind Inc Method for discriminating wrinkle by replica image
CN102036607A (en) * 2008-05-23 2011-04-27 宝丽化学工业有限公司 Method for automatically judging skin texture and/or crease
CN105844236A (en) * 2016-03-22 2016-08-10 重庆医科大学 Skin image information processing-based age testing method
CN106384075A (en) * 2016-03-11 2017-02-08 株式会社爱茉莉太平洋 A skin block mass based skin texture evaluating device and evaluating method
CN109299632A (en) * 2017-07-25 2019-02-01 上海中科顶信医学影像科技有限公司 Skin detecting method, system, equipment and storage medium
CN110008925A (en) * 2019-04-15 2019-07-12 中国医学科学院皮肤病医院 A kind of skin automatic testing method based on integrated study

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
JP4909686B2 (en) * 2006-09-08 2012-04-04 学校法人東京理科大学 Epidermal tissue quantification apparatus and program
US10281267B2 (en) * 2014-11-10 2019-05-07 Shiseido Company, Ltd. Method for evaluating flow of skin, method for examining skin glow improvers, and skin glow improver
CN107157447B (en) * 2017-05-15 2020-03-20 北京工商大学 Skin surface roughness detection method based on image RGB color space
KR102049040B1 (en) * 2017-08-07 2019-11-27 (재)예수병원유지재단 Apparatus for diagnosing biometric roughness
CN108154510A (en) * 2018-01-17 2018-06-12 深圳市亿图视觉自动化技术有限公司 Method for detecting surface defects of products, device and computer readable storage medium
CN109325468B (en) * 2018-10-18 2022-06-03 广州智颜科技有限公司 Image processing method and device, computer equipment and storage medium
CN110210448B (en) * 2019-06-13 2022-09-13 广州纳丽生物科技有限公司 Intelligent face skin aging degree identification and evaluation method


Also Published As

Publication number Publication date
CN112603259A (en) 2021-04-06
CN112603259B (en) 2022-04-19


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20866410; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20866410; Country of ref document: EP; Kind code of ref document: A1)