WO2021052436A1 - Skin roughness measurement method and electronic device - Google Patents

Skin roughness measurement method and electronic device

Info

Publication number
WO2021052436A1
WO2021052436A1 (PCT/CN2020/115973)
Authority
WO
WIPO (PCT)
Prior art keywords
texture
connected domain
image
image blocks
skin
Prior art date
Application number
PCT/CN2020/115973
Other languages
English (en)
Chinese (zh)
Inventor
于野
胡宏伟
董辰
郜文美
姚烨
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2021052436A1


Classifications

    • A61B 5/00 (Measuring for diagnostic purposes; Identification of persons)
    • A61B 5/0059 (Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence)
    • A61B 5/0077 (Devices for viewing the surface of the body, e.g. camera, magnifying lens)
    • A61B 5/441 (Skin evaluation, e.g. for skin disorder diagnosis)
    • A61B 5/442 (Evaluating skin mechanical properties, e.g. elasticity, hardness, texture, wrinkle assessment)
    • G01B 11/30 (Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces)
    • G06T 5/40 (Image enhancement or restoration using histogram techniques)
    • G06T 5/70 (Denoising; Smoothing)
    • G06T 7/0012 (Biomedical image inspection)
    • G06T 7/11 (Region-based segmentation)
    • G06T 7/136 (Segmentation; Edge detection involving thresholding)
    • G06T 7/40 (Analysis of texture)
    • G06T 2207/10016 (Video; Image sequence)
    • G06T 2207/10024 (Color image)
    • G06T 2207/20024 (Filtering details)
    • G06T 2207/20036 (Morphological image processing)
    • G06T 2207/30088 (Skin; Dermal)
    • G06T 2207/30196 (Human being; Person)

Definitions

  • the embodiments of the present application relate to the field of image processing technology, and in particular, to a skin roughness detection method and electronic equipment.
  • Skin roughness, as an important indicator in skin texture analysis, can reflect the health of human body functions to a certain extent.
  • the embodiments of the present application provide a skin roughness detection method and electronic device, which are used to detect the roughness of the skin through the electronic device.
  • an embodiment of the present application provides a method for detecting skin roughness, which is applied to an electronic device or a chip in an electronic device.
  • the method includes: acquiring a grayscale image of a skin image to be processed;
  • the texture features are extracted from the grayscale image, and the texture features include at least one of texture depth, texture width, wide texture density, and texture density; the texture depth is used to characterize the depth of the lines on the skin, the texture width is used to characterize the width of the lines on the skin, the wide texture density is used to characterize the density of the lines whose width reaches a preset threshold, and the texture density is used to characterize the density of the lines in the skin; the roughness of the skin in the skin image to be processed is then determined according to the texture features.
  • the electronic device determines the roughness of the skin based on the texture features rather than by a simple parameter threshold; this is easy to use and provides strong stability and high accuracy.
  • the extraction of texture features from the grayscale image can be achieved by dividing the grayscale image into K image blocks, obtaining from the K image blocks N image blocks whose gray average values fall within a preset grayscale range, and extracting the texture features from the N image blocks, where K and N are both positive integers and N is less than or equal to K; alternatively, the grayscale image is divided into K image blocks, the K image blocks are sorted by gray average value, N image blocks ranked within a preset ranking range are obtained, and the texture features are extracted from the N image blocks.
  • several image blocks are selected from multiple image blocks to extract texture features, which reduces the influence of ambient light on the extraction of texture features and improves accuracy.
  • the extraction of texture features from the N image blocks can be achieved by performing binarization processing on the N image blocks respectively to obtain N binarized images; performing connected domain analysis on the N binarized images to obtain at least one first connected domain, where the at least one first connected domain is used to indicate the position of the skin texture area in the N image blocks; and extracting the texture features from the area where the at least one first connected domain is located in the N image blocks.
  • performing binarization processing on the N image blocks to obtain N binarized images respectively includes: filtering the N image blocks respectively, and performing binarization processing on the filtered N image blocks to obtain the N binarized images.
  • filtering is performed before binarization, which can smooth and denoise the image.
  • performing connected domain analysis on the N binarized images to obtain at least one first connected domain includes: performing erosion and/or dilation processing on the N binarized images, and performing the connected domain analysis on the N binarized images after the erosion and/or dilation processing to obtain the at least one first connected domain.
  • By performing erosion and/or dilation processing on the binarized images, the above design can make the determined skin texture clearer and more accurate.
  • the texture depth may be determined according to the average gray value of the area where at least one first connected domain in the N image blocks is located and the average gray value of the N image blocks.
  • the texture depth can meet the requirements of the following formula: F1 = M1 / M, where F1 represents the texture depth, M1 represents the average gray value of the area where the at least one first connected domain in the N image blocks is located, and M represents the average gray value of the N image blocks.
  • extracting the texture width from the area where the at least one first connected domain in the N image blocks is located includes: determining the texture width according to the length and area of the outer contour of the second connected domain in the at least one first connected domain, where the second connected domain is the first connected domain with the longest outer contour length or the largest area among the at least one first connected domain.
  • determining the texture width according to the length and area of the outer contour of the second connected domain in the at least one first connected domain can be implemented in the following manner:
  • the texture width is determined by the formula F2 = F1 × 2S1 / (L1 + L0), or F2 = 2S1 / (L1 + L0), where F2 represents the texture width, F1 represents the texture depth, S1 represents the area of the second connected domain, L1 represents the length of the outer contour of the second connected domain, and L0 represents the sum of the lengths of the inner contours of the second connected domain; or,
  • the texture width is determined by the formula F2 = F1 × 2S1 / L1, or F2 = 2S1 / L1, where F2 represents the texture width, F1 represents the texture depth, S1 represents the area of the second connected domain, and L1 represents the length of the outer contour of the second connected domain.
  • the above design provides a simple and effective way to extract the texture width.
  • extracting the wide texture density from the area where the at least one first connected domain in the N image blocks is located can be implemented in the following manner: determining, from the at least one first connected domain, at least one third connected domain whose outer contour length is greater than a preset threshold; determining a first ratio between the area sum of the third connected domains included in the N image blocks and the area sum of the N image blocks; and multiplying the first ratio by the texture depth to determine the wide texture density, or determining the first ratio as the wide texture density.
  • extracting the texture density from the area where the at least one first connected domain in the N image blocks is located may be implemented in the following manner: determining a second ratio between the area sum of the first connected domains included in the N image blocks and the area sum of the N image blocks; and multiplying the second ratio by the texture depth to determine the texture density, or determining the second ratio as the texture density.
  • the extraction of the texture features in the grayscale image may be achieved in the following manner: performing histogram equalization processing on the grayscale image to obtain an equalized image, and extracting the texture features from the equalized image; this can prevent uneven illumination from affecting the detection result.
  • the determination of the roughness of the skin in the skin image to be processed according to the texture features can be implemented in the following manner: an ensemble learning algorithm model is used to determine, according to the texture features, the roughness of the skin in the skin image to be processed.
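  • The patent does not name a specific ensemble model; as a minimal illustration only (assuming scikit-learn, with toy feature vectors and labels that are not from the patent), the four texture features could be mapped to a roughness score as follows:

```python
# Hypothetical sketch: an ensemble regressor mapping the four texture
# features (F1 depth, F2 width, F3 wide texture density, F4 density) to a
# roughness score. Model choice and training data are illustrative only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

X_train = np.array([[0.62, 3.1, 0.04, 0.18],   # toy feature rows
                    [0.85, 5.4, 0.11, 0.31],
                    [0.71, 4.2, 0.07, 0.22]])
y_train = np.array([35.0, 78.0, 52.0])         # toy roughness labels

model = GradientBoostingRegressor().fit(X_train, y_train)
print(model.predict(np.array([[0.70, 4.0, 0.06, 0.20]])))
```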
  • an embodiment of the present application provides a skin roughness detection device, which includes units respectively used to execute the method according to the first aspect or any one of the designs of the first aspect.
  • an embodiment of the present application provides an electronic device including a processor and a memory, where the processor is coupled with the memory; the memory is used to store program instructions; and the processor is used to read the program instructions stored in the memory to implement the method of the first aspect and any of its possible designs.
  • a computer storage medium provided by an embodiment of the present application stores program instructions; when the program instructions run on an electronic device, the electronic device executes the method of the first aspect and any of its possible designs.
  • a computer program product provided by an embodiment of the present application, when run on an electronic device, causes the electronic device to execute the method of the first aspect and any of its possible designs.
  • the sixth aspect is a chip provided by an embodiment of the present application; the chip is coupled with a memory in an electronic device and executes the method of the first aspect and any of its possible designs.
  • Coupled in the embodiments of the present application means that two components are directly or indirectly combined with each other.
  • FIG. 1 is a schematic structural diagram of a terminal device 100 in an embodiment of the application
  • FIG. 2 is a schematic flow chart of a method for detecting the roughness of the back of the hand in an embodiment of the application
  • FIG. 3 is a schematic diagram of a preview user interface in an embodiment of the application.
  • FIG. 4 is a schematic diagram of image blocks used for extracting texture features in an image of the back of the hand in an embodiment of the application;
  • Figure 5 is a schematic diagram of binarization processing in an embodiment of the application.
  • Figure 6 is a schematic diagram of connected domains in an embodiment of this application.
  • FIG. 7 and FIG. 8 are schematic diagrams of detection results of the method provided by the embodiments of this application;
  • Figure 9 is a schematic diagram of a skin roughness detection report in an embodiment of the application.
  • FIG. 10 is a schematic diagram of an electronic device 1000 provided by an embodiment of the application.
  • the embodiment of the present application proposes a skin roughness detection solution, which is suitable for electronic equipment, and the electronic equipment may be a terminal device.
  • the skin roughness detection function provided in the embodiments of the present application may be integrated in one or more applications of the terminal device, for example, integrated in a camera application.
  • the terminal device starts the camera application and displays a viewfinder interface.
  • the viewfinder interface may include a control. When the control is activated, the terminal device can start the skin roughness detection function provided by the embodiment of the present application.
  • the skin roughness detection function provided by the embodiments of the present application can also be integrated in an application dedicated to skin detection in a terminal device.
  • the application for skin detection can not only realize the skin roughness detection function, but also can realize the detection of wrinkles, pores, blackheads, etc. of the facial skin.
  • Skin roughness can be neck roughness, facial roughness, or roughness of the back of the hand.
  • the skin detection application can also provide the user with a detection result report.
  • the detection result report can include, but is not limited to, scores of various features of the back of the hand, a comprehensive analysis of the back of the hand, and the like, and corresponding care or treatment suggestions can also be given according to the scores of the back of the hand. It is understandable that the detection result report can be presented to the user through the user interface.
  • the terminal device may be a portable terminal device containing functions such as a personal digital assistant and/or a music player, for example, a mobile phone, a tablet computer, a wearable device with a wireless communication function (such as a smart watch), a vehicle-mounted device, etc.
  • Examples of portable terminal devices include, but are not limited to, portable terminal devices running various operating systems.
  • the aforementioned portable terminal device may also be, for example, a laptop computer with a touch-sensitive surface (such as a touch panel). It should also be understood that in some other embodiments of the present application, the aforementioned terminal device may also be a desktop computer with a touch-sensitive surface (such as a touch panel).
  • the terminal device may also have algorithmic computing capabilities (capable of running the skin roughness detection algorithm provided in the embodiments of the present application) and communication functions, without the need for image acquisition functions.
  • a terminal device receives an image sent by another device, and then runs the skin roughness detection algorithm provided in this embodiment of the application to detect the roughness of the skin in the image.
  • The following description takes a terminal device that itself has image collection and computing functions as an example.
  • FIG. 1 shows a schematic structural diagram of a possible terminal device 100.
  • the terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 2, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, and so on.
  • the terminal device 100 in the embodiment of the present application may further include an antenna 1, a mobile communication module 150, a subscriber identification module (SIM) card interface 195, and so on.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU).
  • the different processing units may be independent devices or integrated in one or more processors.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in the processor 110 may be a cache memory.
  • The memory can store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
  • the processor 110 may run the skin roughness detection algorithm provided in the embodiment of the present application to detect the roughness of the skin on the image.
  • the processor 110 may further include one or more interfaces.
  • the interface may be a universal serial bus (USB) interface 130.
  • the interface can also be an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, etc.
  • In the embodiments of the present application, different modules of the terminal device 100 may be connected through interfaces, so that the terminal device 100 can implement different functions, for example, taking and processing photographs. It should be noted that the embodiments of the present application do not limit the connection modes of the interfaces in the terminal device 100.
  • the USB interface 130 is an interface that complies with the USB standard specification.
  • the USB interface 130 may include a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the terminal device 100, and can also be used to transfer data between the terminal device 100 and peripheral devices. It can also be used to connect earphones and play audio through earphones. This interface can also be used to connect to other terminal devices, such as AR devices.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive the wireless charging input through the wireless charging coil of the terminal device 100. While the charging management module 140 charges the battery 142, it can also supply power to the terminal device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the power management module 141 may also be provided in the processor 110.
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the terminal device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the terminal device 100.
  • the mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like.
  • the mobile communication module 150 can receive electromagnetic waves by the antenna 1, filter, amplify, etc. the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic wave radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After the low-frequency baseband signal is processed by the baseband processor, it is passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 194.
  • the modem processor may be an independent device. In other embodiments, the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied to the terminal device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic wave signals via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 may also receive the signal to be sent from the processor 110, perform frequency modulation, amplify it, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the antenna 1 of the terminal device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the terminal device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the terminal device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, connected to the display 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, and the like.
  • the display screen 194 includes a display panel.
  • the display panel can adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the terminal device 100 may include one or N display screens 194, and N is a positive integer greater than one.
  • the terminal device 100 can implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, and an application processor.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing and is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193.
  • the camera 193 is used to capture still images or videos.
  • the camera 193 may include photosensitive elements such as a lens group and an image sensor, where the lens group includes a plurality of lenses (convex or concave) for collecting the light signals reflected by the object to be photographed (such as the back of a hand) and passing the collected light signals to the image sensor.
  • the image sensor generates an image of the object to be photographed (such as an image of the back of the hand) according to the light signal.
  • the camera 193 can send the image of the back of the hand to the processor 110, and the processor 110 runs the skin roughness detection algorithm provided by the embodiment of the present application to detect the roughness of the back of the hand in the image of the back of the hand.
  • the display screen 194 may display the detection report of the roughness of the back of the hand.
  • the camera 193 shown in FIG. 1 may include 1 to N cameras; the number of cameras is not limited in this application.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the terminal device 100 selects the frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the terminal device 100 may support one or more video codecs. In this way, the terminal device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
  • The NPU is a neural-network (NN) computing processor. With the NPU, intelligent cognition applications of the terminal device 100 can be realized, such as image recognition, face recognition, speech recognition, text understanding, and so on.
  • the external memory interface 120 may be used to connect an external memory card (for example, a Micro SD card) to expand the storage capacity of the terminal device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the terminal device 100 by running instructions stored in the internal memory 121.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, at least one application program (such as a camera application, a skin detection application, etc.) required by a function, and the like.
  • the data storage area can store data created during the use of the terminal device 100 (for example, images captured by a camera, etc.) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the internal memory 121 may also store the code of the skin roughness detection algorithm provided in the embodiment of the present application.
  • the code of the skin roughness detection algorithm stored in the internal memory 121 is executed by the processor 110, the skin roughness detection function is realized.
  • the code of the skin roughness detection algorithm provided in the embodiment of the present application can also be stored in an external memory.
  • the processor 110 may run the code of the skin roughness detection algorithm stored in the external memory through the external memory interface 120 to implement the corresponding skin roughness detection function.
  • the terminal device 100 can implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, an application processor, and the like. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • the speaker 170A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the terminal device 100 can listen to music or listen to a hands-free call through the speaker 170A.
  • the receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • when the terminal device 100 answers a call or a voice message, it can receive the voice by bringing the receiver 170B close to the human ear.
  • the microphone 170C, also called a "mic" or "mike", is used to convert sound signals into electrical signals.
  • the user can make a sound by bringing the mouth close to the microphone 170C, inputting the sound signal into the microphone 170C.
  • the terminal device 100 may be provided with at least one microphone 170C.
  • the terminal device 100 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals.
  • the terminal device 100 may also be provided with three, four or more microphones 170C to realize sound signal collection, noise reduction, sound source identification, and directional recording functions.
  • the earphone interface 170D is used to connect wired earphones.
  • the earphone interface 170D can be a USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, a cellular telecommunications industry association of the USA (CTIA) standard interface, etc.
  • the sensor module 180 includes an ambient light sensor 180L.
  • the sensor module 180 may also include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, a bone conduction sensor 180M, and so on.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be provided on the display screen 194.
  • the capacitive pressure sensor may include at least two parallel plates with conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the terminal device 100 determines the strength of the pressure according to the change in capacitance.
  • the terminal device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the terminal device 100 may also calculate the touched position based on the detection signal of the pressure sensor 180A.
  • touch operations that act on the same touch position but have different touch operation strengths may correspond to different operation instructions. For example, when a touch operation whose intensity of the touch operation is less than the first pressure threshold is applied to the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
  • the gyro sensor 180B may be used to determine the movement posture of the terminal device 100.
  • the air pressure sensor 180C is used to measure air pressure.
  • the magnetic sensor 180D includes a Hall sensor.
  • the terminal device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip holster.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the terminal device 100 in various directions (generally three axes).
  • The distance sensor 180F is used to measure distance.
  • the terminal device 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the terminal device 100 may use the distance sensor 180F to measure the distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode.
  • the ambient light sensor 180L is used to sense the brightness of the ambient light.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the terminal device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
  • the temperature sensor 180J is used to detect temperature.
  • The touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194, and the touch screen is composed of the touch sensor 180K and the display screen 194, which is also called a “touch screen”.
  • the touch sensor 180K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the terminal device 100, which is different from the position of the display screen 194.
  • the button 190 may include a power-on button, a volume button, and the like.
  • the button 190 may be a mechanical button. It can also be a touch button.
  • the terminal device 100 may receive key input, and generate key signal input related to user settings and function control of the terminal device 100.
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations that act on different applications can correspond to different vibration feedback effects.
  • For touch operations acting on different areas of the display screen 194, the motor 191 can also produce different vibration feedback effects.
  • Different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used to connect to the SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact and separation with the terminal device 100.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the terminal device 100.
  • the terminal device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • At least one refers to one or more
  • multiple refers to two or more.
  • "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B can mean: A exists alone, both A and B exist, or B exists alone, where A and B can be singular or plural.
  • the character “/” generally indicates that the associated objects before and after are in an “or” relationship.
  • "At least one of the following items" or similar expressions refers to any combination of these items, including a single item or any combination of multiple items. For example, at least one of a, b, or c can mean: a, b, c, a and b, a and c, b and c, or a, b, and c, where a, b, and c can each be single or multiple.
  • words such as "first" and "second" are used only for the purpose of distinguishing the description, and cannot be understood as indicating or implying relative importance, nor as indicating or implying an order.
  • the simply connected domain involved in the embodiments of this application refers to a region D such that the interior of any simple closed curve in D also belongs to D; in other words, the area enclosed by any closed curve in D contains only points of D. More plainly, a simply connected domain is a region without "holes".
  • A multiply connected domain refers to a region B on the complex plane in which some simple closed curve can be drawn whose interior does not entirely belong to B.
  • the following takes the roughness detection of the back of the hand as an example for description.
  • the method for detecting the roughness of the back of the hand may be executed by the terminal device 100, for example, by the processor 110 in the terminal device 100.
  • detecting the roughness of the back of the hand through the terminal device 100 shown in FIG. 1 may be the following process:
  • S201: The terminal device 100 acquires a grayscale image of the image of the back of the hand to be processed.
  • S201 can be implemented through the following two steps:
  • A1: obtain the image of the back of the hand to be processed.
  • the function of detecting the roughness of the back of the hand is integrated into the application dedicated to skin detection.
  • the application dedicated to skin detection is called "skin measurement application".
  • the skin measurement application may integrate only the skin roughness detection function.
  • Alternatively, facial skin detection can also be integrated, such as blackhead detection, pore detection, erythema detection, and so on.
  • In the embodiment of the present application, the image of the back of the hand is acquired through the skin measurement application, and the skin detection method provided in the embodiment of the present application is executed based on the image of the back of the hand. As shown in FIG. 3, the display screen 194 of the terminal device 100 displays an icon 300 of the skin measurement application.
  • the terminal device 100 detects the operation on the icon 300, and in response to the operation on the icon 300, displays the user interface 310 of the skin measurement application on the display screen 194.
  • the user interface 310 of the skin measurement application includes a detection button 311.
  • the terminal device 100 detects the operation of the detection button 311.
  • the camera 193 is turned on, and the display screen 194 displays the photographing preview interface 320.
  • the photographing preview interface 320 is used to display the images collected by the camera 193.
  • the photographing preview interface 320 may include a preview area 321, and the preview area 321 is used to display the images collected by the camera 193. It should be understood that the image collected by the camera 193 may be an image of the back of the user's hand.
  • the camera 193 may be a front camera of the terminal device 100 or a rear camera of the terminal device 100.
  • In order to improve picture quality, the camera 193 may be the rear camera of the terminal device 100 when the resolution of the front camera is lower than that of the rear camera.
  • In some embodiments, when the ambient light meets the photographing conditions, the terminal device 100 automatically photographs the image collected by the camera 193 to obtain the image of the back of the hand.
  • the detection button 311 in the embodiment of the present application may also be referred to as a photographing button, or other names, and the embodiment of the present application does not limit the name of the detection button 311.
  • the image of the back of the hand may also be an image already stored in the terminal device 100, for example, stored in the internal memory 121, so that the terminal device 100 obtains the image of the back of the hand from the internal memory 121.
  • it is stored in an external memory, so that the terminal device 100 obtains the image of the back of the hand from the external memory through the external memory interface 120.
  • A2: after acquiring the image of the back of the hand, the terminal device 100 converts the image of the back of the hand into a grayscale image.
  • For example, the image of the back of the hand collected by the terminal device 100 through the camera is a color image, and in A2 the color image is converted into a grayscale image.
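  • As a minimal illustration of S201 (assuming OpenCV; the file name below is hypothetical), A1 and A2 could look as follows:

```python
# Minimal sketch of S201: load a captured color image of the back of the
# hand (A1) and convert it to a grayscale image (A2). OpenCV is assumed;
# "hand_back.jpg" is a hypothetical file name.
import cv2

bgr = cv2.imread("hand_back.jpg")             # A1: image to be processed
gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)  # A2: color -> grayscale
```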
  • S202: The terminal device 100 extracts texture features from the grayscale image, where the texture features include at least one of texture depth, texture width, wide texture density, and texture density;
  • the texture depth is used to characterize the depth of the lines on the back of the hand
  • the texture width is used to characterize the width of the lines on the back of the hand
  • the wide texture density is used to characterize the density of the lines on the back of the hand whose line width reaches a preset threshold
  • the texture density is used to characterize the density of the lines in the back of the hand.
  • S203: The terminal device 100 determines the roughness of the back of the hand in the image of the back of the hand to be processed according to the texture features.
  • When the terminal device 100 extracts texture features from a grayscale image, it may first preprocess the grayscale image and then extract the texture features from the preprocessed image. The preprocessing can be, for example, histogram equalization, which can prevent uneven illumination from affecting the extraction of texture features.
  • the grayscale image can be enlarged, reduced, or segmented.
  • the grayscale image can be divided into multiple image blocks, and several image blocks with more obvious texture features are then selected from the multiple image blocks for the subsequent extraction of texture features.
  • the process of preprocessing the grayscale image is as follows:
  • the terminal device 100 performs histogram equalization processing on the gray image to obtain an equalized image.
  • the terminal device 100 divides the equalized image into K image blocks, for example, into 10×10 image blocks.
  • the terminal device 100 sorts the average gray values of the K image blocks, and obtains the N image blocks ranked within a preset ranking range; for example, the nine image blocks whose average gray values rank 20th to 28th are obtained, as shown in FIG. 4.
  • When the terminal device 100 obtains N image blocks from the K image blocks, in addition to the method shown in B3, it may obtain from the K image blocks N image blocks whose gray average values fall within a preset range. If fewer than N image blocks have a gray average value within the preset range, the actual number of image blocks can be used for the subsequent extraction of texture features. If more than N image blocks have a gray average value within the preset range, several image blocks can be removed at random so that the number reaches N; of course, the actual number of image blocks exceeding N can also be used for the subsequent extraction of texture features.
  • the terminal device 100 extracts the texture features from the N image blocks, where K and N are both positive integers and N is less than or equal to K.
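  • A sketch of this preprocessing (histogram equalization, 10×10 block division, ranking by gray mean, and selecting the blocks ranked 20th to 28th), assuming OpenCV/NumPy:

```python
# Sketch of the preprocessing (assuming OpenCV/NumPy): histogram
# equalization, division into 10x10 image blocks, ranking blocks by mean
# gray value (ascending order is an assumption), and keeping the blocks
# ranked 20th to 28th, as in the example above.
import cv2
import numpy as np

def select_blocks(gray, grid=10, rank_lo=19, rank_hi=28):
    eq = cv2.equalizeHist(gray)                    # histogram equalization
    h, w = eq.shape
    bh, bw = h // grid, w // grid
    blocks = [eq[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
              for r in range(grid) for c in range(grid)]   # K = grid * grid
    order = np.argsort([float(b.mean()) for b in blocks])  # sort by gray mean
    return [blocks[i] for i in order[rank_lo:rank_hi]]     # N selected blocks
```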
  • the terminal device 100 performs binarization processing on the N image blocks respectively to obtain N binarized images.
  • When the N image blocks are binarized to obtain the N binarized images, the N image blocks may first be filtered to smooth and denoise them.
  • The filtering method can be mean filtering, median filtering, Gaussian filtering, bilateral filtering, or other methods. Taking mean filtering as an example, a 91×91 blur kernel can be used for the mean filtering.
  • Binarization is then performed on the filtered N image blocks to obtain the N binarized images. After the binarization processing, the gray value of the texture area is set to 255, and the gray value of other areas is set to 0. For example, see FIG. 5, which is a schematic diagram of a binarized image after an image block has been binarized.
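  • A sketch of the filtering and binarization step, assuming OpenCV; Otsu thresholding with inverted output is an assumption used to set the darker texture lines to 255 and the rest to 0 (the patent does not specify the thresholding rule):

```python
# Sketch of the per-block preprocessing (assuming OpenCV): 91x91 mean
# filtering for smoothing, then binarization. Otsu thresholding with an
# inverted output is an assumption: it maps the darker texture lines to
# 255 and everything else to 0, matching the description above.
import cv2

def binarize_block(block):
    smooth = cv2.blur(block, (91, 91))        # mean filter with 91x91 kernel
    _, binary = cv2.threshold(smooth, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return binary                             # texture area = 255, rest = 0
```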
  • the terminal device 100 performs connected domain analysis on the N binarized images to obtain at least one first connected domain, where the at least one first connected domain is used to indicate the position of the skin texture area in the N image blocks.
  • When performing connected domain analysis on the N binarized images to obtain the at least one first connected domain, the N binarized images may first be eroded and/or dilated, and the connected domain analysis is then performed on the eroded and/or dilated N binarized images to obtain the at least one first connected domain.
  • For example, a 5×5 erosion kernel may be used to erode each of the N binarized images, and connected domain analysis is then performed on the eroded binarized images to obtain the at least one first connected domain.
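  • A sketch of the erosion and connected domain analysis, assuming OpenCV:

```python
# Sketch of the connected-domain step (assuming OpenCV): erode the
# binarized block with a 5x5 kernel, then label the connected domains;
# each labelled region is a candidate first connected domain.
import cv2
import numpy as np

def first_connected_domains(binary):
    eroded = cv2.erode(binary, np.ones((5, 5), np.uint8))  # 5x5 erosion kernel
    n, labels, stats, _ = cv2.connectedComponentsWithStats(eroded)
    # label 0 is the background; labels 1..n-1 mark texture regions
    return eroded, labels, stats[1:]
```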
  • the terminal device 100 extracts the texture feature from the area where the at least one first connected domain is located in the N image blocks.
  • the following exemplarily describes the implementation of determining the texture depth, texture width, wide texture density, or texture density in the texture feature.
  • the terminal device 100 may determine the texture depth according to the average gray value of the area where the at least one first connected domain in the N image blocks is located and the average gray value of the N image blocks.
  • the texture depth can be determined by the following formula (1): F1 = M1 / M (1), where F1 represents the texture depth, M1 represents the average gray value of the area where the at least one first connected domain in the N image blocks is located, and M represents the average gray value of the N image blocks.
  • the terminal device 100 may extract the texture depth of each of the N image blocks, and then use the average value of the texture depths of the N image blocks as the texture depth of the grayscale image.
  • the first image block is any one of the N image blocks.
  • the texture depth of the first image block can be determined by the following formula (2): E1 = X / X1 (2), where E1 represents the texture depth of the first image block, X represents the average gray value of the area where the first connected domain in the first image block is located, and X1 represents the average gray value of the first image block.
  • When the terminal device 100 extracts the texture depth of the grayscale image, it may also regard the N image blocks as a whole: directly determine the gray average value of the area where the first connected domains of the N image blocks are located and the gray average value of the N image blocks, and then determine the texture depth of the grayscale image based on the above formula (1).
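  • A sketch of formula (1) over the N blocks treated as a whole, assuming NumPy; the masks marking the first connected domains are assumed to come from the connected domain step above:

```python
# Sketch of the texture depth F1 = M1 / M (assuming NumPy): compare the
# mean gray value inside the first connected domains with the overall
# mean gray value of the selected blocks.
import numpy as np

def texture_depth(blocks, masks):
    """blocks: list of grayscale blocks; masks: matching boolean arrays
    that are True inside the first connected domains."""
    m1 = np.concatenate([b[m] for b, m in zip(blocks, masks)]).mean()
    m = np.concatenate([b.ravel() for b in blocks]).mean()
    return m1 / m     # formula (1)
```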
  • the terminal device 100 may determine the texture width according to the length and area of the outer contour of the second connected domain in the at least one first connected domain.
  • the second connected domain is the first connected domain with the longest outer contour length or the largest area in the at least one first connected domain.
  • the texture width can be determined by the following formula (3) or formula (4): F2 = F1 × 2S1 / (L1 + L0) (3), or F2 = 2S1 / (L1 + L0) (4), where F2 represents the texture width, F1 represents the texture depth, S1 represents the area of the second connected domain, L1 represents the length of the outer contour of the second connected domain, and L0 represents the sum of the lengths of the inner contours of the second connected domain.
  • In one example shown in FIG. 6, the white area is the second connected domain, L2 is the length of the outer ring of the white area, and L0 is equal to the length of the inner ring of the white area, that is, the length of the black ellipse enclosed by the white area; the area S1 of the second connected domain is the area of the white area, that is, the area of the ring region.
  • In another example shown in FIG. 6, the white area is the second connected domain, L2 is the length of the outer ring of the white area, and L0 is equal to the sum of the lengths of the two black ellipses enclosed by the white area, that is, the sum of L3 and L4.
  • the texture width can be determined by the following formula (5) or formula (6): F2 = F1 × 2S1 / L1 (5), or F2 = 2S1 / L1 (6), where F2 represents the texture width, F1 represents the texture depth, S1 represents the area of the second connected domain, and L1 represents the length of the outer contour of the second connected domain.
  • In this case, S1 is the area of the white region, and L1 is the outer contour perimeter of the white region.
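  • A sketch of the texture width for the second connected domain, assuming OpenCV and the reconstructed form F2 = 2S1 / (L1 + L0) (multiply by F1 for the variant that uses the texture depth):

```python
# Sketch of the texture width (assuming OpenCV, and assuming the
# reconstructed form F2 = 2*S1 / (L1 + L0); multiply by F1 for the
# texture-depth variant).
import cv2

def texture_width(domain_mask):
    """domain_mask: uint8 image, 255 inside the second connected domain."""
    contours, hierarchy = cv2.findContours(domain_mask, cv2.RETR_CCOMP,
                                           cv2.CHAIN_APPROX_NONE)
    if hierarchy is None:
        return 0.0
    outer = [c for c, h in zip(contours, hierarchy[0]) if h[3] < 0]
    inner = [c for c, h in zip(contours, hierarchy[0]) if h[3] >= 0]
    s1 = sum(cv2.contourArea(c) for c in outer)        # S1: domain area
    l1 = sum(cv2.arcLength(c, True) for c in outer)    # L1: outer contour
    l0 = sum(cv2.arcLength(c, True) for c in inner)    # L0: inner contours
    return 2.0 * s1 / (l1 + l0) if (l1 + l0) > 0 else 0.0
```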
  • the terminal device 100 may determine, from the at least one first connected domain, at least one third connected domain whose outer contour length is greater than a preset threshold, then determine the K image blocks containing the third connected domains among the N image blocks, determine the area sum of the third connected domains included in each of the K image blocks, and take the ratio of each area sum to the area sum of the N image blocks to obtain K ratios; finally, the average of the K ratios multiplied by the above texture depth is determined as the wide texture density, or the average of the K ratios is determined as the wide texture density.
  • the wide texture density can be determined by the following formula (7) or formula (8):
  • F3 represents the wide texture density
  • S2 represents the area sum of the third connected domain included in the K image blocks.
  • there are 3 image blocks including the third connected domain which are image block 1, image block 2, and image block 3.
  • Image block 1 includes 2 third connected domains
  • image block 2 includes 1 third connected domain.
  • 3 includes 3 third connected domains
  • S2 is the area of the 2 third connected domains in image block 1, the area of 1 third connected domain in image block 2, and the 3 third connected domains in image block 3.
  • the sum of the area of the domain that is, the sum of the area of the 6 third connected domains.
  • S represents the area sum of N image blocks.
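The exact forms of formulas (7) and (8) are likewise not reproduced here; the sketch below implements the two variants as just described, with illustrative names and the assumption that each per-block area sum is divided by the total area S of the N image blocks.

```python
import numpy as np

def wide_texture_density(per_block_s2, total_area_s, f1=None):
    """Sketch of formulas (7)/(8) as described above.

    per_block_s2: for each of the K image blocks that contain a third
                  connected domain, the summed area S2 of those domains.
    total_area_s: S, the area sum of the N image blocks.
    f1:           texture depth; if given, the averaged ratio is scaled by it.
    """
    ratios = [s2 / total_area_s for s2 in per_block_s2]  # the K ratios
    f3 = float(np.mean(ratios))
    return f3 * f1 if f1 is not None else f3
```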
  • The texture density can be determined by the following formula (9) or formula (10):
  • F4 represents the texture density.
  • S3 represents the sum of the areas of the first connected domains included in the N image blocks.
  • S is the sum of the areas of the N image blocks.
  • For example, suppose N is 4, giving image block 1 to image block 4.
  • Image block 1 includes 2 first connected domains.
  • Image block 2 includes 3 first connected domains.
  • Image block 3 includes 3 first connected domains.
  • Image block 4 includes 1 first connected domain.
  • S3 is then the sum of the areas of the 2 first connected domains in image block 1, the 3 first connected domains in image block 2, the 3 first connected domains in image block 3, and the 1 first connected domain in image block 4, that is, the sum of the areas of the 9 first connected domains.
  • S represents the area sum of the N image blocks. A hedged code sketch of this computation follows this list.
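As with the other features, the exact formulas are not reproduced in this text; the following sketch implements the ratio described above, with the scaling-by-depth variant included as an assumption.

```python
def texture_density(s3, s, f1=None):
    """Sketch of formulas (9)/(10) as described above: F4 = S3 / S, where S3
    is the summed area of the first connected domains in the N image blocks
    and S is the summed area of the N image blocks; one variant is assumed
    to scale this ratio by the texture depth F1.
    """
    f4 = s3 / s
    return f4 * f1 if f1 is not None else f4
```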
  • Alternatively, the texture features of the grayscale image may be determined as follows. Perform mean filtering on the grayscale image, binarize the mean-filtered image, and perform connected domain analysis on the binarized image to obtain at least one connected domain; then extract the texture features from the areas where the at least one connected domain is located. For example, when determining the texture depth, the ratio of the gray value at the connected domain positions in the grayscale image to the gray value of the grayscale image can be used as the texture depth. When determining the texture width, it can be determined according to the connected domain with the largest area or the longest outer contour length among the at least one connected domain; for the specific determination method, refer to formula (3) or formula (4).
  • When determining the wide texture density, connected domains whose outer contour length is greater than a preset threshold can be determined from the at least one connected domain, and the ratio between the area sum of those connected domains and the area of the grayscale image is determined as the wide texture density; alternatively, that ratio multiplied by the above texture depth is determined as the wide texture density.
  • When determining the texture density, the ratio between the area sum of the at least one connected domain and the area of the grayscale image may be determined as the texture density, or that ratio multiplied by the above texture depth may be determined as the texture density. A hedged code sketch of this whole-image pipeline follows this list.
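The preprocessing pipeline just described can be sketched as follows; the kernel size and the inverted Otsu threshold are illustrative choices that the text does not fix, and the function name is hypothetical.

```python
import cv2

def analyze_texture(gray):
    """Sketch of the whole-image variant: mean filtering, binarization, and
    connected domain analysis.
    """
    smoothed = cv2.blur(gray, (5, 5))  # mean filtering
    # Dark skin furrows become foreground under inverted Otsu thresholding
    # (an assumed choice; the text only says "binarization processing").
    _, binary = cv2.threshold(
        smoothed, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    num, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
    # stats[1:, cv2.CC_STAT_AREA] holds the area of each connected domain
    # (row 0 is the background); labels maps each pixel to its domain.
    return binary, labels, stats
```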
  • The terminal device 100 may use an ensemble learning algorithm model to determine the roughness of the skin in the skin image to be processed according to the texture features.
  • The ensemble learning algorithm can be, for example, AdaBoost, Boosting, or bagging (bootstrap aggregating).
  • The ensemble learning algorithm model can be trained in the following way:
  • Multiple skin experts each determine the roughness of the back of the hand of each person in a test population, that is, they score each back of the hand; a 100-point, 10-point, or 1-point scale can be used.
  • The average of the scores given to a person by the multiple skin experts is used as the label for the roughness of that person's back of the hand.
  • With these labels, the preset ensemble learning algorithm model is trained, and the trained model can be used as the model for detecting the roughness of the skin in the skin image to be processed in the embodiments of the present application.
  • Using this skin expert scoring system and AdaBoost, a correlation coefficient of 0.88 was obtained on the test set through cross-validation, which preliminarily verifies the feasibility of the algorithm. A hedged training sketch follows this list.
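The training procedure described above might look like the following sketch; the regressor hyperparameters, the 5-fold cross-validation, and the feature layout (one row of texture depth, texture width, wide texture density, and texture density per image) are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.model_selection import cross_val_predict

def train_roughness_model(features, expert_scores):
    """Regress texture features onto averaged expert scores with AdaBoost.

    features:      array of shape (n_images, 4), holding depth, width, wide
                   texture density, and texture density per image (assumed layout).
    expert_scores: array of shape (n_images,), the averaged expert labels.
    """
    model = AdaBoostRegressor(n_estimators=100, random_state=0)
    # Cross-validated predictions, correlated with the labels in the same
    # spirit as the 0.88 test-set figure reported above.
    preds = cross_val_predict(model, features, expert_scores, cv=5)
    corr = np.corrcoef(preds, expert_scores)[0, 1]
    model.fit(features, expert_scores)  # final model trained on all data
    return model, corr
```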
  • The test results are shown in Figure 7 and Figure 8.
  • FIG. 7 shows the test result of the method that divides the image into blocks.
  • FIG. 8 shows the test result of the method that does not divide the image into blocks.
  • The x-axis in Figures 7 and 8 represents the skin experts' scores on the test set data, and the y-axis represents the model's test results.
  • The lines in Figures 7 and 8 are the ideal fitting results; the closer the set of black points is to the red line, the higher the correlation.
  • The correlation test result in Fig. 7 is 0.73, and the correlation test result in Fig. 8 is 0.88.
  • The method that segments the image into blocks performs better than the method that does not segment the image into blocks.
  • The terminal device 100 displays the roughness value (that is, the scoring result) on the display screen, and may also display care suggestions for the back of the hand; see, for example, Figure 9.
  • The method provided in the embodiments of the present application has been introduced above from the perspective of an electronic device as the execution subject.
  • The electronic device may include a hardware structure and/or a software module, and implement the above functions in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether a given function is executed by a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application and the design constraints of the technical solution.
  • FIG. 10 shows an electronic device 1000 provided in this application.
  • the electronic device 1000 includes at least one processor 1010 and a memory 1020, and may also include a display screen 1030 and a camera 1040.
  • the processor 1010 is coupled with the memory 1020, the display screen 1030, and the camera 1040.
  • The coupling in the embodiments of the present application is an indirect coupling or communication connection between devices, units, or modules, which may be in electrical, mechanical, or other forms and is used for information exchange between the devices, units, or modules.
  • the memory 1020 is used to store program instructions
  • the camera 1040 is used to capture images
  • the display screen 1030 is used to display a photo preview interface when the camera 1040 starts shooting.
  • the photo preview interface includes the images collected by the camera 1040.
  • the display screen 1030 may also be used to display the user interfaces involved in the above embodiments, such as the user interface shown in FIG. 3, the interface shown in FIG. 9, and so on.
  • The processor 1010 is configured to call and execute the program instructions stored in the memory 1020 to perform the steps of the skin roughness detection method in the above embodiments.
  • The electronic device 1000 can be used to implement the skin roughness detection method shown in FIG. 2 in the embodiments of the present application; for related features, refer to the description above, which is not repeated here.
  • The embodiments of the present application may be implemented by hardware, firmware, or a combination thereof.
  • When implemented in software, the above functions may be stored in a computer-readable medium or transmitted as one or more instructions or codes on a computer-readable medium.
  • The computer-readable medium includes a computer storage medium and a communication medium, where the communication medium includes any medium that facilitates the transfer of a computer program from one place to another.
  • The storage medium may be any available medium that can be accessed by a computer.
  • Computer-readable media can include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • Any connection may properly be termed a computer-readable medium.
  • Disks and discs, as used herein, include compact discs (CDs), laser discs, optical discs, digital video discs (DVDs), floppy disks, and Blu-ray discs; disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the protection scope of computer-readable media.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Quality & Reliability (AREA)
  • Radiology & Medical Imaging (AREA)
  • Dermatology (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

A skin roughness measurement method. Based on a texture depth, a texture width, a wide texture density, and a texture density extracted from the skin to be measured, a skin roughness measurement result is obtained by means of a machine learning model, so that a more accurate and more intuitive skin roughness measurement result is obtained.
PCT/CN2020/115973 2019-09-18 2020-09-17 Skin roughness measurement method and electronic device WO2021052436A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910882738.8 2019-09-18
CN201910882738.8A CN112603259B (zh) 2019-09-18 Skin roughness detection method and electronic device

Publications (1)

Publication Number Publication Date
WO2021052436A1 true WO2021052436A1 (fr) 2021-03-25

Family

ID=74883387

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/115973 WO2021052436A1 (fr) 2019-09-18 2020-09-17 Skin roughness measurement method and electronic device

Country Status (2)

Country Link
CN (1) CN112603259B (fr)
WO (1) WO2021052436A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117392121B (zh) * 2023-12-07 2024-03-08 Xi'an Dingfu Shifang Network Intelligent Technology Co., Ltd. Transdermal drug delivery treatment control method and system based on image recognition

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004230117A (ja) * 2003-01-30 2004-08-19 Pola Chem Ind Inc Method for discriminating wrinkle replica images
CN102036607A (zh) * 2008-05-23 2011-04-27 Pola Chemical Industries, Inc. Method for automatically judging skin texture and/or wrinkles
CN105844236A (zh) * 2016-03-22 2016-08-10 Chongqing Medical University Age testing method based on skin image information processing
CN106384075A (zh) * 2016-03-11 2017-02-08 Amorepacific Corporation Apparatus and method for evaluating skin texture based on skin blobs
CN109299632A (zh) * 2017-07-25 2019-02-01 Shanghai Zhongke Dingxin Medical Imaging Technology Co., Ltd. Skin detection method, system, device, and storage medium
CN110008925A (zh) * 2019-04-15 2019-07-12 Institute of Dermatology, Chinese Academy of Medical Sciences Automatic skin detection method based on ensemble learning

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4909686B2 (ja) * 2006-09-08 2012-04-04 Tokyo University of Science Epidermal tissue quantification device and program
TW201944981A (zh) * 2014-11-10 2019-12-01 Shiseido Co., Ltd. Method for evaluating skin gloss, method for screening skin gloss improving agents, and skin gloss improving agent
CN107157447B (zh) * 2017-05-15 2020-03-20 Beijing Technology and Business University Method for detecting skin surface roughness based on the image RGB color space
KR102049040B1 (ko) * 2017-08-07 2019-11-27 Jesus Hospital Foundation Roughness diagnosis apparatus for a living body
CN108154510A (zh) * 2018-01-17 2018-06-12 Shenzhen Yitu Vision Automation Technology Co., Ltd. Product surface defect detection method and apparatus, and computer-readable storage medium
CN109325468B (zh) * 2018-10-18 2022-06-03 Guangzhou Zhiyan Technology Co., Ltd. Image processing method and apparatus, computer device, and storage medium
CN110210448B (zh) * 2019-06-13 2022-09-13 Guangzhou Nali Biotechnology Co., Ltd. Intelligent method for identifying and evaluating the degree of facial skin aging


Also Published As

Publication number Publication date
CN112603259B (zh) 2022-04-19
CN112603259A (zh) 2021-04-06

Similar Documents

Publication Publication Date Title
CN112215802B (zh) Skin detection method and electronic device
US20220180485A1 (en) Image Processing Method and Electronic Device
WO2020134877A1 (fr) Skin detection method and electronic device
CN113170037B (zh) Method for capturing long-exposure images and electronic device
WO2020015144A1 (fr) Photographing method and electronic device
CN114140365B (zh) Event-frame-based feature point matching method and electronic device
CN111566693B (zh) Wrinkle detection method and electronic device
WO2021052436A1 (fr) Skin roughness measurement method and electronic device
CN111417982B (zh) Color spot detection method and electronic device
CN115437601B (zh) Image sorting method, electronic device, program product, and medium
CN114241347A (zh) Skin sensitivity display method and apparatus, electronic device, and readable storage medium
CN111557007B (zh) Method for detecting open or closed eye state and electronic device
WO2023005318A1 (fr) Method for evaluating the quality of a physiological detection signal, electronic device, and storage medium
CN111460942B (zh) Proximity detection method and apparatus, computer-readable medium, and terminal device
CN115633114A (zh) Method and apparatus for displaying contact-list letters, and terminal device
WO2024082976A1 (fr) OCR recognition method for a text image, electronic device, and medium
CN117499797B (zh) Image processing method and related device
CN114945176B (zh) Clipboard access control method, electronic device, and storage medium
CN117726543A (zh) Image processing method and apparatus
CN117710701A (zh) Object tracking method and apparatus, and electronic device
CN116030787A (zh) Age-based sound generation method and apparatus
CN117710695A (zh) Image data processing method and electronic device
CN117523077A (zh) Virtual avatar generation method and apparatus
CN116157842A (zh) Method and apparatus for reconstructing a three-dimensional model of a person

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20866410

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20866410

Country of ref document: EP

Kind code of ref document: A1