WO2021190387A1 - Method for outputting detection result, electronic device and medium - Google Patents

Method for outputting detection result, electronic device and medium

Info

Publication number
WO2021190387A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
target
detection result
information
electronic device
Prior art date
Application number
PCT/CN2021/081452
Other languages
English (en)
French (fr)
Inventor
王强
Original Assignee
Vivo Mobile Communication Co., Ltd. (维沃移动通信有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co., Ltd.
Priority to KR1020227036902A priority Critical patent/KR20220157485A/ko
Priority to EP21776269.9A priority patent/EP4131067A4/en
Priority to JP2022557197A priority patent/JP7467667B2/ja
Publication of WO2021190387A1 publication Critical patent/WO2021190387A1/zh
Priority to US17/948,530 priority patent/US20230014409A1/en

Classifications

    • G06F18/22 Matching criteria, e.g. proximity measures
    • A61B5/441 Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/442 Evaluating skin mechanical properties, e.g. elasticity, hardness, texture, wrinkle assessment
    • A61B5/0033 Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B5/0064 Body surface scanning
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/1112 Global tracking of patients, e.g. by using GPS
    • A61B5/1176 Recognition of faces
    • A61B5/7221 Determining signal validity, reliability or quality
    • A61B5/7246 Details of waveform analysis using correlation, e.g. template matching or determination of similarity
    • G06T7/0012 Biomedical image inspection
    • G06T7/0014 Biomedical image inspection using an image reference approach
    • G06V10/60 Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V40/172 Classification, e.g. identification
    • G16H30/20 ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H30/40 ICT specially adapted for processing medical images, e.g. editing
    • G16H40/63 ICT specially adapted for the operation of medical equipment or devices, for local operation
    • G16H50/20 ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H50/70 ICT specially adapted for mining of medical data, e.g. analysing previous cases of other patients
    • G06T2207/30088 Skin; Dermal
    • G06T2207/30168 Image quality inspection
    • G06T2207/30196 Human being; Person
    • G06T2207/30201 Face

Definitions

  • the embodiments of the present invention relate to the field of Internet technology, and in particular to a method, electronic equipment, and media for outputting detection results.
  • electronic devices can obtain the user's skin quality data by collecting the user's facial image information, and then detect the user's skin quality data, generate the skin quality detection result, and display it to the user.
  • In the prior art, the electronic device obtains multiple images captured from different angles.
  • When the electronic device in the prior art detects the skin quality information in the captured images, it performs skin quality detection on each captured image from a different angle to obtain the skin quality detection result.
  • However, the skin quality detection results of the same user generally do not differ with shooting angle.
  • Therefore, performing skin quality detection on every captured image, as in the prior art, greatly reduces the output speed of the detection result of the electronic device, and the user experience is poor.
  • the embodiments of the present invention provide a detection result output method, electronic device, and medium, which can increase the output speed of the detection result of the electronic device and improve the user experience.
  • In a first aspect, an embodiment of the present invention provides a method for outputting a detection result, applied to an electronic device, including: acquiring first image information of a first object, where the first image information includes skin quality information of the first object and a first image; outputting a target detection result when the degree of matching between the first image and a target image meets a first preset condition; and outputting a first detection result when the degree of matching does not meet the first preset condition, where the target detection result is the detection result corresponding to the target image, and the first detection result is the detection result corresponding to the first image.
  • an electronic device including:
  • An acquiring module for acquiring first image information of a first object, where the first image information includes skin quality information of the first object;
  • the first output module is configured to output the target detection result when the degree of matching between the first image and the target image meets the first preset condition
  • the second output module is used to output the first detection result when the degree of matching between the first image and the target image does not meet the first preset condition, where the target detection result is the detection result corresponding to the target image, and the first detection result is the detection result corresponding to the first image.
  • In addition, an embodiment of the present invention provides an electronic device, including a processor, a memory, and a computer program stored on the memory and runnable on the processor.
  • When the processor executes the computer program, the steps of the detection result output method in the first aspect are implemented.
  • an embodiment of the present invention provides a computer-readable storage medium, and a computer program is stored on the computer-readable storage medium.
  • When the computer program is executed by a processor, the steps of the detection result output method in the first aspect are implemented.
  • In the embodiments of the present invention, the electronic device obtains the first image information including the skin quality information of the first object and determines the degree of matching between the first image and the target image.
  • When the degree of matching does not meet the first preset condition, the first detection result corresponding to the first image is output; when the degree of matching meets the first preset condition, it is no longer necessary to perform skin quality detection on the first object, and the detection result corresponding to the target image is output directly, thereby increasing the output speed of the skin quality detection result of the electronic device and improving the user experience.
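The decision flow summarized above can be sketched in Python. This is an illustrative sketch only: the matching computation, the threshold value, and all function names are assumptions, since the patent leaves them implementation-defined.

```python
# Illustrative sketch of the decision flow; the matching computation and the
# threshold value are assumptions, not specified by the patent.
MATCH_THRESHOLD = 0.8  # assumed value for the first preset condition

def matching_degree(first_image, target_image):
    """Toy matching degree: fraction of identical pixel values (placeholder)."""
    same = sum(1 for x, y in zip(first_image, target_image) if x == y)
    return same / max(len(first_image), 1)

def output_detection_result(first_image, target_image, target_result, detect_skin):
    """Reuse the stored result for the target image when the match is close
    enough; otherwise run a fresh skin-quality detection on the first image."""
    if matching_degree(first_image, target_image) > MATCH_THRESHOLD:
        return target_result          # no new detection needed
    return detect_skin(first_image)   # match too low: detect and output
```

Skipping `detect_skin` on a close match is exactly what yields the claimed speed-up: the stored result is returned without re-running the detection.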
  • FIG. 1 is a schematic flowchart of a method for outputting a detection result according to an embodiment of the present invention
  • FIG. 2 is a schematic flowchart of a method for outputting a detection result according to another embodiment of the present invention.
  • FIG. 3 is a schematic flowchart of a method for outputting a detection result according to another embodiment of the present invention.
  • FIG. 4 is a schematic structural diagram of an electronic device provided by an embodiment of the present invention.
  • Fig. 5 is a schematic diagram of the hardware structure of an electronic device implementing various embodiments of the present invention.
  • The embodiments of the present invention provide a detection result output method, electronic device, and medium that can increase the output speed of the detection result of the electronic device and improve the user experience.
  • FIG. 1 is a schematic flowchart of a method for outputting a detection result provided by an embodiment of the present invention. As shown in FIG. 1, the method for outputting detection results applied to an electronic device includes: S101, S102, and S103.
  • S101 Acquire first image information of a first object.
  • the first image information includes skin quality information of the first object, and the first image information also includes a first image.
  • the first object may be a human
  • the skin quality information may include, but is not limited to, the following items: skin moisture, elasticity index, and oiliness index.
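As a hypothetical illustration, the skin quality items listed above could be carried in a simple record; the field names and units below are assumptions, not defined in the patent.

```python
from dataclasses import dataclass

@dataclass
class SkinQualityInfo:
    """Hypothetical container for the skin quality items named above."""
    moisture: float    # skin moisture (unit is an assumption)
    elasticity: float  # elasticity index
    oiliness: float    # oiliness index

info = SkinQualityInfo(moisture=42.0, elasticity=0.7, oiliness=0.3)
```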
  • the electronic device may obtain the first image of the first object through the camera component, and may also store the first image in an image database corresponding to the first object.
  • the target detection result is the detection result corresponding to the target object.
  • The target image may be the image of the target object most recently acquired and stored in the electronic device, or may be another image corresponding to the target object stored in the electronic device.
  • The degree of matching between the first image and the target image satisfying the first preset condition may mean that the matching degree between the scene information of the first image and the scene information of the target image is greater than a preset matching degree threshold, or that the image similarity between the first object and the target object is greater than a preset similarity threshold.
  • the first image includes a first object
  • the target image includes a target object.
  • the scene information includes at least one of image background information and image shooting location information.
  • The image background can be regarded as the image area outside the object, that is, the part of the first image other than the first object, or the part of the target image other than the target object; the image background information is parameter information such as the brightness, chroma, or sharpness of that part of the image.
  • The image shooting location information is the geographic location of the electronic device when the photo was taken.
  • That the matching degree between the scene information of the first image and the scene information of the target image is greater than the preset matching degree threshold may mean that the matching degree between the brightness of the first image and the brightness of the target image is greater than the preset matching degree threshold; it may also mean that the matching degree between the chromaticity of the first image and the chromaticity of the target image is greater than the preset matching degree threshold; it may also mean that the matching degree between the sharpness of the first image and the sharpness of the target image is greater than the preset matching degree threshold; or it may mean that the matching degree between the image shooting position of the first image and the image shooting position of the target image is greater than the preset matching degree threshold.
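One way to realize the per-parameter comparison described above is to turn each parameter difference into a closeness score in [0, 1] and average the scores. The formula below is an assumption for illustration; the patent only requires that some matching degree per parameter be compared against the preset threshold.

```python
def scene_matching_degree(scene_a, scene_b):
    """Average per-parameter closeness of two scenes (1.0 = identical).

    The normalization is an illustrative choice, not the patent's formula."""
    degrees = []
    for key in ("brightness", "chroma", "sharpness"):
        a, b = scene_a[key], scene_b[key]
        # closeness = 1 - relative difference (assumes non-negative values)
        degrees.append(1.0 - abs(a - b) / max(a, b, 1e-9))
    return sum(degrees) / len(degrees)

same = scene_matching_degree(
    {"brightness": 120, "chroma": 0.5, "sharpness": 0.8},
    {"brightness": 118, "chroma": 0.5, "sharpness": 0.8},
)
# `same` is close to 1.0, so such scenes would pass a high matching threshold.
```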
  • The first object may be the facial image of the user in the first image, and the target object is the facial image of the user in the target image.
  • When the image similarity between the first object and the target object is greater than the preset similarity threshold, it can be determined that the degree of matching between the first image and the target image satisfies the first preset condition.
  • In the embodiment of the present invention, after acquiring the first image information, the electronic device can determine the degree of matching between the first image and the target image. When the degree of matching meets the first preset condition, the two images can be considered to match closely; the skin quality information in the first image no longer needs to be detected, and the target detection result corresponding to the target image is output directly, thereby improving the output efficiency of the detection result of the electronic device.
  • the degree of matching between the first image and the target image that does not satisfy the first preset condition may include any one of the following:
  • the matching degree between the scene information of the first image and the scene information of the target image is less than or equal to the preset matching degree threshold, and the image similarity between the first image and the target image is less than or equal to the preset similarity threshold.
  • When the degree of matching between the first image and the target image does not meet the first preset condition, it can be considered that the degree of matching is low. To ensure the accuracy of the detection result, the skin quality information in the first image can be detected, and the first detection result corresponding to the first image is output.
  • In the embodiments of the present invention, the electronic device obtains the first image information including the skin quality information of the first object and determines the degree of matching between the first image and the target image.
  • When the degree of matching does not meet the first preset condition, the first detection result corresponding to the first image is output; when it meets the first preset condition, skin quality detection no longer needs to be performed on the first object, and the detection result corresponding to the target image is output directly, thereby increasing the output speed of the skin quality detection result and improving the user experience.
  • the output target detection result may also be a detection result obtained after information fusion processing is performed on the first detection result and the second detection result.
  • the first detection result is the detection result corresponding to the skin quality information in the first image
  • the second detection result is the detection result corresponding to the skin quality information in the target image.
  • Since the output target detection result is the detection result obtained after fusing the first detection result with the detection result corresponding to the skin quality information in the target image, large differences between skin quality detection results of the same user in the same scene are avoided to a certain extent, making the target detection result output by the electronic device more accurate and thereby improving the user experience.
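The patent does not specify the fusion operation; a weighted average per skin-quality item is one plausible sketch (the weight value and the item names are assumptions):

```python
def fuse_results(first_result, second_result, weight=0.5):
    """Fuse two skin-quality detection results item by item.

    A weighted average is one plausible 'information fusion'; the patent
    leaves the exact operation unspecified."""
    return {key: weight * first_result[key] + (1 - weight) * second_result[key]
            for key in first_result}

fused = fuse_results({"moisture": 40.0, "oiliness": 0.4},
                     {"moisture": 44.0, "oiliness": 0.2})
```

Averaging damps run-to-run noise, which is why fusing the two results makes same-user, same-scene outputs more stable.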
  • the method for outputting the detection result further includes the following steps:
  • The target image stored here may include the parameter values of the target object in the target image.
  • For example, when the target object is a person, the size and spacing of the facial features of the target object may be stored; parameters of the target image, such as the background information and the shooting position information, may also be stored to improve the comparison speed when comparing with the first image information.
  • the electronic device may also detect the facial image of the first object.
  • FIG. 2 is a schematic flowchart of a detection result output method provided by another embodiment of the present invention.
  • the method includes: S201 to S206.
  • Before the electronic device acquires the first image, it also needs to turn on the image acquisition mode, for example, turn on the camera.
  • S204 Acquire first image information of the first object when the second preset condition is satisfied.
  • The first image information includes the skin quality information of the first object and also includes the first image.
  • the target detection result is the detection result corresponding to the target image.
  • the first detection result is the detection result corresponding to the first image.
  • S204-S206 and S101-S103 are the same steps, which will not be repeated here.
  • the electronic device detects the facial image of the first object before acquiring the first image, so as to obtain an image more suitable for skin quality detection, thereby improving the accuracy of the skin quality detection report.
  • FIG. 3 is a schematic flowchart of a detection result output method provided by another embodiment of the present invention.
  • the method for outputting the detection result includes: S301 to S305.
  • S301 Acquire first image information of a first object.
  • the electronic device can also perform the steps described in S201-S203 above, and perform S301 again when the facial image of the first object meets the second preset condition.
  • S302 Determine whether the matching degree between the scene information of the first image and the scene information of the target image is greater than a preset matching degree threshold. If yes, execute S304, if not, execute S305.
  • The higher the preset matching degree threshold, the more closely the scene information of the first image must match the scene information of the target image for the condition to be met.
  • the preset matching degree threshold can be set according to the requirements of the actual application scenario, and there is no restriction here.
  • S303 Determine whether the image similarity between the first object and the target object is greater than a preset similarity threshold. If yes, execute S304, if not, execute S305.
  • the image histogram may be used for calculation when calculating the image similarity between the first image and the target image.
  • the image histogram is a histogram used to represent the brightness distribution in a digital image, plotting the number of pixels of each brightness value in the image
  • the image similarity between the first image and the target image may be calculated through the image gradient histogram, the image brightness histogram, and the image color histogram.
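A minimal sketch of histogram-based image similarity, using histogram intersection on brightness values. The bin count and the intersection measure are illustrative choices; the patent only names image histograms (gradient, brightness, color) as possible bases for the similarity.

```python
def brightness_histogram(pixels, bins=8, max_value=256):
    """Normalized histogram of brightness values (pixels in [0, max_value))."""
    counts = [0] * bins
    for p in pixels:
        counts[p * bins // max_value] += 1
    total = max(len(pixels), 1)
    return [c / total for c in counts]

def histogram_similarity(pixels_a, pixels_b):
    """Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint."""
    hist_a = brightness_histogram(pixels_a)
    hist_b = brightness_histogram(pixels_b)
    return sum(min(a, b) for a, b in zip(hist_a, hist_b))
```

Comparing histograms rather than raw pixels makes the similarity tolerant of small shifts in framing, which suits comparing two photos of the same face.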
  • the target detection result is the detection result corresponding to the target image.
  • the first detection result is a result corresponding to the skin quality information in the first image.
  • In the embodiments of the present invention, the electronic device obtains the first image information including the skin quality information of the first object and judges the degree of matching between the first image and the target image from the two angles of scene information and image similarity.
  • When the degree of matching between the first image and the target image does not meet the first preset condition, the first detection result corresponding to the first image is output; when it meets the first preset condition, skin quality detection no longer needs to be performed on the first object, and the detection result corresponding to the target image is output directly, thereby increasing the output speed of the skin quality detection result of the electronic device and improving the user experience.
  • the present invention also provides a specific implementation of the electronic device. See Figure 4.
  • FIG. 4 is a schematic structural diagram of an electronic device provided by an embodiment of the present invention. As shown in FIG. 4, the electronic device 400 includes:
  • the obtaining module 410 is configured to obtain first image information of a first object, where the first image information includes skin quality information of the first object;
  • the first output module 420 is configured to output the target detection result when the degree of matching between the first image and the target image meets the first preset condition
  • the second output module 430 is configured to output a first detection result when the degree of matching between the first image and the target image does not meet the first preset condition, where the target detection result is the detection result corresponding to the target image, and the first detection result is the detection result corresponding to the first image.
  • In the embodiment of the present invention, the electronic device obtains the first image information including the skin quality information of the first object and determines the degree of matching between the first image and the target image.
  • When the degree of matching does not meet the first preset condition, the first detection result corresponding to the first image is output; when it meets the first preset condition, skin quality detection no longer needs to be performed on the skin quality information of the first object, and the detection result corresponding to the target image is output directly, thereby increasing the output speed of the skin quality detection result of the electronic device and improving the user experience.
  • the degree of matching between the first image and the target image satisfies the first preset condition, including:
  • the matching degree between the scene information of the first image and the scene information of the target image is greater than a preset matching degree threshold
  • the image similarity between the first object and the target object is greater than a preset similarity threshold, wherein the first image includes the first object, and the target image includes the target object.
  • the scene information includes at least one of the following:
  • image background information and image shooting location information.
  • the electronic device 400 further includes:
  • the detection module is used to detect whether the facial image of the first object meets the second preset condition
  • the obtaining module 410 is specifically used for:
  • acquiring the first image of the first object when the facial image of the first object meets the second preset condition, where the second preset condition includes at least one of the following: the completeness of the facial image is greater than a preset completeness threshold, the illumination value of the facial image is greater than a preset illumination threshold, and the sharpness of the facial image is greater than a preset sharpness threshold.
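The second preset condition above can be sketched as a set of quality gates. The threshold values below are assumptions; the patent also allows checking only a subset ("at least one") of the three criteria, while this sketch checks all of them as the strictest variant.

```python
# Assumed threshold values; the patent only states that such thresholds exist.
COMPLETENESS_THRESHOLD = 0.9   # fraction of the face visible in the frame
ILLUMINATION_THRESHOLD = 50.0  # minimum illumination value
SHARPNESS_THRESHOLD = 0.6      # minimum sharpness score

def meets_second_preset_condition(face):
    """Strictest variant: require all three quality gates to pass before
    acquiring the first image (the patent requires at least one of them)."""
    return (face["completeness"] > COMPLETENESS_THRESHOLD
            and face["illumination"] > ILLUMINATION_THRESHOLD
            and face["sharpness"] > SHARPNESS_THRESHOLD)
```

Gating on image quality before detection is what improves the accuracy of the skin quality report: poorly lit or blurred faces are rejected up front.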
  • the electronic device 400 further includes:
  • the storage module is used to store the target image and the detection result corresponding to the target image.
  • FIG. 5 is a schematic diagram of the hardware structure of an electronic device implementing various embodiments of the present invention.
  • the electronic device 600 includes but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, a power source 611, and other components
  • the electronic device may include more or fewer components than those shown in the figure, or a combination of certain components, or a different arrangement of components
  • electronic devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, and pedometers.
  • the processor 610 is configured to obtain first image information of the first object, where the first image information includes skin quality information of the first object; when the degree of matching between the first image and the target image satisfies the first preset condition, output the target detection result; when the degree of matching between the first image and the target image does not meet the first preset condition, output the first detection result, where the target detection result is the detection result corresponding to the target image and the first detection result is the detection result corresponding to the first image
  • the electronic device obtains first image information including the skin quality information of the first object, and determines the degree of matching between the first image and the target image
  • when the degree of matching does not meet the first preset condition, the first detection result corresponding to the first image is output
  • when the degree of matching between the first image and the target image meets the first preset condition, there is no need to perform skin quality detection on the first object; the detection result corresponding to the target object is output directly, thereby improving the output speed of the skin quality detection result of the electronic device and improving the user experience
  • the radio frequency unit 601 can be used to receive and send signals during information transmission or a call; specifically, it receives downlink data from the base station and passes it to the processor 610 for processing, and it sends uplink data to the base station
  • the radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 601 can also communicate with the network and other devices through a wireless communication system.
  • the electronic device provides users with wireless broadband Internet access through the network module 602, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 603 can convert the audio data received by the radio frequency unit 601 or the network module 602 or stored in the memory 609 into audio signals and output them as sounds. Moreover, the audio output unit 603 may also provide audio output related to a specific function performed by the electronic device 600 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 603 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 604 is used to receive audio or video signals.
  • the input unit 604 may include a graphics processing unit (GPU) 6041 and a microphone 6042.
  • the graphics processor 6041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode
  • the processed image frame may be displayed on the display unit 606.
  • the image frame processed by the graphics processor 6041 may be stored in the memory 609 (or other storage medium) or sent via the radio frequency unit 601 or the network module 602.
  • the microphone 6042 can receive sound, and can process such sound into audio data.
  • in telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 601 and output
  • the electronic device 600 further includes at least one sensor 605, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 6061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 6061 and/or the backlight when the electronic device 600 is moved to the ear
  • the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the electronic device (such as landscape/portrait switching, related games, magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer or tapping); the sensor 605 may also include a fingerprint sensor, pressure sensor, iris sensor, molecular sensor, gyroscope, barometer, hygrometer, thermometer, infrared sensor, etc., which will not be repeated here
  • the display unit 606 is used to display information input by the user or information provided to the user.
  • the display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 607 may be used to receive inputted number or character information, and generate key signal input related to user settings and function control of the electronic device.
  • the user input unit 607 includes a touch panel 6071 and other input devices 6072.
  • the touch panel 6071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 6071 with a finger, a stylus, or any other suitable object or accessory)
  • the touch panel 6071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the position of the user's touch and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 610, and receives and executes commands sent by the processor 610
  • the touch panel 6071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 607 may also include other input devices 6072.
  • other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 6071 can cover the display panel 6061.
  • when the touch panel 6071 detects a touch operation on or near it, it transmits the operation to the processor 610 to determine the type of the touch event, and the processor 610 then provides a corresponding visual output on the display panel 6061 according to the type of the touch event
  • although the touch panel 6071 and the display panel 6061 are used as two independent components to implement the input and output functions of the electronic device, in some embodiments the touch panel 6071 and the display panel 6061 can be integrated to implement the input and output functions of the electronic device, which is not specifically limited here
  • the interface unit 608 is an interface for connecting an external device and the electronic device 600.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 608 can be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements within the electronic device 600, or to transfer data between the electronic device 600 and an external device
  • the memory 609 can be used to store software programs and various data.
  • the memory 609 may mainly include a storage program area and a storage data area.
  • the storage program area may store an operating system and the application programs required by at least one function (such as a sound playback function or an image playback function); the storage data area may store data created through use of the mobile phone (such as audio data or a phone book)
  • the memory 609 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices.
  • the processor 610 is the control center of the electronic device; it connects the various parts of the entire electronic device using various interfaces and lines, and by running or executing the software programs and/or modules stored in the memory 609 and calling the data stored in the memory 609, it performs the various functions of the electronic device and processes data, so as to monitor the electronic device as a whole
  • the processor 610 may include one or more processing units; preferably, the processor 610 may integrate an application processor, which mainly handles the operating system, user interface, and application programs, and a modem processor, which mainly handles wireless communication; it can be understood that the modem processor may also not be integrated into the processor 610
  • the electronic device 600 may also include a power source 611 (such as a battery) for supplying power to the various components
  • the power source 611 may be logically connected to the processor 610 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system
  • the electronic device 600 includes some functional modules not shown, which will not be repeated here.
  • the embodiment of the present invention also provides an electronic device, including a processor 610, a memory 609, and a computer program stored in the memory 609 and runnable on the processor 610
  • when the computer program is executed by the processor 610, the various processes of the above detection result output method embodiments are realized, with the same technical effects; to avoid repetition, details are not repeated here
  • the embodiment of the present invention also provides an electronic device, which is configured to perform each process of the foregoing detection result output method embodiment, and can achieve the same technical effect. In order to avoid repetition, it will not be repeated here.
  • the embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored
  • examples of the computer-readable storage medium include non-transitory computer-readable storage media, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, and an optical disc
  • the embodiment of the present invention also provides a computer program product, which can be executed by a processor to realize the various processes of the above detection result output method embodiments, with the same technical effects; to avoid repetition, details are not repeated here
  • such a processor can be, but is not limited to, a general-purpose processor, a dedicated processor, a special application processor, or a field-programmable logic circuit; it can also be understood that each block in the block diagram and/or flowchart, and combinations of blocks in the block diagram and/or flowchart, can be implemented by dedicated hardware that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions
  • the technical solution of the present invention, in essence or in the part contributing to the existing technology, can be embodied in the form of a software product; the computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to cause a terminal (which can be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to execute the method described in each embodiment of the present invention

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Theoretical Computer Science (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Epidemiology (AREA)
  • Artificial Intelligence (AREA)
  • Primary Health Care (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Dermatology (AREA)
  • Physiology (AREA)
  • Quality & Reliability (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Business, Economics & Management (AREA)
  • Dentistry (AREA)

Abstract

A detection result output method, an electronic device and a medium. The detection result output method includes: acquiring first image information of a first object (S101), where the first image information includes skin quality information of the first object; when the degree of matching between a first image and a target image satisfies a first preset condition, outputting a target detection result (S102); and when the degree of matching between the first image and the target image does not satisfy the first preset condition, outputting a first detection result (S103), where the target detection result is the detection result corresponding to the target image and the first detection result is the detection result corresponding to the first image.

Description

Detection result output method, electronic device and medium
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to Chinese Patent Application No. 202010218894.7, filed in China on March 25, 2020, the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
The embodiments of the present invention relate to the field of Internet technology, and in particular to a detection result output method, an electronic device and a medium.
BACKGROUND
With the continuous development of Internet technology, an electronic device can acquire a user's skin quality data by collecting the user's facial image information, detect the skin quality data, generate a skin quality detection result, and display it to the user.
However, because users shoot from different angles, the electronic device obtains multiple captured images from different angles. In the prior art, when detecting the skin quality information in the captured images, the electronic device performs skin quality detection on every captured image from each angle to obtain a skin quality detection result, even though the skin quality detection result of the same user generally does not differ with the shooting angle.
Therefore, the prior-art approach of performing skin quality detection on every captured image greatly reduces the output speed of the detection result of the electronic device, and the user experience is poor.
SUMMARY
The embodiments of the present invention provide a detection result output method, an electronic device and a medium, which can improve the output speed of the detection result of the electronic device and improve the user experience.
In a first aspect, an embodiment of the present invention provides a detection result output method applied to an electronic device, including:
acquiring first image information of a first object, where the first image information includes skin quality information of the first object;
when the degree of matching between a first image and a target image satisfies a first preset condition, outputting a target detection result;
when the degree of matching between the first image and the target image does not satisfy the first preset condition, outputting a first detection result, where the target detection result is the detection result corresponding to the target image and the first detection result is the detection result corresponding to the first image.
In a second aspect, an embodiment of the present invention provides an electronic device, including:
an acquisition module, configured to acquire first image information of a first object, where the first image information includes skin quality information of the first object;
a first output module, configured to output a target detection result when the degree of matching between a first image and a target image satisfies a first preset condition;
a second output module, configured to output a first detection result when the degree of matching between the first image and the target image does not satisfy the first preset condition, where the target detection result is the detection result corresponding to the target image and the first detection result is the detection result corresponding to the first image.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a memory, and a computer program stored in the memory and runnable on the processor, where the processor, when executing the computer program instructions, implements the steps of the detection result output method of the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps of the detection result output method of the first aspect.
In the embodiments of the present invention, the electronic device acquires first image information of a first object that includes skin quality information and judges the degree of matching between the first image and a target image. When the degree of matching does not satisfy a first preset condition, the first detection result corresponding to the first image is output; when the degree of matching satisfies the first preset condition, the electronic device no longer needs to perform skin quality detection on the first object and instead directly outputs the detection result corresponding to the target object, thereby improving the output speed of the skin quality detection result of the electronic device and improving the user experience.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention can be better understood from the following description of specific embodiments of the present invention in conjunction with the accompanying drawings, in which the same or similar reference numerals denote the same or similar features.
FIG. 1 is a schematic flowchart of a detection result output method provided by an embodiment of the present invention;
FIG. 2 is a schematic flowchart of a detection result output method provided by another embodiment of the present invention;
FIG. 3 is a schematic flowchart of a detection result output method provided by yet another embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an electronic device provided by an embodiment of the present invention;
FIG. 5 is a schematic diagram of the hardware structure of an electronic device implementing various embodiments of the present invention.
DETAILED DESCRIPTION
The technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative work fall within the protection scope of the present invention.
To solve the problems in the prior art, the embodiments of the present invention provide a detection result output method, an electronic device and a medium that can improve the output speed of the detection result of the electronic device and improve the user experience.
FIG. 1 is a schematic flowchart of a detection result output method provided by an embodiment of the present invention. As shown in FIG. 1, the detection result output method applied to an electronic device includes S101, S102 and S103.
S101: acquire first image information of a first object.
The first image information includes skin quality information of the first object, and also includes a first image.
Optionally, in some embodiments of the present invention, the first object may be a person, and the skin quality information may include, but is not limited to, skin moisture, an elasticity index, an oiliness index, and the like.
The electronic device may acquire the first image of the first object through a camera assembly, and may also store the first image in an image database corresponding to the first object.
S102: when the degree of matching between the first image and a target image satisfies a first preset condition, output a target detection result.
The target detection result is the detection result corresponding to the target object.
It can be understood that the target image may be the image of the target object last acquired and stored by the electronic device, or another image corresponding to the target object stored in the electronic device.
Optionally, in some embodiments of the present invention, the degree of matching between the first image and the target image satisfying the first preset condition may mean that the degree of matching between the scene information of the first image and the scene information of the target image is greater than a preset matching degree threshold, or that the image similarity between the first object and the target object is greater than a preset similarity threshold, where the first image includes the first object and the target image includes the target object.
It can be understood that the scene information includes at least one of image background information and image shooting location information. The image background can be regarded as the image region outside the object, that is, the part of the first image other than the first object and the part of the target image other than the target object; the image background information is parameter information of this part of the image, such as its luminance, chrominance or sharpness; and the image shooting location information is the geographical location of the electronic device when the photo was taken.
In some embodiments, the degree of matching between the scene information of the first image and that of the target image being greater than the preset matching degree threshold may mean that the degree of matching between the luminance of the first image and the luminance of the target image is greater than the preset matching degree threshold; or that the degree of matching between the chrominance of the first image and the chrominance of the target image is greater than the preset matching degree threshold; or that the degree of matching between the sharpness of the first image and the sharpness of the target image is greater than the preset matching degree threshold; or that the degree of matching between the shooting location of the first image and the shooting location of the target image is greater than the preset matching degree threshold.
In other embodiments, the first object may be the facial image of the user in the first image and the target object the facial image of the user in the target image; when the image similarity between the first object and the target object is greater than the preset similarity threshold, it can be determined that the degree of matching between the first image and the target image satisfies the first preset condition.
In the embodiment of the present invention, after acquiring the first image information, the electronic device can judge the degree of matching between the first image and the target image. When the degree of matching satisfies the first preset condition, the first image and the target image can be considered highly matched; in this case there is no need to detect the skin quality information in the first image again, and the target detection result corresponding to the target image is output directly, thereby improving the detection result output efficiency of the electronic device.
S103: when the degree of matching between the first image and the target image does not satisfy the first preset condition, output a first detection result, where the first detection result is the detection result corresponding to the first image.
Optionally, in some embodiments of the present invention, the degree of matching between the first image and the target image not satisfying the first preset condition may include either of the following:
the degree of matching between the scene information of the first image and the scene information of the target image is less than or equal to the preset matching degree threshold, or the image similarity between the first image and the target image is less than or equal to the preset similarity threshold.
In the embodiment of the present invention, if the degree of matching between the first image and the target image does not satisfy the first preset condition, the degree of matching between them can be considered low. Therefore, to ensure the accuracy of the skin quality detection result, the skin quality information in the first image can be detected and the first detection result corresponding to the first image is finally output.
In the embodiment of the present invention, the electronic device acquires first image information of the first object that includes skin quality information and judges the degree of matching between the first image and the target image. When the degree of matching does not satisfy the first preset condition, the first detection result corresponding to the first image is output; when it satisfies the first preset condition, skin quality detection on the first object is no longer needed and the detection result corresponding to the target object is output directly, thereby improving the output speed of the skin quality detection result of the electronic device and improving the user experience.
In some embodiments of the present invention, to make the final output target detection result more accurate, the output target detection result may also be the detection result obtained by performing information fusion on the first detection result and a second detection result, where the first detection result is the detection result corresponding to the skin quality information in the first image and the second detection result is the detection result corresponding to the skin quality information in the target image.
In the embodiment of the present invention, because the output target detection result is the result of fusing the first detection result with the detection result corresponding to the skin quality information in the target image, the situation in which skin quality detections performed by the same user in the same scene yield widely different results is avoided to a certain extent, making the target detection result output by the electronic device more accurate and improving the user experience.
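The information fusion step could look like the following sketch. The patent does not specify the fusion strategy, so a simple per-metric weighted average is assumed here, and the metric names (`moisture`, `oiliness`) are illustrative placeholders.

```python
def fuse_detection_results(first_result, target_result, weight_first=0.5):
    """Per-metric weighted average over the skin-quality metrics that both
    detection results share (e.g. moisture, elasticity, oiliness).
    weight_first controls how much the new detection outweighs the cached one."""
    return {metric: weight_first * first_result[metric]
                    + (1 - weight_first) * target_result[metric]
            for metric in first_result.keys() & target_result.keys()}
```

With equal weights, two detections of the same user in the same scene are averaged, smoothing out run-to-run variation in the reported result.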
To obtain the target image and its corresponding detection result in time, in some embodiments of the present invention, before S101, the detection result output method further includes the following step:
storing the target image and the detection result corresponding to the target image.
It can be understood that the stored target image may include parameter values of the target object in the target image. For example, when the target object is a person, parameters such as the size of and distance between the facial features of the target object may be stored; the background information of the target image and the shooting location information of the target image may also be included, so as to increase the comparison speed when comparing with the first image information.
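A minimal sketch of such a per-user cache of the stored comparison parameters and detection result follows. All of the names here (`TargetRecord`, `save_target`, `cached_result`, the field names) are hypothetical illustrations, not identifiers from the patent.

```python
from dataclasses import dataclass


@dataclass
class TargetRecord:
    object_params: dict        # e.g. facial-feature sizes and distances
    background_info: dict      # e.g. luminance/chrominance/sharpness of background
    shooting_location: tuple   # e.g. (latitude, longitude)
    detection_result: dict     # cached skin-quality detection result

_store = {}


def save_target(user_id, record):
    """Store the target image's comparison parameters and cached result."""
    _store[user_id] = record


def cached_result(user_id):
    """Return the stored detection result for this user, or None if absent."""
    record = _store.get(user_id)
    return record.detection_result if record else None
```

Storing only the derived parameters (rather than the full image) is what lets the later comparison against the first image information run quickly.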
In other embodiments of the present invention, before S101, the electronic device may also detect the facial image of the first object; for details, see the detection result output method shown in FIG. 2.
As shown in FIG. 2, FIG. 2 is a schematic flowchart of a detection result output method provided by another embodiment of the present invention. The method includes S201 to S206.
S201: start an image capture mode.
Optionally, in some embodiments of the present invention, before acquiring the first image, the electronic device also needs to start an image capture mode, for example, turn on the camera.
S202: preview the facial image of the first object.
S203: detect whether the facial image of the first object satisfies a second preset condition, where the second preset condition may include at least one of the following: the completeness of the facial image is greater than a preset completeness threshold, the illumination value in the facial image is greater than a preset illumination threshold, and the sharpness of the facial image is greater than a preset sharpness threshold. If so, execute S204; if not, continue to execute S203.
S204: when the second preset condition is satisfied, acquire the first image information of the first object.
The first image includes the skin quality information of the first object, and the first image information also includes the first image.
S205: when the degree of matching between the first image and the target image satisfies the first preset condition, output the target detection result.
The target detection result is the detection result corresponding to the target image.
S206: when the degree of matching between the first image and the target image does not satisfy the first preset condition, output the first detection result.
The first detection result is the detection result corresponding to the first image.
S204 to S206 are the same steps as S101 to S103 and are not repeated here.
In the embodiment of the present invention, by detecting the facial image of the first object before acquiring the first image, the electronic device can obtain an image better suited to skin quality detection, thereby improving the accuracy of the skin quality detection report.
The detection result output method provided by yet another embodiment of the present invention is described in detail below with reference to the schematic flowchart shown in FIG. 3.
As shown in FIG. 3, FIG. 3 is a schematic flowchart of a detection result output method provided by yet another embodiment of the present invention. The detection result output method includes S301 to S305.
S301: acquire first image information of a first object.
It should be understood that, before S301, the electronic device may likewise execute the steps described in S201 to S203 above, and execute S301 only when the facial image of the first object satisfies the second preset condition.
S302: judge whether the degree of matching between the scene information of the first image and the scene information of the target image is greater than a preset matching degree threshold. If so, execute S304; if not, execute S305.
It should be understood that a higher preset matching degree threshold requires a closer match between the scene information of the first image and that of the target image. The preset matching degree threshold can be set according to the requirements of the actual application scenario and is not limited here.
S303: judge whether the image similarity between the first object and the target object is greater than a preset similarity threshold. If so, execute S304; if not, execute S305.
Optionally, in some embodiments of the present invention, an image histogram can be used when calculating the image similarity between the first image and the target image. An image histogram represents the luminance distribution of a digital image, plotting the number of pixels at each luminance value.
For example, the image similarity between the first image and the target image may be calculated through an image gradient histogram, an image luminance histogram, and an image color histogram.
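One concrete way to compare luminance histograms is histogram intersection, sketched below. This is an illustrative stand-in for the histogram comparison described above, not the patent's exact algorithm; gradient and color histograms could be compared the same way.

```python
def luminance_histogram(pixels, bins=256):
    """Normalized luminance histogram: pixel counts per luminance value
    (0..255), divided by the total pixel count."""
    hist = [0] * bins
    for p in pixels:               # pixels: iterable of 0..255 luminance values
        hist[p] += 1
    total = len(pixels)
    return [count / total for count in hist]


def histogram_similarity(pixels_a, pixels_b):
    """Histogram-intersection similarity: 1.0 for identical luminance
    distributions, 0.0 for fully disjoint ones."""
    hist_a = luminance_histogram(pixels_a)
    hist_b = luminance_histogram(pixels_b)
    return sum(min(a, b) for a, b in zip(hist_a, hist_b))
```

The resulting score can be compared against the preset similarity threshold in S303.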
S304: output the target detection result, where the target detection result is the detection result corresponding to the target image.
S305: output the first detection result, where the first detection result is the result corresponding to the skin quality information in the first image.
In the embodiment of the present invention, the electronic device acquires first image information of the first object that includes skin quality information and judges the degree of matching between the first image and the target image from the two angles of scene information and image similarity. When the degree of matching does not satisfy the first preset condition, the first detection result corresponding to the first image is output; when it satisfies the first preset condition, skin quality detection on the first object is no longer needed and the detection result corresponding to the target object is output directly, thereby improving the output speed of the skin quality detection result of the electronic device and improving the user experience.
Based on the specific implementations of the detection result output method provided by the above embodiments, the present invention correspondingly provides a specific implementation of an electronic device; see FIG. 4.
FIG. 4 is a schematic structural diagram of an electronic device provided by an embodiment of the present invention. As shown in FIG. 4, the electronic device 400 includes:
an acquisition module 410, configured to acquire first image information of a first object, where the first image information includes skin quality information of the first object;
a first output module 420, configured to output a target detection result when the degree of matching between a first image and a target image satisfies a first preset condition;
a second output module 430, configured to output a first detection result when the degree of matching between the first image and the target image does not satisfy the first preset condition, where the target detection result is the detection result corresponding to the target image and the first detection result is the detection result corresponding to the first image.
In the embodiment of the present invention, the electronic device acquires first image information of the first object that includes skin quality information and judges the degree of matching between the first image and the target image; when the degree of matching does not satisfy the first preset condition, it outputs the first detection result corresponding to the first image, and when it does, it no longer performs skin quality detection on the skin quality information of the first object but directly outputs the detection result corresponding to the target object, thereby improving the output speed of the skin quality detection result of the electronic device and improving the user experience.
Optionally, in some embodiments of the present invention, the degree of matching between the first image and the target image satisfying the first preset condition includes:
the degree of matching between the scene information of the first image and the scene information of the target image being greater than a preset matching degree threshold;
or, the image similarity between the first object and the target object being greater than a preset similarity threshold, where the first image includes the first object and the target image includes the target object.
Optionally, in some embodiments of the present invention, the scene information includes at least one of the following:
image background information and image shooting location information.
Optionally, in some embodiments of the present invention, the electronic device 400 further includes:
a detection module, configured to detect whether the facial image of the first object satisfies a second preset condition;
the acquisition module 410 being specifically configured to:
acquire the first image of the first object when the facial image satisfies the second preset condition, where the second preset condition includes at least one of the following: the completeness of the facial image is greater than a preset completeness threshold, the illumination value in the facial image is greater than a preset illumination threshold, and the sharpness of the facial image is greater than a preset sharpness threshold.
Optionally, in some embodiments of the present invention, the electronic device 400 further includes:
a storage module, configured to store the target image and the detection result corresponding to the target image.
The hardware structure of an electronic device according to various embodiments of the present invention is described in detail below with reference to FIG. 5.
As shown in FIG. 5, FIG. 5 is a schematic diagram of the hardware structure of an electronic device implementing various embodiments of the present invention.
The electronic device 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, a power source 611 and other components. Those skilled in the art can understand that the electronic device structure shown in FIG. 5 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or have a different arrangement of components. In the embodiments of the present invention, electronic devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, pedometers, and the like.
The processor 610 is configured to acquire first image information of a first object, where the first image information includes skin quality information of the first object; when the degree of matching between a first image and a target image satisfies a first preset condition, output a target detection result; and when the degree of matching between the first image and the target image does not satisfy the first preset condition, output a first detection result, where the target detection result is the detection result corresponding to the target image and the first detection result is the detection result corresponding to the first image.
In the embodiment of the present invention, the electronic device acquires first image information of the first object that includes skin quality information and judges the degree of matching between the first image and the target image; when the degree of matching does not satisfy the first preset condition, it outputs the first detection result corresponding to the first image, and when it does, it no longer performs skin quality detection on the first object but directly outputs the detection result corresponding to the target object, thereby improving the output speed of the skin quality detection result of the electronic device and improving the user experience.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 601 can be used to receive and send signals during information transmission or a call; specifically, it receives downlink data from a base station and passes it to the processor 610 for processing, and sends uplink data to the base station. Generally, the radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 601 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides users with wireless broadband Internet access through the network module 602, for example helping users to send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 603 can convert audio data received by the radio frequency unit 601 or the network module 602, or stored in the memory 609, into an audio signal and output it as sound. Moreover, the audio output unit 603 can also provide audio output related to a specific function performed by the electronic device 600 (for example, a call signal reception sound or a message reception sound). The audio output unit 603 includes a speaker, a buzzer, a receiver, and the like.
The input unit 604 is used to receive audio or video signals. The input unit 604 may include a graphics processing unit (GPU) 6041 and a microphone 6042; the graphics processor 6041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode. The processed image frames can be displayed on the display unit 606. The image frames processed by the graphics processor 6041 can be stored in the memory 609 (or another storage medium) or sent via the radio frequency unit 601 or the network module 602. The microphone 6042 can receive sound and process it into audio data; in telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 601 and output.
The electronic device 600 further includes at least one sensor 605, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 6061 according to the ambient light, and the proximity sensor can turn off the display panel 6061 and/or the backlight when the electronic device 600 is moved to the ear. As a kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the electronic device (such as landscape/portrait switching, related games, magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer or tapping); the sensor 605 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and so on, which will not be repeated here.
The display unit 606 is used to display information input by the user or information provided to the user. The display unit 606 may include a display panel 6061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 607 can be used to receive input numeric or character information and to generate key signal input related to the user settings and function control of the electronic device. Specifically, the user input unit 607 includes a touch panel 6071 and other input devices 6072. The touch panel 6071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 6071 with a finger, a stylus, or any other suitable object or accessory). The touch panel 6071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 610, and receives and executes commands sent by the processor 610. In addition, the touch panel 6071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 6071, the user input unit 607 may also include other input devices 6072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which will not be repeated here.
Further, the touch panel 6071 can cover the display panel 6061. When the touch panel 6071 detects a touch operation on or near it, it transmits the operation to the processor 610 to determine the type of the touch event, and the processor 610 then provides a corresponding visual output on the display panel 6061 according to the type of the touch event. Although in FIG. 5 the touch panel 6071 and the display panel 6061 are two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 6071 and the display panel 6061 can be integrated to implement the input and output functions of the electronic device, which is not limited here.
The interface unit 608 is an interface for connecting an external device to the electronic device 600. For example, the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 608 can be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements within the electronic device 600, or to transfer data between the electronic device 600 and an external device.
The memory 609 can be used to store software programs and various data. The memory 609 may mainly include a program storage area and a data storage area, where the program storage area can store an operating system and the application programs required by at least one function (such as a sound playback function or an image playback function), and the data storage area can store data created through use of the mobile phone (such as audio data or a phone book). In addition, the memory 609 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device.
The processor 610 is the control center of the electronic device. It connects all parts of the entire electronic device using various interfaces and lines, and by running or executing the software programs and/or modules stored in the memory 609 and calling the data stored in the memory 609, it performs the various functions of the electronic device and processes data, thereby monitoring the electronic device as a whole. The processor 610 may include one or more processing units; preferably, the processor 610 may integrate an application processor, which mainly handles the operating system, user interface, and application programs, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 610.
The electronic device 600 may further include a power source 611 (such as a battery) for supplying power to the components; preferably, the power source 611 can be logically connected to the processor 610 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
In addition, the electronic device 600 includes some functional modules that are not shown, which will not be repeated here.
Preferably, an embodiment of the present invention further provides an electronic device, including a processor 610, a memory 609, and a computer program stored in the memory 609 and runnable on the processor 610; when the computer program is executed by the processor 610, the various processes of the above detection result output method embodiments are realized, with the same technical effects, which will not be repeated here to avoid repetition.
An embodiment of the present invention further provides an electronic device configured to perform the various processes of the above detection result output method embodiments, with the same technical effects, which will not be repeated here to avoid repetition.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the various processes of the above detection result output method embodiments are realized, with the same technical effects, which will not be repeated here to avoid repetition. Examples of the computer-readable storage medium include non-transitory computer-readable storage media, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, and an optical disc.
An embodiment of the present invention further provides a computer program product, which can be executed by a processor to realize the various processes of the above detection result output method embodiments, with the same technical effects, which will not be repeated here to avoid repetition.
Aspects of the present invention have been described above with reference to flowcharts and/or block diagrams of methods, apparatuses (systems) and computer program products according to embodiments of the present invention. It should be understood that each block of the flowcharts and/or block diagrams, and combinations of blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions can be provided to a processor of a general-purpose computer, a special-purpose computer, or another programmable data processing apparatus to produce a machine, so that the instructions executed via the processor of the computer or other programmable data processing apparatus enable the implementation of the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams. Such a processor can be, but is not limited to, a general-purpose processor, a dedicated processor, a special application processor, or a field-programmable logic circuit. It can also be understood that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can also be implemented by dedicated hardware that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
It should be noted that, in this document, the terms "comprise", "include" or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article or apparatus including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or apparatus. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article or apparatus that includes the element.
Through the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to cause a terminal (which can be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to execute the method described in each embodiment of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above specific implementations, which are merely illustrative rather than restrictive. Under the inspiration of the present invention, those of ordinary skill in the art can make many other forms without departing from the purpose of the present invention and the scope protected by the claims, all of which fall within the protection of the present invention.

Claims (20)

  1. A detection result output method, applied to an electronic device, comprising:
    acquiring first image information of a first object, wherein the first image information comprises skin quality information of the first object;
    when a degree of matching between a first image and a target image satisfies a first preset condition, outputting a target detection result;
    when the degree of matching between the first image and the target image does not satisfy the first preset condition, outputting a first detection result, wherein the target detection result is a detection result corresponding to the target image, and the first detection result is a detection result corresponding to the first image.
  2. The method according to claim 1, wherein the degree of matching between the first image and the target image satisfying the first preset condition comprises:
    a degree of matching between scene information of the first image and scene information of the target image being greater than a preset matching degree threshold;
    or, an image similarity between the first object and a target object being greater than a preset similarity threshold, wherein the first image comprises the first object, and the target image comprises the target object.
  3. The method according to claim 2, wherein the scene information comprises at least one of the following:
    image background information and image shooting location information.
  4. The method according to claim 3, wherein the image background information comprises at least one of the following:
    luminance information, chrominance information and sharpness information of the part of the image other than the object.
  5. The method according to claim 3, wherein the image shooting location information comprises:
    geographical location information of the electronic device when the image is captured.
  6. The method according to claim 1, wherein before the acquiring first image information of a first object, the method further comprises:
    detecting whether a facial image of the first object satisfies a second preset condition;
    the acquiring a first image of the first object comprises:
    when the facial image satisfies the second preset condition, acquiring the first image information of the first object, wherein the second preset condition comprises at least one of the following: a completeness of the facial image being greater than a preset completeness threshold, an illumination value in the facial image being greater than a preset illumination threshold, and a sharpness of the facial image being greater than a preset sharpness threshold.
  7. The method according to claim 1, wherein before the acquiring a first image of the first object, the method further comprises:
    storing the target image and the target detection result.
  8. The method according to claim 7, wherein the storing the target image comprises:
    storing at least one of a parameter value of the target object, background information of the target image, and shooting location information of the target image.
  9. An electronic device, comprising:
    an acquisition module, configured to acquire first image information of a first object, wherein the first image information comprises skin quality information of the first object;
    a first output module, configured to output a target detection result when a degree of matching between a first image and a target image satisfies a first preset condition;
    a second output module, configured to output a first detection result when the degree of matching between the first image and the target image does not satisfy the first preset condition, wherein the target detection result is a detection result corresponding to the target image, and the first detection result is a detection result corresponding to the first image.
  10. The electronic device according to claim 9, wherein the degree of matching between the first image and the target image satisfying the first preset condition comprises:
    a degree of matching between scene information of the first image and scene information of the target image being greater than a preset matching degree threshold;
    or, an image similarity between the first object and a target object being greater than a preset similarity threshold, wherein the first image comprises the first object, and the target image comprises the target object.
  11. The electronic device according to claim 9, wherein the scene information comprises at least one of the following:
    image background information and image shooting location information.
  12. The electronic device according to claim 11, wherein the image background information comprises at least one of the following:
    luminance information, chrominance information and sharpness information of the part of the image other than the object.
  13. The electronic device according to claim 11, wherein the image shooting location information comprises:
    geographical location information of the electronic device when the image is captured.
  14. The electronic device according to claim 9, further comprising:
    a detection module, configured to detect whether a facial image of the first object satisfies a second preset condition;
    the acquisition module being specifically configured to:
    acquire the first image of the first object when the facial image satisfies the second preset condition, wherein the second preset condition comprises at least one of the following: a completeness of the facial image being greater than a preset completeness threshold, an illumination value in the facial image being greater than a preset illumination threshold, and a sharpness of the facial image being greater than a preset sharpness threshold.
  15. The electronic device according to claim 9, further comprising:
    a storage module, configured to store the target image and the target detection result.
  16. The electronic device according to claim 15, wherein the storage module is specifically configured to:
    store at least one of a parameter value of the target object, background information of the target image, and shooting location information of the target image.
  17. An electronic device, comprising a processor, a memory, and a computer program stored in the memory and runnable on the processor, wherein the computer program, when executed by the processor, implements the steps of the detection result output method according to any one of claims 1 to 8.
  18. An electronic device, configured to perform the steps of the detection result output method according to any one of claims 1 to 8.
  19. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the detection result output method according to any one of claims 1 to 8.
  20. A computer program product, executable by a processor to implement the steps of the detection result output method according to any one of claims 1 to 8.
PCT/CN2021/081452 2020-03-25 2021-03-18 检测结果输出的方法、电子设备及介质 WO2021190387A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
KR1020227036902A KR20220157485A (ko) 2020-03-25 2021-03-18 검출 결과 출력 방법, 전자 장치 및 매체
EP21776269.9A EP4131067A4 (en) 2020-03-25 2021-03-18 DETECTION RESULT OUTPUT METHOD, ELECTRONIC DEVICE, AND MEDIUM
JP2022557197A JP7467667B2 (ja) 2020-03-25 2021-03-18 検出結果出力方法、電子機器及び媒体
US17/948,530 US20230014409A1 (en) 2020-03-25 2022-09-20 Detection result output method, electronic device and medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010218894.7 2020-03-25
CN202010218894.7A CN111401463B (zh) 2020-03-25 2020-03-25 检测结果输出的方法、电子设备及介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/948,530 Continuation US20230014409A1 (en) 2020-03-25 2022-09-20 Detection result output method, electronic device and medium

Publications (1)

Publication Number Publication Date
WO2021190387A1 true WO2021190387A1 (zh) 2021-09-30

Family

ID=71429264

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/081452 WO2021190387A1 (zh) 2020-03-25 2021-03-18 检测结果输出的方法、电子设备及介质

Country Status (6)

Country Link
US (1) US20230014409A1 (zh)
EP (1) EP4131067A4 (zh)
JP (1) JP7467667B2 (zh)
KR (1) KR20220157485A (zh)
CN (1) CN111401463B (zh)
WO (1) WO2021190387A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113762220A (zh) * 2021-11-03 2021-12-07 通号通信信息集团有限公司 目标识别方法、电子设备、计算机可读存储介质
CN114253612A (zh) * 2021-11-25 2022-03-29 上海齐感电子信息科技有限公司 控制方法及控制系统

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111401463B (zh) * 2020-03-25 2024-04-30 维沃移动通信有限公司 检测结果输出的方法、电子设备及介质
CN115131698B (zh) * 2022-05-25 2024-04-12 腾讯科技(深圳)有限公司 视频属性确定方法、装置、设备及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109558773A (zh) * 2017-09-26 2019-04-02 阿里巴巴集团控股有限公司 信息识别方法、装置及电子设备
CN109840476A (zh) * 2018-12-29 2019-06-04 维沃移动通信有限公司 一种脸型检测方法及终端设备
CN110298304A (zh) * 2019-06-27 2019-10-01 维沃移动通信有限公司 一种皮肤检测方法及终端
US20190340421A1 (en) * 2018-05-01 2019-11-07 Qualcomm Incorporated Face recognition in low light conditions for unlocking an electronic device
CN111401463A (zh) * 2020-03-25 2020-07-10 维沃移动通信有限公司 检测结果输出的方法、电子设备及介质

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07302327A (ja) * 1993-08-11 1995-11-14 Nippon Telegr & Teleph Corp <Ntt> Object image detection method and detection apparatus
JP5226590B2 (ja) 2009-04-02 2013-07-03 キヤノン株式会社 Image analysis apparatus, image processing apparatus, and image analysis method
US8330807B2 (en) * 2009-05-29 2012-12-11 Convergent Medical Solutions, Inc. Automated assessment of skin lesions using image library
EP3016005A1 (en) * 2014-10-31 2016-05-04 Korea Advanced Institute Of Science And Technology Method and system for offering of similar photos in real time based on photographing context information of a geographically proximate user group
CN104490361A (zh) * 2014-12-05 2015-04-08 深圳市共创百业科技开发有限公司 Network-hospital-based remote skin disease screening system and method
JP6598480B2 (ja) * 2015-03-24 2019-10-30 キヤノン株式会社 Image processing apparatus, image processing method, and program
KR102227319B1 (ko) * 2016-10-31 2021-03-11 가부시끼가이샤 디디에스 Skin information processing program and skin information processing apparatus
JP2018081402A (ja) 2016-11-15 2018-05-24 キヤノン株式会社 Image processing apparatus, image processing method, and program
CN107194158A (zh) * 2017-05-04 2017-09-22 深圳美佳基因科技有限公司 Image-recognition-based auxiliary disease diagnosis method
KR102235386B1 (ko) * 2017-07-07 2021-04-01 삼성에스디에스 주식회사 Scene change detection apparatus and method
WO2019047224A1 (zh) * 2017-09-11 2019-03-14 深圳市得道健康管理有限公司 Cloud-computing-platform-based traditional Chinese medicine thermal imaging auxiliary diagnosis system and method
CN108063884B (zh) * 2017-11-15 2021-02-26 维沃移动通信有限公司 Image processing method and mobile terminal
KR102052722B1 (ko) * 2017-11-30 2019-12-11 주식회사 룰루랩 Portable skin condition measurement apparatus including a polarizing film, and skin condition diagnosis and management system
US10839238B2 (en) * 2018-03-23 2020-11-17 International Business Machines Corporation Remote user identity validation with threshold-based matching
CN109198998A (zh) * 2018-08-27 2019-01-15 微云(武汉)科技有限公司 Smart electronic mirror and display method thereof
CN109330560A (zh) * 2018-09-10 2019-02-15 天津大学 Skin disease recognition and detection box
CN109766471A (zh) * 2019-01-23 2019-05-17 中国科学院苏州生物医学工程技术研究所 Skin disease image retrieval method and system, storage medium, and electronic device
CN109961426B (zh) * 2019-03-11 2021-07-06 西安电子科技大学 Facial skin quality detection method
CN110380864B (zh) * 2019-07-05 2021-10-01 创新先进技术有限公司 Face data collection and verification method, device, and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4131067A4 *

Also Published As

Publication number Publication date
EP4131067A4 (en) 2023-07-26
KR20220157485A (ko) 2022-11-29
JP2023518548A (ja) 2023-05-02
US20230014409A1 (en) 2023-01-19
JP7467667B2 (ja) 2024-04-15
EP4131067A1 (en) 2023-02-08
CN111401463A (zh) 2020-07-10
CN111401463B (zh) 2024-04-30

Similar Documents

Publication Publication Date Title
WO2021104195A1 (zh) Image display method and electronic device
WO2021190387A1 (zh) Detection result output method, electronic device, and medium
US20220279116A1 Object tracking method and electronic device
CN109461117B (zh) Image processing method and mobile terminal
WO2020216129A1 (zh) Parameter acquisition method and terminal device
CN109240577B (zh) Screenshot method and terminal
CN107977652B (zh) Method for extracting screen display content, and mobile terminal
WO2019206077A1 (zh) Video call processing method and mobile terminal
WO2020182035A1 (zh) Image processing method and terminal device
WO2021233176A1 (zh) Ambient light detection method and electronic device
CN107730460B (zh) Image processing method and mobile terminal
CN109819166B (zh) Image processing method and electronic device
CN110990172A (zh) Application sharing method, first electronic device, and computer-readable storage medium
WO2021082772A1 (zh) Screenshot method and electronic device
WO2021083090A1 (zh) Message sending method and mobile terminal
JP2023502414A (ja) Object display method and electronic device
CN109618055B (zh) Location sharing method and mobile terminal
CN108366171B (zh) Temperature rise control method and mobile terminal
WO2021190370A1 (zh) Screenshot method and electronic device
CN108536513B (zh) Picture display orientation adjustment method and mobile terminal
WO2021185142A1 (zh) Image processing method, electronic device, and storage medium
WO2021164730A1 (zh) Fingerprint image processing method and electronic device
WO2021104265A1 (zh) Electronic device and focusing method
CN110740265B (zh) Image processing method and terminal device
CN109189735B (zh) Preview image display method and mobile terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21776269

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022557197

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20227036902

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021776269

Country of ref document: EP

Effective date: 20221025