CN111445477B - Analysis method, device and server based on automatic segmentation and selection of regions - Google Patents


Info

Publication number
CN111445477B
Authority
CN
China
Prior art keywords
value
signal
area
illuminance
noise ratio
Prior art date
Legal status
Active
Application number
CN202010129351.8A
Other languages
Chinese (zh)
Other versions
CN111445477A
Inventor
罗静静
甄俊杰
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to CN202010129351.8A
Publication of CN111445477A
Application granted
Publication of CN111445477B


Classifications

    • G06T 7/11 — Physics; Computing; Image data processing or generation; Image analysis; Segmentation, edge detection; Region-based segmentation
    • G06T 7/0012 — Image analysis; Inspection of images, e.g. flaw detection; Biomedical image inspection
    • G06T 7/66 — Image analysis; Analysis of geometric attributes; Analysis of image moments or centre of gravity
    • G06V 40/168 — Image or video recognition or understanding; Recognition of biometric, human-related or animal-related patterns; Human faces; Feature extraction, face representation
    • G06T 2207/30201 — Indexing scheme for image analysis or image enhancement; Subject of image; Human being, person; Face
    • Y02B 20/40 — Climate change mitigation technologies related to buildings; Energy-efficient lighting technologies; Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Geometry (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of the application provide an analysis method, device and server based on automatic segmentation and selection of regions. Building on existing face segmentation algorithms for face recognition and feature point calibration, a set of non-overlapping custom regions is used to obtain morphologically symmetric facial regions and a reference region as target regions. Because the algorithm is not influenced by subjective factors, and the symmetric regions are selected from face feature points trained on actual facial morphology, the method is insensitive to the diversity of subjects, and the regions actually selected for each subject correspond one-to-one in morphology. This avoids the technical problems that manually selected fixed facial regions are difficult to position accurately owing to subjective factors, and that no definite relative position information is established between regions, particularly between symmetric regions, when inter-region analysis is performed.

Description

Analysis method, device and server based on automatic segmentation and selection of regions
Technical Field
The application relates to the technical field of face detection big data, in particular to an analysis method, an analysis device and a server based on automatic segmentation and selection of areas.
Background
Imaging photoplethysmography (iPPG) is a non-contact physiological parameter measurement method that detects frame-to-frame changes in the intensity of light reflected from a person's facial region in a video (the iPPG signal) and calculates physiological parameters such as heart rate, respiratory rate and blood oxygen saturation from the iPPG signal.
In order to analyze the iPPG signals of specific facial regions, existing iPPG techniques generally select the region of interest (ROI) manually. The drawbacks of manual region selection are as follows: when monitoring the physiological parameters of a subject, usually only an arbitrarily chosen facial area is analyzed, the signal quality of that area is not considered, and signal reliability is therefore hard to guarantee; when analyzing the commonality (regional differences) of physiological signals from specific regions across subjects, the positions of the selected regions depend on the experimenters' experience, are easily affected by subjective factors, and are difficult to position accurately; and when analysis is performed between regions, particularly between symmetric regions, no definite relative position information is established between the selected regions.
Disclosure of Invention
In order to overcome at least the above-mentioned shortcomings in the prior art, one of the purposes of the present application is to provide an analysis method, device and server based on automatic segmentation and selection of regions. Building on existing face segmentation algorithms for face recognition and feature point calibration, a set of non-overlapping custom regions is used to obtain morphologically symmetric facial regions and a reference region as target regions. Because the algorithm is not influenced by subjective factors, and the symmetric regions are selected from face feature points trained on actual facial morphology, the method is insensitive to the diversity of subjects, and the regions actually selected for each subject correspond one-to-one in morphology. This avoids the technical problems that manually selected fixed facial regions are difficult to position accurately owing to subjective factors, and that no definite relative position information is established between regions, particularly between symmetric regions, when inter-region analysis is performed.
In a first aspect, the present application provides an analysis method based on automatic segmentation and selection of regions, the method comprising:
acquiring a face image to be processed, and acquiring n key feature points on the face image and position information of each key feature point based on a preset algorithm;
dividing f preselected areas on the face image according to the position information of the n key feature points, wherein the contour line of each preselected area is formed by connecting at least three key feature points; the face image comprises a plurality of facial organs and a remaining facial area (the area of the face other than the facial organs), and the f preselected areas completely cover the remaining facial area;
calculating the position information of the centroid of each preselected area, and acquiring f positive selection areas centered on the centroid of each preselected area according to the position information of the centroid;
extracting an iPPG signal of each positive selection area, and selecting the positive selection areas whose iPPG signals meet a preset condition as target areas;
analyzing the target area and outputting an analysis result.
In one possible design of the first aspect, before the step of calculating the position information of the centroid of each preselected area, the method further includes a step of setting a reference illuminance under the current light source device, and this step includes:
selecting any area on the face as an illuminance calibration area, and acquiring the data of the illuminance calibration area in each frame under the current illuminance;
calculating the original signal data of the illuminance calibration area under the current illuminance, calculating iPPG alternating-current (AC) signal data from the original signal data, filtering the iPPG AC signal, calculating the effective value AC of the filtered iPPG AC signal for each period, calculating the average of these effective values AC, and simultaneously calculating the signal-to-noise ratio SNR of the illuminance calibration area under the current illuminance;
controlling the illuminance to rise, and acquiring the average effective value AC of the iPPG AC signal and the signal-to-noise ratio SNR of the illuminance calibration area under each illuminance, the average effective value AC and the SNR initially rising together as the illuminance rises; when it is detected that the average effective value AC has risen to a first value and thereafter rises by less than a first set amplitude, and that the SNR has risen to a second value and then falls by more than a second set amplitude, the first value and the second value are obtained, and any illuminance within the illuminance interval corresponding to the first value and the second value is selected as the reference illuminance under the current light source device.
In this design, a rectangular area in the middle of the forehead is selected as the illuminance calibration area.
In one possible design of the first aspect, the step of calculating the position information of the centroid of each of the pre-selected areas, and obtaining f positive selection areas centered on the centroid of each of the pre-selected areas according to the position information of the centroid includes:
based on the f preselected areas, setting e prefetch areas of different sizes, each centered on the centroid of its preselected area, wherein the sizes of the e prefetch areas increase progressively;
calculating the signal-to-noise ratio of each of the e prefetch areas under each of the f preselected areas; grouping the f signal-to-noise ratios of prefetch areas of the same size and calculating the average signal-to-noise ratio of each group, to obtain e average signal-to-noise ratios;
when it is detected that the average signal-to-noise ratio has risen to a third value and thereafter rises by less than a third set amplitude, taking the size of the prefetch area corresponding to that average signal-to-noise ratio as the size of the positive selection areas.
In one possible design of the first aspect, the step of extracting the iPPG signal of each of the positively selected regions and selecting the positively selected region in which the iPPG signal meets the preset condition as the target region includes:
calculating the standard deviation value and the signal-to-noise ratio value of the iPPG signal of each positive selection area;
calculating a quality evaluation value Q from the standard deviation value and the signal-to-noise ratio value;
and sorting the quality evaluation values Q of all positive selection areas in descending order, the preset condition being that the quality evaluation value Q ranks within the first w positions.
In a second aspect, the present application further provides an analysis device based on automatic segmentation and selection of regions, the device comprising:
the acquisition fixed point module is used for acquiring a face image to be processed, and acquiring n key feature points on the face image and position information of each key feature point based on a preset algorithm;
the combined preselection module is used for dividing f preselected areas on the face image according to the position information of the n key feature points, the contour line of each preselected area being formed by connecting at least three key feature points; the face image comprises a plurality of facial organs and a remaining facial area, and the f preselected areas completely cover the remaining facial area;
the secondary region selection module is used for calculating the position information of the centroid of each preselected area, and acquiring f positive selection areas centered on the centroid of each preselected area according to the position information of the centroid;
the tertiary region selection module is used for extracting the iPPG signal of each positive selection area, and selecting the positive selection areas whose iPPG signals meet a preset condition as target areas;
and the area analysis module is used for analyzing the target areas and outputting analysis results.
In one possible design of the second aspect, before the secondary area selection module, the apparatus further includes an illuminance setting module for setting a reference illuminance under the current light source device; the illuminance setting module includes:
the calibration area selection module is used for selecting any area on the face as an illuminance calibration area, and acquiring the data of the illuminance calibration area in each frame under the current illuminance;
the data calculation module is used for calculating the original signal data of the illuminance calibration area under the current illuminance, calculating iPPG AC signal data from the original signal data, filtering the iPPG AC signal, calculating the effective value AC of the filtered iPPG AC signal for each period, calculating the average of these effective values AC, and calculating the signal-to-noise ratio SNR of the illuminance calibration area under the current illuminance;
the data processing module is used for controlling the illuminance to rise, and acquiring the average effective value AC of the iPPG AC signal and the signal-to-noise ratio SNR of the illuminance calibration area under each illuminance, the average effective value AC and the SNR initially rising together as the illuminance rises; when it is detected that the average effective value AC has risen to a first value and thereafter rises by less than a first set amplitude, and that the SNR has risen to a second value and then falls by more than a second set amplitude, the first value and the second value are obtained, and any illuminance within the illuminance interval corresponding to the first value and the second value is selected as the reference illuminance under the current light source device.
In one possible design of the second aspect, the secondary selection module includes:
the prefetch setting module is used for setting, based on the f preselected areas, e prefetch areas of different sizes, each centered on the centroid of its preselected area, wherein the sizes of the e prefetch areas increase progressively;
the prefetch calculation module is used for calculating the signal-to-noise ratio of each of the e prefetch areas under each of the f preselected areas; grouping the f signal-to-noise ratios of prefetch areas of the same size and calculating the average signal-to-noise ratio of each group, to obtain e average signal-to-noise ratios;
and the positive selection setting module is used for, when it is detected that the average signal-to-noise ratio has risen to a third value and thereafter rises by less than a third set amplitude, taking the size of the prefetch area corresponding to that average signal-to-noise ratio as the size of the positive selection areas.
In one possible design of the second aspect, the tertiary selection module includes:
the numerical calculation module is used for calculating the standard deviation value and the signal-to-noise ratio value of the iPPG signal of each positive selection area;
the evaluation calculation module is used for calculating a quality evaluation value Q from the standard deviation value and the signal-to-noise ratio value;
the target setting module is used for sorting the quality evaluation values Q of all positive selection areas in descending order, the preset condition being that the quality evaluation value Q ranks within the first w positions.
In a third aspect, an embodiment of the present application provides an analysis server based on automatic segmentation and selection of regions, the analysis server including a machine-readable storage medium storing machine-executable instructions and a processor, wherein the processor, when executing the machine-executable instructions, implements the method of the first aspect or of any possible design of the first aspect.
In a fourth aspect, embodiments of the present application provide a machine-readable storage medium storing machine-executable instructions that, when run on a computer, cause the computer to perform the method of the first aspect or any of the possible designs of the first aspect.
Based on any one of the above aspects, building on existing face segmentation algorithms for face recognition and feature point calibration, a set of non-overlapping custom regions is used to obtain morphologically symmetric facial regions and a reference region as target regions. Because the algorithm is not influenced by subjective factors, and the symmetric regions are selected from face feature points trained on actual facial morphology, the method is insensitive to the diversity of subjects, and the regions actually selected for each subject correspond one-to-one in morphology. This avoids the technical problems that manually selected fixed facial regions are difficult to position accurately owing to subjective factors, and that no definite relative position information is established between regions, particularly between symmetric regions, when inter-region analysis is performed.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered limiting the scope, and that other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of an analysis method based on automatic segmentation and selection of regions according to an embodiment of the present application.
Fig. 2 is a schematic diagram of one combination of key feature points.
Fig. 3 is a schematic illustration of the division of triangular pre-selected areas according to the combination of fig. 2.
Fig. 4 is a schematic illustration of the preselected areas fully covering the remaining facial area.
Fig. 5 is a schematic diagram of the centroid of the preselected area.
Fig. 6 is a schematic diagram of the numerical labeling of the centroid position information of the preselected areas.
Fig. 7 is a schematic flow chart of step S100 in one possible embodiment shown in fig. 1.
Fig. 8 is a flow chart illustrating the sub-steps involved in step S100 in one possible embodiment shown in fig. 7.
Fig. 9 is a line chart of the AC average values before and after filtering and of the SNR versus illuminance.
Fig. 10 is a flow chart illustrating the sub-steps involved in step S130 in one possible embodiment shown in fig. 1.
Fig. 11 is a line chart of the average SNR values versus the size of the prefetch region.
Fig. 12 is a flow chart illustrating the sub-steps involved in step S140 in one possible embodiment shown in fig. 1.
Fig. 13 is a schematic diagram of the location information and numerical labels of the symmetric subregions.
Fig. 14 is a schematic block diagram of an analysis device based on automatic segmentation and selection of regions according to an embodiment of the present application.
Fig. 15 is a schematic block diagram of an analysis device based on automatic segmentation and selection of regions that further includes an illuminance setting module, in one possible embodiment shown in fig. 14.
Fig. 16 is a schematic block diagram of the illuminance setting module.
Fig. 17 is a schematic diagram of a secondary selection module.
Fig. 18 is a schematic block diagram of the tertiary selection module.
Detailed Description
The following description is provided in connection with the accompanying drawings, and the specific operation method in the method embodiment may also be applied to the device embodiment or the system embodiment. In the description of the present application, unless otherwise indicated, "at least one" includes one or more. "plurality" means two or more. For example, at least one of A, B and C, includes: a alone, B alone, a and B together, a and C together, B and C together, and A, B and C together. In the present application, "/" means or, for example, A/B may represent A or B; "and/or" herein is merely an association relationship describing an association object, and means that three relationships may exist, for example, a and/or B may mean: a exists alone, A and B exist together, and B exists alone.
Referring to fig. 1, fig. 1 is a flow chart of an analysis method based on automatic segmentation and selection of regions according to a preferred embodiment of the present application. The analysis method based on the automatic segmentation and selection of the region is described in detail below.
Step S110, a face image to be processed is obtained, and n key feature points on the face image and position information of each key feature point are obtained based on a preset algorithm.
In this embodiment, a green light source and an RGB camera are used for video capture, and each frame has a resolution of 2448×2048 (the actual resolution may differ for different cameras). Based on the Dlib algorithm library, the face region in the first frame image is detected with the face region detection algorithm of the Dlib library (get_frontal_face_detector), and the face image to be processed is acquired. The face contour detection algorithm is then used to obtain 81 key feature points in the face image (i.e., n is 81 in this embodiment). In this embodiment the face is relatively stationary with respect to the camera, so the position information of the 81 key feature points obtained in the first frame is used as markers throughout the video sequence.
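For illustration only (this code is not part of the patent text), a minimal Python sketch of this step using the Dlib library might look as follows; the video path and the name of the 81-point landmark model file are assumptions.

```python
import cv2
import dlib
import numpy as np

# Assumed inputs: any video file and any Dlib-compatible 81-point landmark model.
VIDEO_PATH = "subject_video.avi"
LANDMARK_MODEL = "shape_predictor_81_face_landmarks.dat"

detector = dlib.get_frontal_face_detector()        # face region detection
predictor = dlib.shape_predictor(LANDMARK_MODEL)   # key feature point calibration

cap = cv2.VideoCapture(VIDEO_PATH)
ok, first_frame = cap.read()                       # landmarks are taken from the first frame only
if not ok:
    raise RuntimeError("could not read the first frame")

gray = cv2.cvtColor(first_frame, cv2.COLOR_BGR2GRAY)
faces = detector(gray, 1)                          # upsample once to find smaller faces
shape = predictor(gray, faces[0])                  # assumes at least one face was found

# n = 81 key feature points, kept fixed as markers for the whole video sequence
key_points = np.array([(p.x, p.y) for p in shape.parts()])   # shape (81, 2)
```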
Step S120, f preselected areas are divided on the face image according to the position information of the n key feature points, and the contour line of each preselected area is formed by connecting at least three key feature points. The face image comprises a plurality of facial organs and a remaining facial area, and the f preselected areas completely cover the remaining facial area.
In this embodiment, 98 preselected areas are divided on the face image according to the position information of the 81 key feature points (the position information of the key feature points is marked with numerical labels). The contour line of each preselected area is formed by connecting three key feature points, so the preselected areas are triangular; one combination of key feature points is shown in fig. 2, and the triangular preselected areas formed with the combination of fig. 2 are shown in fig. 3. The 98 preselected areas completely cover the facial area other than the eyes, nostrils and lips (the facial organs), as shown in fig. 4; the white portions in fig. 4 are the areas covered by the preselected areas. Because the serial numbers of the key feature points correspond to fixed facial positions, the same region-selection effect is obtained for different subjects.
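A sketch of how the triangular preselected areas could be assembled from the landmark coordinates is shown below; the actual table of 98 landmark-index triplets is defined by fig. 2 of the patent and is not reproduced here, so the triplets listed are hypothetical placeholders.

```python
import numpy as np

# Hypothetical placeholders: the real 98 triplets must be taken from fig. 2.
TRIANGLE_TRIPLETS = [
    (19, 24, 27),   # example triplet (placeholder indices)
    (1, 31, 36),    # example triplet (placeholder indices)
    # ... 96 more triplets in the actual embodiment
]

def preselected_areas(key_points, triplets=TRIANGLE_TRIPLETS):
    """Each preselected area is the triangle spanned by three key feature points."""
    return [key_points[list(t)] for t in triplets]   # list of (3, 2) vertex arrays
```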
Step S130, calculating the position information of the mass center of each preselected area, and acquiring f positive selection areas centering on the mass center of each preselected area according to the position information of the mass center.
In this embodiment, the centroid is the center of gravity of the triangular preselected area, i.e., the intersection point of the three medians of the triangle; a schematic diagram of the centroid position is shown in fig. 5. The position information of the centroid of each preselected area is calculated and marked with numerical labels, as shown in fig. 6, which is a schematic diagram of the labeled centroid position information of the preselected areas; 98 positive selection areas, each centered on the centroid of a preselected area, are then obtained according to the centroid position information.
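The centroid of a triangle is simply the mean of its three vertices, so this step reduces to the sketch below; the square positive-selection area uses the 80-pixel side length chosen later in this embodiment, and the helper names are illustrative.

```python
import numpy as np

def triangle_centroid(vertices):
    """Centroid (intersection of the medians) of a (3, 2) array of triangle vertices."""
    return vertices.mean(axis=0)

def positive_selection_area(frame, centroid, side=80):
    """Crop the side x side square centred on the centroid of a preselected area."""
    cx, cy = np.round(centroid).astype(int)
    half = side // 2
    return frame[cy - half:cy + half, cx - half:cx + half]
```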
Step S140, extracting the iPPG signal of each positive selection area, and selecting the positive selection areas whose iPPG signals meet a preset condition as target areas.
And step S150, analyzing the target area and outputting an analysis result.
Based on the above steps, the existing face segmentation algorithm based on face recognition and feature point calibration is used with a set of non-overlapping custom triangular regions to obtain 49 pairs of morphologically symmetric facial regions and a reference region, 98 facial regions in total. Because the algorithm is not influenced by subjective factors, and the symmetric regions are selected from face feature points trained on actual facial morphology, the method is insensitive to the diversity of subjects, and the regions actually selected for each subject correspond one-to-one in morphology. This avoids the technical problems that manually selected fixed facial regions are difficult to position accurately owing to subjective factors, and that no definite relative position information is established between regions, particularly between symmetric regions, when inter-region analysis is performed.
In one possible design, referring to fig. 7, before step S130, the method further includes step S100: setting a reference illuminance under the current light source device. Referring to fig. 8 in combination, for step S100, the following sub-steps may be specifically included:
Step S101, any area on the face is selected as the illuminance calibration area, and the data of the illuminance calibration area are acquired from each frame of the image under the current illuminance.
In this embodiment, a rectangular area in the middle of the forehead is selected as the illuminance calibration area. The size of the rectangular area can be customized, for example (x1, y1) = (1200, 400) and (x2, y2) = (1300, 500). A rectangular area with the same position and size is cropped from each frame; cropping N frames yields N illuminance calibration areas, and the data of the illuminance calibration area of each frame are acquired. The illuminance calibration area is used only for illuminance calibration, so a single area is selected in the middle of the forehead (studies show that the forehead signal is relatively good) instead of all 98 areas, and the actual area size can be adjusted for different camera resolutions.
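A sketch of cropping the same forehead rectangle from every frame is given below; the coordinates match the example in the text, while the video-reading code is an illustrative assumption.

```python
import cv2

# Example coordinates from the embodiment: (x1, y1) = (1200, 400), (x2, y2) = (1300, 500).
X1, Y1, X2, Y2 = 1200, 400, 1300, 500

def calibration_regions(video_path):
    """Yield the illuminance calibration rectangle of every frame (same position and size)."""
    cap = cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        yield frame[Y1:Y2, X1:X2]
    cap.release()
```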
Step S102, the original signal data of the illuminance calibration area under the current illuminance are calculated, the iPPG AC signal data are calculated from the original signal data, the iPPG AC signal is filtered, the effective value AC of the filtered iPPG AC signal is calculated for each period, the average of these effective values AC is calculated, and at the same time the signal-to-noise ratio SNR of the illuminance calibration area under the current illuminance is calculated.
In this embodiment, the original signal data of the illuminance calibration area under the current illuminance are calculated with formula (1) below, where W is the area width (along the x-axis), L is the area length (along the y-axis), and g(l, w) is the green-channel intensity value (0-255) of the pixel at (l, w); formula (1) is thus the average of the green-channel intensity values over all pixels of the selected area:
ori_mean = (1 / (L × W)) × Σ_{l=1..L} Σ_{w=1..W} g(l, w)    (1)
The iPPG AC signal data are calculated from the original signal data with formula (2) below, where ⊛ is the convolution operator, ori_means is the sequence of ori_mean values in frame order obtained from formula (1), L1 is the length of ori_means, W1 is the sliding-window size (set equal to the sampling frequency, W1 = fps = 40, in this embodiment), one_{W1} is an all-ones sequence of length W1, and one_{L1} is an all-ones sequence of length L1:
AC = ori_means / ((ori_means ⊛ one_{W1}) / W1) − one_{L1}    (2)
that is, the original signal is normalized by its W1-point moving average and the unit (DC) component is subtracted.
The iPPG AC signal is then filtered and the effective value of the AC signal is calculated for each period as follows: the signal before filtering is denoted x(i) and the signal after filtering f(i); this embodiment uses a 3rd-order Butterworth band-pass filter with a pass band of 0.5-4 Hz. The effective (root-mean-square) value AC of each period is calculated, with formula (3) below, for both the unfiltered AC sequence x(i) obtained from formula (2) and the filtered sequence f(i), where AC(i) is the AC signal strength of the i-th frame and Np is the number of sampling points in one period:
AC = sqrt((1 / Np) × Σ_{i=1..Np} AC(i)^2)    (3)
The AC average value is then calculated from the per-period effective values AC.
The signal-to-noise ratio of the illuminance calibration area under the current illuminance is then calculated: the SNR is obtained with formula (4), where f(i) is the filtered signal, x(i) is the signal before filtering, and N is the total number of frames of the video.
By executing the above steps, one AC average value and one signal-to-noise ratio SNR are obtained under one illuminance.
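The per-illuminance computation described in sub-step S102 can be sketched as follows. Since formulas (1)–(4) are published as images, the exact forms of the AC-extraction formula (2) and of the SNR formula (4) used here are assumptions reconstructed from the surrounding text (moving-average normalization and filtered-signal power over residual power, respectively); the mean-intensity formula (1), the 3rd-order 0.5–4 Hz Butterworth band-pass filter and the per-period RMS of formula (3) follow the description directly.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FPS = 40   # sampling frequency; the sliding-window size W1 is set equal to it

def ori_mean(region_bgr):
    """Formula (1): average green-channel intensity of a cropped region (BGR image)."""
    return region_bgr[:, :, 1].mean()

def ac_signal(ori_means, w1=FPS):
    """Assumed form of formula (2): normalise by a W1-point moving average, subtract 1."""
    ori_means = np.asarray(ori_means, dtype=float)
    moving_avg = np.convolve(ori_means, np.ones(w1) / w1, mode="same")
    return ori_means / moving_avg - 1.0

def bandpass(signal, fs=FPS, low=0.5, high=4.0, order=3):
    """3rd-order Butterworth band-pass filter with a 0.5-4 Hz pass band."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def period_rms(ac, n_p=FPS):
    """Formula (3): effective (RMS) value of the AC signal for each period of Np samples."""
    n_periods = len(ac) // n_p
    return [np.sqrt(np.mean(ac[k * n_p:(k + 1) * n_p] ** 2)) for k in range(n_periods)]

def snr_db(x, f):
    """Assumed form of formula (4): filtered-signal power over residual power, in dB."""
    return 10 * np.log10(np.sum(f ** 2) / np.sum((x - f) ** 2))

def ac_mean_and_snr(calibration_regions):
    """One AC average value and one SNR for one illuminance level."""
    raw = [ori_mean(r) for r in calibration_regions]   # original signal, one value per frame
    x = ac_signal(raw)                                 # unfiltered AC signal
    f = bandpass(x)                                    # filtered AC signal
    return np.mean(period_rms(f)), snr_db(x, f)
```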
Step S103, the illuminance is controlled to rise, and the average effective value AC of the iPPG AC signal and the signal-to-noise ratio SNR of the illuminance calibration area are acquired under each illuminance; the average effective value AC and the SNR initially rise together as the illuminance rises. When it is detected that the average effective value AC has risen to a first value and thereafter rises by less than a first set amplitude, and that the SNR has risen to a second value and then falls by more than a second set amplitude, the first value and the second value are obtained, and any illuminance within the illuminance interval corresponding to the first value and the second value is selected as the reference illuminance under the current light source device.
In this embodiment, the illuminance is set in turn to 17.2 lx, 69.3 lx, 141.1 lx, 220.7 lx, 290.2 lx, 483.4 lx, 562.1 lx, 662.2 lx, 762.1 lx, 818.6 lx, 865.8 lx, 880.8 lx, 920.9 lx and 1030 lx to control the illuminance rise. Sub-step S102 is performed under each illuminance, yielding one AC average value and one SNR per illuminance, so that multiple groups of AC average values and SNRs corresponding to the illuminance levels are obtained. These data are plotted as line charts in fig. 9, which shows the AC average values before and after filtering and the SNR versus illuminance.
Experiments show that as the illuminance increases, both the AC average value and the SNR first increase; beyond a certain point the AC average value gradually flattens and the SNR drops sharply. The illuminance at which both the AC average value and the SNR are high is the optimal reference illuminance. In the data of this example, the trend of the line charts shows that at 850 lx both the AC average value and the SNR are high, so 850 lx is selected as the reference illuminance.
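A sketch of the selection rule in step S103 follows: find the illuminance at which the AC average value stops rising appreciably (the first value) and the illuminance at which the SNR starts to fall appreciably (the second value), then pick any illuminance in between. The two relative thresholds stand in for the "first/second set amplitudes" and are illustrative assumptions; the midpoint of the interval is returned here, although per the text any illuminance within the interval is acceptable.

```python
import numpy as np

def reference_illuminance(lux, ac_means, snrs, ac_plateau=0.05, snr_drop=0.05):
    """Pick a reference illuminance from per-illuminance AC averages and SNRs (sketch)."""
    lux, ac_means, snrs = map(np.asarray, (lux, ac_means, snrs))

    # "First value": after this index the AC average rises by less than ac_plateau.
    first = next((i for i in range(len(lux) - 1)
                  if (ac_means[i + 1] - ac_means[i]) / ac_means[i] < ac_plateau),
                 len(lux) - 1)

    # "Second value": at this index the SNR begins to fall by more than snr_drop.
    second = next((i for i in range(len(lux) - 1)
                   if (snrs[i] - snrs[i + 1]) / abs(snrs[i]) > snr_drop),
                  len(lux) - 1)

    # Any illuminance in the interval between the two values may serve as reference.
    low, high = sorted((lux[first], lux[second]))
    return (low + high) / 2
```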
Based on the above steps, the embodiment further considers the influence of illuminance on signal quality, and before calculating the position information of the centroid of the preselected area, calculates the optimal illuminance capable of effectively optimizing the signal quality under the current light source device, and sets the optimal illuminance as the reference illuminance. The application provides a method for selecting illumination intensity by taking signal intensity and signal-to-noise ratio into consideration simultaneously, which is not disclosed in the prior art.
In one possible design, referring to fig. 10, step S130: calculating the position information of the mass center of each preselected area, and acquiring f positive selection areas centering on the mass center of each preselected area according to the position information of the mass center, wherein the method comprises the following steps:
Sub-step S131: the centroid position information of each preselected area is calculated (its numerical labeling is shown in fig. 6). Based on the f preselected areas, e prefetch areas of different sizes are set, each centered on the centroid of its preselected area, with the sizes of the e prefetch areas increasing progressively.
In this embodiment, based on the 98 preselected areas, 9 prefetch areas of different sizes are set, centered on the centroid of each preselected area. The prefetch areas are square, with side lengths of 10, 14, 20, 28, 40, 56, 80, 98 and 113 pixels respectively, so the sizes increase progressively.
Sub-step S132: the signal-to-noise ratio of each of the e prefetch areas under each of the f preselected areas is calculated; the f signal-to-noise ratios of prefetch areas of the same size form one group, and the average signal-to-noise ratio of each group is calculated, giving e average signal-to-noise ratios.
In this embodiment, 9 prefetch areas are set in each preselected area, and the signal-to-noise ratio of every prefetch area of the 98 preselected areas is calculated, giving 882 signal-to-noise ratio values (98 × 9). The signal-to-noise ratios of the 98 prefetch areas of the same size form one group (for example, the signal-to-noise ratios of the 98 prefetch areas with side length 10 form one group), and the average signal-to-noise ratio of each group is calculated. In this way, 9 average signal-to-noise ratio values are obtained in total. The 9 average signal-to-noise ratio values are plotted against the prefetch-area size as a line chart; the result is shown in fig. 11, which is a line chart of the average SNR values versus the size of the prefetch region.
Sub-step S133: when it is detected that the average signal-to-noise ratio has risen to a third value and thereafter rises by less than a third set amplitude, the size of the prefetch area corresponding to that average signal-to-noise ratio is taken as the size of the positive selection areas.
As can be seen from fig. 11, once the side length of the prefetch area reaches 80 the average signal-to-noise ratio rises only gently, so this embodiment sets the positive selection area to be a square of 80 × 80 pixels.
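Sub-steps S131–S133 can be sketched as follows: for each candidate side length, compute the SNR of the square prefetch area around every centroid, average over the 98 areas, and keep the side length after which the average SNR rises by less than a set amplitude. The `snr_of_region` callback and the plateau threshold are assumptions (the SNR itself could be computed with the pipeline sketched above).

```python
import numpy as np

SIDE_LENGTHS = [10, 14, 20, 28, 40, 56, 80, 98, 113]   # the e = 9 prefetch sizes

def choose_positive_area_size(frames, centroids, snr_of_region,
                              sides=SIDE_LENGTHS, plateau=0.05):
    """Return the side length at which the average-SNR curve flattens (sketch)."""
    avg_snr = []
    for side in sides:                                   # one group per prefetch size
        group = [snr_of_region(frames, c, side) for c in centroids]   # f values per group
        avg_snr.append(np.mean(group))                   # e average SNR values in total

    for i in range(1, len(sides)):
        rise = (avg_snr[i] - avg_snr[i - 1]) / abs(avg_snr[i - 1])
        if rise < plateau:                               # rise amplitude below the set amplitude
            return sides[i - 1]
    return sides[-1]
```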
Based on the above steps, this embodiment sets a preferred positive-selection-area size by further considering the influence of the region size on signal quality and on the computational load, and by avoiding an excessively large region that would cover small facial regions and mask the signal characteristics that may exist there. The method proposed in the present application for selecting the cropped-region size is not disclosed in the prior art.
In one possible design, referring to fig. 12, step S140, extracting the iPPG signal of each positive selection area and selecting the positive selection areas whose iPPG signals meet the preset condition as target areas, includes the following steps:
sub-step S141: the standard deviation value and the signal-to-noise value of the iPPG signal for each positively selected region are calculated.
Sub-step S142: and calculating according to the standard deviation value and the signal to noise ratio value to obtain a quality evaluation value Q.
Sub-step S143: the quality evaluation values Q of all positive selection areas are sorted in descending order, and the preset condition is that the quality evaluation value Q ranks within the first w positions.
In this embodiment, the quality evaluation value Q is calculated with formula (5) below, where STD is the standard deviation value and SNR is the signal-to-noise ratio value:
Q = mean(SNR, −STD) = (SNR − STD) / 2    (5)
After the quality evaluation values Q are sorted, the positive selection areas ranked in the first 50 positions are selected as target areas, and the iPPG signals in these areas are extracted for the next analysis. In addition, among the target areas ranked in the first 50 positions, 38 symmetric sub-areas are selected at positions where symmetric counterparts exist on the left and right sides of the face, and the iPPG signals in the symmetric sub-areas are extracted for further analysis. The position information and numerical labels of the symmetric sub-areas are shown in fig. 13.
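A sketch of sub-steps S141–S143 follows, taking formula (5) literally, i.e. Q is the mean of the SNR and the negated standard deviation; whether the two quantities are normalized beforehand is not stated in the text, so no normalization is applied here. Keeping w = 50 matches the embodiment.

```python
import numpy as np

def quality_value(ippg_signal, snr):
    """Formula (5): Q = mean(SNR, -STD) = (SNR - STD) / 2."""
    return (snr - np.std(ippg_signal)) / 2

def select_target_areas(signals, snrs, w=50):
    """Rank positive selection areas by Q in descending order and keep the first w."""
    q = np.array([quality_value(s, r) for s, r in zip(signals, snrs)])
    order = np.argsort(q)[::-1]          # indices sorted by descending Q
    return order[:w], q[order[:w]]
```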
Based on the above steps, the application provides a signal quality evaluation index, the Q value, and selects facial areas that meet the experimental requirements as target areas according to the Q-value ranking, which is not disclosed in the prior art.
Fig. 14 is a schematic diagram of the functional modules of an analysis device based on automatic segmentation and selection of regions according to an embodiment of the present application; the functional modules of the device may be divided in accordance with the method embodiment above. For example, each functional module may correspond to one function, or two or more functions may be integrated in one processing module. The integrated modules may be implemented in hardware or as software functional modules. It should be noted that the division of the modules in this application is illustrative and is merely a division by logical function; other divisions may be used in practice. For the case in which each functional module corresponds to one function, the analysis device shown in fig. 14 is only one possible schematic of the device. The analysis device based on automatic segmentation and selection of regions may include an acquisition fixed point module 210, a combined preselection module 220, a secondary selection module 230, a tertiary selection module 240 and an area analysis module 250; the functions of each functional module are described in detail below.
The acquisition fixed point module 210 is configured to acquire a face image to be processed, and acquire n key feature points on the face image and position information of each key feature point based on a preset algorithm.
The combined preselection module 220 is configured to divide f preselected areas on the face image according to the position information of the n key feature points, the contour line of each preselected area being formed by connecting at least three key feature points; the face image comprises a plurality of facial organs and a remaining facial area, and the f preselected areas completely cover the remaining facial area.
The secondary selection module 230 is configured to calculate position information of a centroid of each of the pre-selected areas, and obtain f positive selection areas centered on the centroid of each of the pre-selected areas according to the position information of the centroid.
The tertiary selection module 240 is configured to extract the iPPG signal of each positive selection area and to select the positive selection areas whose iPPG signals meet a preset condition as the target areas.
And the area analysis module 250 is used for analyzing the target area and outputting an analysis result.
In one possible design, referring to fig. 15, fig. 15 is a schematic diagram of the analysis device based on automatic segmentation and selection of regions in another embodiment; before the secondary selection module 230 performs data processing, the device further includes an illuminance setting module 200 for setting the reference illuminance under the current light source device. That is, the analysis device based on automatic segmentation and selection of regions may include the acquisition fixed point module 210, the combined preselection module 220, the illuminance setting module 200, the secondary selection module 230, the tertiary selection module 240 and the area analysis module 250. Referring to fig. 16, fig. 16 is a schematic structural diagram of the illuminance setting module 200; the illuminance setting module 200 may include a calibration area selection module 201, a data calculation module 202 and a data processing module 203, whose functions are described in detail below.
The calibration area selection module 201 is configured to select any area on the face as the illuminance calibration area, and to acquire the data of the illuminance calibration area in each frame under the current illuminance.
The data calculation module 202 is configured to calculate the original signal data of the illuminance calibration area under the current illuminance, calculate iPPG AC signal data from the original signal data, filter the iPPG AC signal, calculate the effective value AC of the filtered iPPG AC signal for each period, calculate the average of these effective values AC, and calculate the signal-to-noise ratio SNR of the illuminance calibration area under the current illuminance.
The data processing module 203 is configured to control the illuminance to rise and to acquire the average effective value AC of the iPPG AC signal and the signal-to-noise ratio SNR of the illuminance calibration area under each illuminance, the average effective value AC and the SNR initially rising together as the illuminance rises; when it is detected that the average effective value AC has risen to a first value and thereafter rises by less than a first set amplitude, and that the SNR has risen to a second value and then falls by more than a second set amplitude, the first value and the second value are obtained, and any illuminance within the illuminance interval corresponding to the first value and the second value is selected as the reference illuminance under the current light source device.
In one possible design, referring to fig. 17, fig. 17 is a schematic structural diagram of the secondary selection module 230, where the secondary selection module 230 may further include a prefetch setting module 231, a prefetch calculation module 232, and a positive selection setting module 233, and the functions of each functional module of the secondary selection module 230 are described in detail below.
The prefetch setting module 231 is configured to set e prefetch areas with different sizes based on f pre-selected areas, with the centroid of each pre-selected area as the center, where the sizes of the e prefetch areas tend to increase.
The prefetch calculation module 232 is configured to calculate the signal-to-noise ratio of each of the e prefetch areas under each of the f preselected areas; the f signal-to-noise ratios of prefetch areas of the same size form one group, and the average signal-to-noise ratio of each group is calculated to obtain e average signal-to-noise ratios.
The positive selection setting module 233 is configured to, when it is detected that the average signal-to-noise ratio has risen to a third value and thereafter rises by less than a third set amplitude, take the size of the prefetch area corresponding to that average signal-to-noise ratio as the size of the positive selection areas.
In one possible design, referring to fig. 18, fig. 18 is a schematic structural diagram of the tertiary selection module 240, which may further include a numerical calculation module 241, an evaluation calculation module 242 and a target setting module 243; the functions of each functional module of the tertiary selection module 240 are described in detail below.
The numerical calculation module 241 is configured to calculate a standard deviation value and a signal to noise ratio value of the iPPG signal of each positive selection region.
And the evaluation calculation module 242 is used for calculating a quality evaluation value Q according to the standard deviation value and the signal-to-noise ratio value.
The target setting module 243 is configured to sort the quality evaluation values Q of all positive selection areas in descending order, the preset condition being that the quality evaluation value Q ranks within the first w positions.
An embodiment of the present application provides an analysis server based on automatic segmentation and selection of regions, which includes a machine-readable storage medium storing machine-executable instructions and a processor; when executing the machine-executable instructions, the processor implements the method of any of the possible designs described above.
The machine-readable storage medium, as a computer-readable storage medium, stores software programs, computer-executable programs and modules.
The machine-readable storage medium may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and at least one application program required for a function, and the data storage area may store data created according to the use of the terminal, etc. Further, the machine-readable storage medium may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be Random Access Memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DR RAM). It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In the above embodiments, the methods may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), or the like.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made to the embodiments of the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the embodiments of the present application fall within the scope of the claims and the equivalents thereof, the present application is intended to encompass such modifications and variations.

Claims (9)

1. An analysis method based on automatic segmentation and selection of regions, comprising:
acquiring a face image to be processed, and acquiring n key feature points on the face image and position information of each key feature point based on a preset algorithm;
dividing f preselected areas on the face image according to the position information of the n key feature points, wherein the contour line of each preselected area is formed by connecting at least three key feature points; the face image comprises a plurality of facial organs and a remaining facial area, and the f preselected areas completely cover the remaining facial area;
calculating the position information of the centroid of each preselected area, and acquiring f positive selection areas centered on the centroid of each preselected area according to the position information of the centroid;
extracting an iPPG signal of each positive selection area, and selecting the positive selection areas whose iPPG signals meet a preset condition as target areas;
analyzing the target area and outputting an analysis result;
wherein the step of calculating the position information of the centroid of each preselected area and acquiring f positive selection areas centered on the centroid of each preselected area according to the position information of the centroid comprises:
based on the f preselected areas, setting e prefetch areas of different sizes, each centered on the centroid of its preselected area, wherein the sizes of the e prefetch areas increase progressively;
calculating the signal-to-noise ratio of each of the e prefetch areas under each of the f preselected areas; grouping the f signal-to-noise ratios of prefetch areas of the same size and calculating the average signal-to-noise ratio of each group, to obtain e average signal-to-noise ratios;
and when it is detected that the average signal-to-noise ratio has risen to a third value and thereafter rises by less than a third set amplitude, taking the size of the prefetch area corresponding to that average signal-to-noise ratio as the size of the positive selection areas.
2. The analysis method based on automatic segmentation and selection of regions according to claim 1, further comprising, before the step of calculating the position information of the centroid of each preselected area, a step of setting a reference illuminance under the current light source device, the step of setting the reference illuminance comprising:
selecting any area on a face as an illuminance calibration area, and acquiring data of the illuminance calibration area of each frame under the current illuminance;
calculating original signal data of the illuminance calibration area under the current illuminance, obtaining iPG alternating-current signal data from the original signal data, filtering the iPG alternating-current signal, calculating the effective value AC of the filtered iPG alternating-current signal for each period, calculating the average value of the effective values AC of the iPG alternating-current signal, and simultaneously calculating the signal-to-noise ratio SNR of the illuminance calibration area under the current illuminance;
controlling the illuminance to rise, and acquiring, under each illuminance, the average value of the effective value AC of the iPG alternating-current signal in the illuminance calibration area and the signal-to-noise ratio SNR, wherein the average value of the effective value AC and the signal-to-noise ratio SNR rise synchronously with the illuminance; and when it is detected that the average value of the effective value AC has risen to a first value and, after reaching the first value, its rise amplitude is lower than a first set amplitude, and that the signal-to-noise ratio SNR has risen to a second value and then begins to fall with a fall amplitude higher than a second set amplitude, obtaining the first value and the second value, and selecting any illuminance value within the illuminance interval corresponding to the first value and the second value as the reference illuminance under the current light source device.
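For illustration only: a Python sketch of the reference-illuminance calibration in claim 2. The helpers set_illuminance and measure_ac_and_snr are hypothetical stand-ins for the light-source control and for the iPG processing chain (raw signal, AC extraction, filtering, per-period effective value AC, SNR); only the stopping rule, with AC plateauing at the first value and SNR peaking at the second value and then falling, follows the claim.

```python
def calibrate_reference_illuminance(levels, first_set_amplitude, second_set_amplitude,
                                    set_illuminance, measure_ac_and_snr):
    """Sweep the light source through increasing illuminance levels.

    `set_illuminance(level)` and `measure_ac_and_snr()` are hypothetical callbacks:
    the latter returns (mean effective value AC of the filtered iPG AC signal,
    SNR of the illuminance calibration area) at the current illuminance.
    """
    ac_history, snr_history = [], []
    first_level = second_level = None
    for idx, level in enumerate(levels):              # illuminance rises monotonically
        set_illuminance(level)
        ac, snr = measure_ac_and_snr()
        ac_history.append(ac)
        snr_history.append(snr)
        if idx >= 1:
            if first_level is None and ac_history[-1] - ac_history[-2] < first_set_amplitude:
                first_level = level                   # AC has reached its "first value" plateau
            if second_level is None and snr_history[-2] - snr_history[-1] > second_set_amplitude:
                second_level = levels[idx - 1]        # SNR peaked (the "second value") one level earlier
        if first_level is not None and second_level is not None:
            low, high = sorted((first_level, second_level))
            return (low + high) / 2.0                 # any illuminance in the interval is acceptable
    return levels[-1]                                 # fallback: conditions not met within the sweep
```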
3. The analysis method based on automatic segmentation and selection of regions according to claim 2, wherein a rectangular area in the middle of the forehead on the face is selected as the illuminance calibration area.
4. The analysis method based on automatic segmentation and selection of regions according to claim 1 or 2,
wherein the step of extracting the iPG signal of each positive selection area and selecting, as the target area, the positive selection area whose iPG signal meets the preset condition comprises the following steps:
calculating the standard deviation value and the signal-to-noise ratio value of the iPG signal of each positive selection area;
calculating a quality evaluation value Q from the standard deviation value and the signal-to-noise ratio value;
and sorting the quality evaluation values Q of all the positive selection areas in descending order, wherein the preset condition is that the quality evaluation value Q is ranked in the first w positions.
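For illustration only: a Python sketch of the third selection in claim 4. The claim says only that the quality evaluation value Q is computed from the standard deviation value and the signal-to-noise ratio value of each positive selection area; the particular combination Q = SNR − std used below, and the detrending-based SNR estimate, are the editor's assumptions.

```python
import numpy as np

def select_target_areas(ipg_signals, w):
    """Rank positive selection areas by a quality evaluation value Q and keep the first w.

    ipg_signals: dict mapping area id -> 1-D iPG signal (array-like).
    """
    q_values = {}
    for area_id, signal in ipg_signals.items():
        signal = np.asarray(signal, dtype=float)
        std_value = signal.std()                                    # standard deviation value
        trend = np.convolve(signal, np.ones(5) / 5.0, mode="same")  # moving-average detrend
        noise = signal - trend
        snr_value = 10.0 * np.log10(np.sum(trend ** 2) / (np.sum(noise ** 2) + 1e-12))
        q_values[area_id] = snr_value - std_value                   # assumed combination, not the claim's formula
    ranked = sorted(q_values, key=q_values.get, reverse=True)       # descending order of Q
    return ranked[:w]                                               # areas ranked in the first w positions
```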
5. An analysis device based on automatic segmentation and selection of regions, comprising:
the acquisition fixed point module is used for acquiring a face image to be processed, and acquiring n key feature points on the face image and position information of each key feature point based on a preset algorithm;
the combined preselection module is used for dividing f preselected areas on the face image according to the position information of the n key feature points, and the contour line of each preselected area is formed by connecting at least three key feature points; the face image comprises a plurality of facial organs and a bystander area, and the f preselected areas completely cover the bystander area;
the secondary selection module is used for calculating the position information of the centroid of each preselected area and acquiring f positive selection areas, each centered on the centroid of its preselected area, according to the position information of the centroids;
the tertiary selection module is used for extracting the iPG signal of each positive selection area and selecting, as the target area, the positive selection area whose iPG signal meets the preset condition;
the area analysis module is used for analyzing the target area and outputting an analysis result;
the secondary selection module comprises:
the pre-fetch setting module is used for setting, based on the f preselected areas and taking the centroid of each preselected area as the center, e pre-fetch areas of different sizes, wherein the sizes of the e pre-fetch areas increase progressively;
the pre-fetch calculation module is used for calculating the signal-to-noise ratio values of the e pre-fetch areas under each of the f preselected areas, grouping the f signal-to-noise ratio values of the pre-fetch areas of the same size, and calculating the average signal-to-noise ratio value of each group to obtain e average signal-to-noise ratio values;
and the positive selection setting module is used for, when it is detected that an average signal-to-noise ratio value has risen to a third value and, after reaching the third value, its rise amplitude is lower than a third set amplitude, taking the size of the pre-fetch area corresponding to that average signal-to-noise ratio value as the size of the positive selection areas.
6. The analysis device based on automatic segmentation and selection of regions according to claim 5, further comprising an illuminance setting module for setting a reference illuminance under a current light source device before the secondary selection module operates; the illuminance setting module comprises:
the calibration area selection module is used for selecting any area on the face as an illuminance calibration area and acquiring data of the illuminance calibration area of each frame under the current illuminance;
the data calculation module is used for calculating original signal data of the illuminance calibration area under the current illuminance, obtaining iPG alternating-current signal data from the original signal data, filtering the iPG alternating-current signal, calculating the effective value AC of the filtered iPG alternating-current signal for each period, calculating the average value of the effective values AC of the iPG alternating-current signal, and calculating the signal-to-noise ratio SNR of the illuminance calibration area under the current illuminance;
the data processing module is used for controlling the illuminance to rise and acquiring, under each illuminance, the average value of the effective value AC of the iPG alternating-current signal in the illuminance calibration area and the signal-to-noise ratio SNR, wherein the average value of the effective value AC and the signal-to-noise ratio SNR rise synchronously with the illuminance; and when it is detected that the average value of the effective value AC has risen to a first value and, after reaching the first value, its rise amplitude is lower than a first set amplitude, and that the signal-to-noise ratio SNR has risen to a second value and then begins to fall with a fall amplitude higher than a second set amplitude, obtaining the first value and the second value, and selecting any illuminance value within the illuminance interval corresponding to the first value and the second value as the reference illuminance under the current light source device.
7. The analysis device based on automatic segmentation and selection of regions according to claim 5 or 6, wherein the tertiary selection module comprises:
the numerical calculation module is used for calculating the standard deviation value and the signal-to-noise ratio value of the iPG signal of each positive selection area;
the evaluation calculation module is used for calculating a quality evaluation value Q from the standard deviation value and the signal-to-noise ratio value; and the target setting module is used for sorting the quality evaluation values Q of all the positive selection areas in descending order, wherein the preset condition is that the quality evaluation value Q is ranked in the first w positions.
8. An analysis server based on automatic segmentation and selection of regions, wherein the analysis server comprises a machine-readable storage medium and a processor, the machine-readable storage medium stores machine-executable instructions, and the processor, when executing the machine-executable instructions, implements the analysis method based on automatic segmentation and selection of regions according to any one of claims 1 to 4.
9. A readable storage medium having stored therein machine-executable instructions which, when executed, implement the analysis method based on automatic segmentation and selection of regions according to any one of claims 1 to 4.
CN202010129351.8A 2020-02-28 2020-02-28 Analysis method, device and server based on automatic segmentation and selection of regions Active CN111445477B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010129351.8A CN111445477B (en) 2020-02-28 2020-02-28 Analysis method, device and server based on automatic segmentation and selection of regions

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010129351.8A CN111445477B (en) 2020-02-28 2020-02-28 Analysis method, device and server based on automatic segmentation and selection of regions

Publications (2)

Publication Number Publication Date
CN111445477A CN111445477A (en) 2020-07-24
CN111445477B true CN111445477B (en) 2023-07-25

Family

ID=71655752

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010129351.8A Active CN111445477B (en) 2020-02-28 2020-02-28 Analysis method, device and server based on automatic segmentation and selection of regions

Country Status (1)

Country Link
CN (1) CN111445477B (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11074495B2 (en) * 2013-02-28 2021-07-27 Z Advanced Computing, Inc. (Zac) System and method for extremely efficient image and pattern recognition and artificial intelligence platform
JP6683367B2 (en) * 2015-03-30 2020-04-22 国立大学法人東北大学 Biological information measuring device, biological information measuring method, and biological information measuring program
CN109589101B (en) * 2019-01-16 2020-08-21 四川大学 Non-contact physiological parameter acquisition method and device based on video
CN109977858B (en) * 2019-03-25 2020-12-01 北京科技大学 Heart rate detection method and device based on image analysis
CN110647815A (en) * 2019-08-25 2020-01-03 上海贝瑞电子科技有限公司 Non-contact heart rate measurement method and system based on face video image

Also Published As

Publication number Publication date
CN111445477A (en) 2020-07-24

Similar Documents

Publication Publication Date Title
CN109076198B (en) Video-based object tracking occlusion detection system, method and equipment
CN107451998B (en) Fundus image quality control method
JP4307496B2 (en) Facial part detection device and program
CN108197546B (en) Illumination processing method and device in face recognition, computer equipment and storage medium
EP2188779B1 (en) Extraction method of tongue region using graph-based approach and geometric properties
US20010036298A1 (en) Method for detecting a human face and an apparatus of the same
US20130121546A1 (en) Inspection of region of interest
CN101599175B (en) Detection method for determining alteration of shooting background and image processing device
WO2013111140A2 (en) Eye tracking
US10178322B2 (en) Method of adjusting digital camera image processing parameters
CN110264493A (en) A kind of multiple target object tracking method and device under motion state
CN109273074B (en) Network model adjusting method and equipment for medical image
AU2020103260A4 (en) Rice blast grading system and method
CN111444555B (en) Temperature measurement information display method and device and terminal equipment
CN109028237A (en) The kitchen ventilator of wind speed adjusting is carried out based on dual area Image Acquisition
CN111445477B (en) Analysis method, device and server based on automatic segmentation and selection of regions
KR101908785B1 (en) Tongue region extraction method and image processing apparatus for performing the method
CN109447948B (en) Optic disk segmentation method based on focus color retina fundus image
CN111192280A (en) Method for detecting optic disc edge based on local feature
Reza et al. Automatic detection of optic disc in fundus images by curve operator
CN112258660B (en) Processing method and system of virtual dressing image
CN111275045B (en) Image main body recognition method and device, electronic equipment and medium
KR102318194B1 (en) Device for predicting optic neuropathy and method for providing prediction result to optic neuropathy using fundus image
US10977482B2 (en) Object attribution analyzing method and related object attribution analyzing device
KR20160104826A (en) Method and apparatus for detecting obscene video

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant