CN110879947A - Real-time human face three-dimensional measurement method based on one-time projection structured light parallel stripe pattern - Google Patents


Info

Publication number: CN110879947A
Application number: CN201811034867.3A
Authority: CN (China)
Prior art keywords: stripe, stripes, clustering, face, nose
Legal status: Withdrawn (the status listed is an assumption, not a legal conclusion)
Other languages: Chinese (zh)
Inventor: 王振洲 (Zhenzhou Wang)
Original and current assignee: Shandong University of Technology
Application CN201811034867.3A filed by Shandong University of Technology; priority to CN201811034867.3A; publication of CN110879947A


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 — Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 — Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V40/16 — Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 — Detection; Localisation; Normalisation
    • G06V40/162 — Detection; Localisation; Normalisation using pixel segmentation or colour matching
    • G06V2201/00 — Indexing scheme relating to image or video recognition or understanding
    • G06V2201/12 — Acquisition of 3D measurements of objects
    • G06V2201/121 — Acquisition of 3D measurements of objects using special illumination


Abstract

The invention discloses a real-time three-dimensional face measurement method based on a single projected structured-light parallel-stripe pattern. A single structured-light parallel-stripe pattern is projected onto the surface of the measured face by a structured-light projector, and the deformed stripe pattern is recorded by a single camera. Image processing automatically detects three groups of stripes: stripes affected by the nose occlusion region, stripes unaffected by it, and stripes affected by the sharp height change at the ears. Each group is clustered independently by a stripe-clustering technique. After all stripes are clustered, they are numbered sequentially from top to bottom within one image. The sequentially numbered stripes are fitted with spline functions; their ordinates are extracted and spline-interpolated to generate an ordinate map. The three-dimensional face is then computed in real time from the ordinate map and the calibrated system parameters.

Description

Real-time human face three-dimensional measurement method based on one-time projection structured light parallel stripe pattern
Technical Field
The invention relates to optical three-dimensional sensing, and in particular to a method that projects a pattern of parallel black-and-white or color stripes and, using image processing tailored to the characteristics of the human face, measures the three-dimensional shape of the face surface in real time.
Background
The invention relates to a real-time three-dimensional face measurement method based on one-shot projected structured light. A three-dimensional measurement technique based on one-shot structured light can measure both static and dynamic faces. Static face measurement plays an important role in face recognition, surgical assistance, and related fields. Dynamic facial-expression measurement is widely needed in computer animation and games, digital movies, virtual reality, remote video conferencing, video telephony, facial-expression recognition, education assistance, and other fields.
Current techniques for three-dimensional face measurement mainly comprise binocular/multi-view vision, multi-shot structured light, one-shot real-time structured light, binocular one-shot real-time structured light, and three-dimensional laser scanning. Binocular/multi-view vision is simple, flexible, and low-cost, but its high computational complexity makes real-time operation difficult, and its robustness is poor because it is easily affected by face texture and illumination conditions; see N. Uchida et al., "3D face recognition using passive stereo vision," Proc. IEEE Int. Conf. on Image Processing (2005). Multi-shot structured light is robust but inefficient: it needs several images to compute one three-dimensional face and is easily disturbed by ambient light; see J. Geng, "Structured-light 3D surface imaging: a tutorial," Advances in Optics and Photonics, Vol. 3, No. 2, pp. 128-160 (2011). One-shot real-time structured light offers strong real-time performance but poor robustness; see M. Takeda, Q. Gu, M. Kinoshita, H. Takai and Y. Takahashi, "Frequency-multiplex Fourier-transform profilometry: a single-shot three-dimensional shape measurement of objects with large height discontinuities and/or surface isolations," Applied Optics, Vol. 36, No. 22, p. 5347 (1997). Binocular one-shot real-time structured light combines robustness with efficiency.
Its disadvantages are its sensitivity to environmental noise and the erroneous three-dimensional information it may produce for discontinuous objects; see J. Davis, D. Nehab, R. Ramamoorthi and S. Rusinkiewicz, "Spacetime stereo: a unifying framework for depth from triangulation," IEEE Trans. Pattern Anal. Mach. Intell., 27(2), 296-302 (2005). Three-dimensional laser scanning is robust and real-time, but its cost and price are high; see the quotation of a typical product, the FARO Focus3D 120. Compared with the prior art, the present method offers high real-time performance and robustness at low cost.
The phase-shift method is a traditional structured-light three-dimensional measurement technique whose main advantage is high measurement precision; see, for example, Z.Z. Wang, "Robust measurement of the diffuse surface by phase shift profilometry," Journal of Optics, 16(10), 105407 (2014). Three-dimensional measurement based on the phase-shift method has therefore become mainstream in the three-dimensional measurement market. However, it requires at least three patterns to be projected onto the measured object, which must remain completely stationary, so the phase-shift method and its related products cannot robustly measure moving or deforming objects. In addition, the structured-light pattern used by the phase-shift method has a continuous phase, so its resistance to ambient-light interference is very poor. How to solve from a single image the ordinate map (y_p) that the phase-shift method needs at least three images to solve was addressed by the inventor in earlier work; see Z.Z. Wang and Y.M. Yang, "Single-shot three-dimensional reconstruction based on structured light line pattern," Optics and Lasers in Engineering, 106, 10-16 (2018). How to accurately solve the ordinate map (y_p) of a human face from a single face image, and thereby measure a dynamic face accurately, is the problem the present invention solves. Moreover, the stripe coding adopted by the present invention is binary, which resists ambient light and noise better than a continuous phase; see Z.Z. Wang, "Unsupervised recognition and characterization of the reflected laser lines for robotic gas metal arc welding," IEEE Transactions on Industrial Informatics, 13(4), 1866-1876 (2017).
Therefore, the invention can perform real-time three-dimensional measurement of static, moving, or deforming objects both in a dark room and in an outdoor lighting scene. In addition, the phase-shift method cannot robustly measure the nose occlusion region or the region of sharp ear-height variation of a face. The invention combines facial characteristics and provides a dedicated image processing method that performs independent stripe clustering on the nose occlusion region and the sharp ear-height-variation region. Finally, an accurate face ordinate map (y_p) and a three-dimensional face are computed from a single image.
Disclosure of Invention
The invention aims to overcome the defects of existing real-time structured-light three-dimensional measurement techniques, which cannot robustly measure a dynamic face and resist ambient-light interference poorly, by providing a three-dimensional face measurement method based on a one-shot projected structured-light parallel-stripe pattern. The method performs image processing tailored to facial features and computes from a single image the face ordinate map (y_p) that the traditional phase-shift method needs three or more images to compute. It thereby meets the real-time requirement of three-dimensional face measurement while keeping the measurement accuracy very close to that of the traditional phase-shift method. It also solves the traditional phase-shift method's inability to accurately measure the nose occlusion region and the region of sharp ear-height variation.
To achieve the above aim, the invention adopts the following technical scheme:
A single black-and-white or color structured-light parallel-stripe pattern is projected onto the surface of the measured face by a structured-light projector. The pattern consists of multiple parallel stripes generated by binary coding or cosine-function coding, with identical spacing between adjacent stripes. A single camera records the stripe pattern deformed by the shape of the face surface, and each deformed stripe is segmented by a face image processing technique: stripe-gradient detection reduces noise and uneven background in the image, and a threshold selected from the gradient differences performs global segmentation. The face stripes are then segmented iteratively by computing how the segmented stripe areas change in different face regions and fine-tuning the selected threshold. A face stripe-clustering technique detects the nose occlusion region and the sharp ear-height-variation region from the area changes of the segmented stripes. From the nose position, a vertical midline through the middle of the nose is computed, and the image is divided along it into a left face image and a right face image. The stripes in the two images are clustered separately. In each half, the stripes fall into two classes: (1) stripes unaffected by the nose occlusion region; (2) stripes affected by the nose occlusion region.
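The gradient-detection and threshold-segmentation step described above can be sketched as follows. The patent gives no code, so the plain vertical gradient and the mean-plus-standard-deviation threshold used here are our simplifying assumptions:

```python
import numpy as np

def segment_stripes(img, k=1.0):
    """Segment bright horizontal stripes from a fringe image.

    Hypothetical sketch of the patent's stripe-gradient detection:
    a vertical-gradient magnitude map suppresses slowly varying
    background, and a global threshold based on the gradient
    statistics (mean + k * std, our choice) yields a binary mask.
    """
    img = np.asarray(img, dtype=float)
    grad = np.abs(np.gradient(img, axis=0))   # vertical stripe edges
    thresh = grad.mean() + k * grad.std()     # gradient-difference threshold
    return (grad > thresh).astype(np.uint8)   # binary stripe-edge mask
```

In practice the threshold would then be fine-tuned iteratively, as the patent describes, by monitoring how segmented stripe areas change across face regions.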
(1) Stripes unaffected by the nose occlusion region are clustered sequentially from top to bottom. The starting point of each stripe is its intersection with the vertical midline through the middle of the nose; clustering then proceeds leftward and rightward by computing the slope direction of the stripe in real time, and the left and right parts of a clustered stripe are marked with the same number. (2) Stripes affected by the nose occlusion region are characterized by an originally continuous stripe breaking into parts on the nose and parts on the cheeks. From the area and coordinate distribution of the broken parts, the parts on the left cheek, on the left side of the nose, on the right side of the nose, and on the right cheek are identified. The broken parts at each position are clustered separately from top to bottom, ordered by the minimum ordinate of each part, and identified with numbers. Broken parts of the same stripe at different positions receive the same identification number.
In each of the left and right face images, the clustering results of the unaffected stripes and of the affected stripes are merged so that the identification numbers increase sequentially from top to bottom, and the same stripe carries the same number in both halves. Merging the left and right results yields the final clustered stripes, numbered 1, 2, ..., N from top to bottom, where N is the total number of stripes. Each clustered stripe is fitted with a spline function to obtain a smooth, continuous curve, and the ordinates of all points of each fitted curve are extracted to generate an ordinate matrix. The resolution of this matrix (N x N_Y) is lower than that of the captured image (N_X x N_Y), so spline interpolation raises it to (N_X x N_Y), producing the ordinate map. The ordinate map of a calibration plate is computed in the same way, and the parameters of the three-dimensional face measurement system are calibrated from the marker points on the plate. During real-time measurement, the camera captures a face stripe image, its ordinate map is computed as above, and the three-dimensional shape of the face is computed in real time from the map and the calibrated system parameters.
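The top-to-bottom renumbering rule can be illustrated with a small helper; the data layout (a mapping from arbitrary labels to point arrays) is hypothetical:

```python
import numpy as np

def renumber_top_to_bottom(stripes):
    """Relabel clustered stripes 1..N by increasing mean ordinate.

    `stripes` maps an arbitrary label to an (M, 2) array of (x, y)
    pixel coordinates. Hypothetical helper mirroring the patent's rule
    that cluster identifiers run 1, 2, ..., N from top to bottom
    (smaller image ordinate = higher in the image).
    """
    order = sorted(stripes, key=lambda k: np.mean(stripes[k][:, 1]))
    return {new + 1: stripes[old] for new, old in enumerate(order)}
```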
Compared with the prior art, the invention has the following advantages:
The projected structured-light pattern consists of alternating black-and-white stripes or color stripes composed of the three RGB colors. In the captured deformed-stripe image the stripe contrast is pronounced, and the proposed stripe-gradient detection method effectively reduces the influence of noise and uneven background light, so resistance to ambient interfering light is stronger than in the prior art. A smooth, continuous face ordinate map is computed by the stripe-clustering method tailored to facial characteristics together with spline fitting and interpolation; unlike the prior art, no reconstruction data are missing in the nose occlusion region or the sharp ear-height-variation region of the reconstructed face, so the face measurement accuracy is higher than that of existing one-shot face measurement techniques. Meanwhile, the reconstruction accuracy and resolution are essentially the same as those of the traditional phase-shift method; but whereas the multi-shot phase-shift method cannot measure a dynamic face in real time, the present method uses one-shot projection, measures efficiently, and can measure a dynamic face.
Drawings
FIG. 1 is a system block diagram of the present invention.
Fig. 2 is a flow chart of generating a human face ordinate graph according to the present invention.
FIG. 3 is a flow chart of the stripe clustering technique affected by the nose occlusion region of the present invention.
FIG. 4 is a flow chart of the stripe clustering technique of the present invention that is not affected by the nose occlusion region.
FIG. 5 is a flow chart of the clustering technique of the broken part of the stripe on the ear according to the present invention.
Fig. 6 is a flow chart of the real-time three-dimensional face measurement technique of the present invention.
Detailed Description
The present invention will be described in detail with reference to the accompanying drawings and the working principle.
FIG. 1 is a system block diagram of the present invention. The system comprises an industrial camera, an industrial projector, and a high-performance computer. To ensure the resolution of the three-dimensional reconstruction, the camera and projector resolutions are chosen as 1280 x 1024 and 1920 x 1080 or higher, respectively. Both the camera's image-capture frame rate and the projector's projection frame rate exceed 10 frames/second. Because fringe-pattern generation, image capture, image processing, and three-dimensional reconstruction are all performed on the computer, a configuration of at least an i7-6700 CPU @ 3.4 GHz with 16 GB RAM is chosen to meet the real-time requirement.
FIG. 2 is a flow chart for generating a human face ordinate graph according to the present invention. The specific implementation is as follows:
As shown in Fig. 2, the face stripe image is first stripe-enhanced and segmented with a selected threshold to obtain a segmented face stripe image. The broken stripe parts on the ears are detected automatically, and the stripe parts on the left and right ears are clustered separately by the stripe-clustering technique of the invention. The face stripe image without the ear portions is divided into left and right face stripe images, whose stripes are clustered by the stripe-clustering technique of the invention to obtain the left- and right-face clustered stripes. Combining these with the left- and right-ear results gives the face clustered stripes. The i-th clustered stripe C_i is fitted with a spline function to obtain a continuous smooth curve S_i. The fit is performed by optimizing the following energy function:
[Equation (1) appears as an image in the original publication.]
where α is a smoothing coefficient with default value 0.5, and j ∈ {L_i, L_i+1, ..., R_i-1, R_i}, where L_i is the minimum abscissa of the clustered stripe and R_i is its maximum abscissa.
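Since Equation (1) is only available as an image, here is a hedged sketch of the smoothing-spline fit it describes, using scipy's smoothing spline as a stand-in for the patent's energy functional. The mapping from α to scipy's smoothing factor `s` is our assumption, not the patent's:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def fit_stripe(xs, ys, alpha=0.5):
    """Fit one clustered stripe (x_j, y_j) with a smoothing spline.

    The patent balances data fidelity against smoothness with a
    coefficient alpha (default 0.5). scipy's smoothing-spline penalty
    serves as a stand-in; the smoothing budget s = alpha * n * var(y)
    is an illustrative assumption.
    """
    xs = np.asarray(xs, float)
    ys = np.asarray(ys, float)
    s = alpha * len(xs) * np.var(ys)          # assumed smoothing budget
    spline = UnivariateSpline(xs, ys, s=s)
    grid = np.arange(xs.min(), xs.max() + 1)  # abscissas L_i .. R_i
    return grid, spline(grid)
```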
For the i-th fitted continuous smooth curve S_i, the portions within the abscissa range of the captured image but outside the abscissa range of the corresponding clustered stripe are assigned zero:
[Equations (2) and (3) appear as images in the original publication.]
S_i, i = 1, 2, ..., N is a vector of size N_Y. The N vectors are stacked in order from top to bottom into an ordinate matrix M_L of resolution N x N_Y:
[Equation (4) appears as an image in the original publication.]
Spline interpolation then raises the resolution of the ordinate matrix to that of the captured image, generating the ordinate map M_Y of resolution N_X x N_Y.
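The matrix-upsampling step can be sketched as follows; evenly spaced stripe rows are an illustrative assumption (in practice each stripe's actual row position would be used):

```python
import numpy as np
from scipy.interpolate import CubicSpline

def upsample_ordinate_matrix(M_L, n_rows):
    """Upsample the N x W ordinate matrix to n_rows x W, column-wise.

    Hypothetical sketch of the spline-interpolation step that raises
    the ordinate matrix to the captured-image resolution; the N stripe
    rows are assumed evenly spaced for illustration.
    """
    N, W = M_L.shape
    src = np.linspace(0.0, 1.0, N)       # normalized stripe positions
    dst = np.linspace(0.0, 1.0, n_rows)  # normalized image rows
    return CubicSpline(src, M_L, axis=0)(dst)
```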
The stripe-clustering technique of the invention comprises clustering of stripes affected by the nose occlusion region, clustering of stripes unaffected by the nose occlusion region, and clustering of stripes affected by the sharp ear-height variation.
FIG. 3 is a flow chart of the stripe clustering technique of the present invention affected by the nose occlusion region. The specific implementation is as follows:
The stripes affected by the nose occlusion region are broken into distinct parts whose areas are significantly smaller than those of complete continuous stripes. First, the affected stripes on the left and right cheeks are computed as follows.
The average area T_a of all stripes is computed. Stripes with area smaller than T_a are removed, yielding a stripe image F. The stripes in F are clustered from top to bottom by the following four steps.
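The average-area filter T_a can be sketched with connected-component labeling (a common implementation choice; the patent does not prescribe a particular library):

```python
import numpy as np
from scipy import ndimage

def remove_small_stripes(binary):
    """Drop connected stripes whose area is below the mean area T_a.

    Sketch of the patent's fragment filter: label the binary stripe
    image, measure each component's pixel count, and keep only the
    components at or above the average area.
    """
    labels, n = ndimage.label(binary)
    if n == 0:
        return binary.copy()
    areas = ndimage.sum(binary, labels, index=np.arange(1, n + 1))
    keep = np.flatnonzero(areas >= areas.mean()) + 1   # surviving labels
    return np.isin(labels, keep).astype(binary.dtype)
```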
Step 1: generate a cluster image C of resolution N_X x N_Y with all pixels assigned 0. Automatically label the image F to generate a label image L_a.
Step 2: compute the point (x_top, y_top) in the binary stripe image F with the smallest nonzero ordinate, i.e. the highest point in the image:
[Equation (5) appears as an image in the original publication.]
Compute the coordinate set XY satisfying the following equation:
[Equation (6) appears as an image in the original publication.]
Assign 0 to the pixels of the binary stripe image F at the coordinates in XY:
[Equation (7) appears as an image in the original publication.]
Assign i to the pixels of the label image L_a at the coordinates in XY; for the first clustered stripe, i = 1:
[Equation (8) appears as an image in the original publication.]
Step 3: compute again, via Equation (5), the point (x_top, y_top) with the smallest nonzero ordinate in the updated binary stripe image F; compute the coordinate set XY again via Equation (6); and assign i+1 to the pixels of the label image L_a at the coordinates in XY:
[Equation (9) appears as an image in the original publication.]
Assign 0 again, via Equation (7), to the pixels of the binary stripe image F at the coordinates in XY.
Step 4: repeat Step 3 until all stripes in the binary stripe image F are labeled in order from top to bottom and moved into the cluster image C. For the specific stripe-clustering method, see Z.Z. Wang and Y.M. Yang, "Single-shot three-dimensional reconstruction based on structured light line pattern," Optics and Lasers in Engineering, Vol. 106, pp. 10-16, 2018.
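Steps 1-4 above can be sketched as follows, treating each stripe as a connected component and repeatedly extracting the topmost remaining one. This is a simplified reading of Equations (5)-(9), which are only available as images:

```python
import numpy as np
from scipy import ndimage

def cluster_top_to_bottom(F):
    """Number the stripes of a binary image 1..N from top to bottom.

    Sketch of Steps 1-4: repeatedly find the uppermost remaining stripe
    pixel, take its whole connected component as the next cluster,
    write that label into the cluster image C, and clear it from F.
    """
    F = F.copy()
    C = np.zeros_like(F, dtype=int)
    labels, _ = ndimage.label(F)      # stand-in for the label image L_a
    i = 0
    while F.any():
        ys, xs = np.nonzero(F)
        top = np.argmin(ys)                        # highest remaining point
        comp = labels == labels[ys[top], xs[top]]  # its connected stripe
        i += 1
        C[comp] = i
        F[comp] = 0
    return C
```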
The length of each stripe in the cluster image C is computed by:
[Equation (10) appears as an image in the original publication.]
where L_i is the minimum abscissa of the i-th clustered stripe and R_i is its maximum abscissa.
The stripe-length distribution is filtered by discrete Fourier transform to remove anomalous entries, as follows:
[Equations (11)-(13) appear as images in the original publication.]
where P' is the filtered stripe-length distribution.
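A minimal illustration of DFT-based filtering of the length distribution; the number of retained low frequencies (`keep`) is our choice, since Equations (11)-(13) are only available as images:

```python
import numpy as np

def smooth_lengths(P, keep=3):
    """Low-pass filter a stripe-length distribution with the DFT.

    Sketch of the patent's filtering step: transform the length
    distribution, zero the high-frequency components that carry the
    anomalous entries, and transform back to obtain P'.
    """
    spec = np.fft.rfft(np.asarray(P, float))
    spec[keep:] = 0.0                  # zero high-frequency components
    return np.fft.irfft(spec, n=len(P))
```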
The length difference of adjacent stripes is calculated by:
Figure 176697DEST_PATH_IMAGE014
(14)
The position where the maximum length difference occurs is computed by:
[Equation (15) appears as an image in the original publication.]
The maximum length difference is removed from the length-difference distribution by:
[Equation (16) appears as an image in the original publication.]
In the updated length-difference distribution, the position where the maximum length difference occurs is computed again:
[Equation (17) appears as an image in the original publication.]
The starting index of the clustered stripes affected by the nose is computed by:
[Equation (18) appears as an image in the original publication.]
The ending index of the clustered stripes affected by the nose is computed by:
[Equation (19) appears as an image in the original publication.]
The coordinate set of the nose-affected stripes is computed by:
[Equation (20) appears as an image in the original publication.]
In the already-cleared stripe image F, the pixels at the coordinates of the nose-affected stripes are assigned 1:
[Equation (21) appears as an image in the original publication.]
automatic identification of updated binary stripe imagesFThe average area of the marking stripes is calculatedT a . Removal area is less thanT a Of (2) is provided. The remaining stripes affected by the nose are automatically identified again.
To accurately separate the nose-affected stripes on the left and right cheeks from those on the nose, the abscissa distribution of the remaining labeled stripes must be computed. To distinguish the affected stripes on the left and right cheeks, the abscissa threshold of the affected stripes on the right cheek is computed by:
[Equation (22) appears as an image in the original publication.]
where H_i is the mean abscissa of all points on the i-th clustered stripe and N_1 is the total number of remaining nose-affected stripes. If the mean abscissa of all points on a clustered stripe is greater than T_r, that stripe is an affected stripe on the right cheek.
The abscissa threshold of the affected stripes on the left cheek is computed by:
[Equation (23) appears as an image in the original publication.]
If the mean abscissa of all points on a clustered stripe is less than T_l, that stripe is an affected stripe on the left cheek.
Equations (22) and (23) thus determine the nose-affected stripes on the left and right cheeks, respectively. These stripes are clustered from top to bottom by Equations (5)-(9) and the corresponding four steps. The mean ordinate V_i of all points on each clustered stripe is computed, along with the difference D_i^V between the mean ordinates of adjacent clustered stripes. The average of all differences D_i^V (i = 1, 2, ..., N_V) is recorded as T_d. If the difference between a clustered stripe and its neighbor exceeds 1.5 T_d, then from that stripe down to the last clustered stripe are the broken parts, on the left or right cheek, of stripes affected by the nose occlusion region. If all adjacent differences are below 1.5 T_d, then all clustered stripes are such broken parts.
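The 1.5·T_d break-detection rule above can be sketched as:

```python
import numpy as np

def find_occlusion_break(mean_ys, factor=1.5):
    """Locate where nose occlusion breaks the vertical stripe spacing.

    Given the mean ordinate V_i of each clustered stripe (top to
    bottom), compute adjacent differences D_i and their mean T_d; the
    first gap exceeding factor * T_d marks where the affected broken
    parts begin, per the patent's rule. Returns the index of the first
    affected stripe, or 0 when every gap is below the threshold (i.e.
    all stripes count as broken parts).
    """
    d = np.diff(np.asarray(mean_ys, float))   # D_i between neighbors
    t_d = d.mean()                            # T_d
    big = np.flatnonzero(d > factor * t_d)
    return int(big[0]) + 1 if big.size else 0
```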
Likewise, the broken parts on the nose corresponding to those on the cheeks must be computed. First, the segmented face stripe image is divided into left and right binary face images along the vertical midline through the midpoint of the nose. In the left binary face image, the average area T_a^l of all stripes is computed, and all stripes with area larger than T_a^l are removed. The remaining stripes are automatically labeled, and the mean abscissa of all points on each labeled stripe is computed. The stripes on the nose are determined by Equation (22) and clustered from top to bottom by Equations (5)-(9) and the corresponding four steps. The differences between the mean ordinates of adjacent clustered stripes on the nose are computed, and their average is recorded as T_d^l. If the difference between a clustered stripe and its neighbor exceeds 1.5 T_d^l, then from that stripe down to the last clustered stripe are the broken parts, on the left side of the nose, of stripes affected by the nose occlusion region. If all adjacent differences are below 1.5 T_d^l, then all clustered stripes are such broken parts.
In the right binary face image, the average area T_a^r of all stripes is computed, and all stripes with area larger than T_a^r are removed. The remaining stripes are automatically labeled, and the mean abscissa of all points on each labeled stripe is computed. The stripes on the nose are determined by Equation (23) and clustered from top to bottom by Equations (5)-(9) and the corresponding four steps. The differences between the mean ordinates of adjacent clustered stripes on the nose are computed, and their average is recorded as T_d^r. If the difference between a clustered stripe and its neighbor exceeds 1.5 T_d^r, then from that stripe down to the last clustered stripe are the broken parts, on the right side of the nose, of stripes affected by the nose occlusion region. If all adjacent differences are below 1.5 T_d^r, then all clustered stripes are such broken parts.
The clustering result of the broken stripe portions on the left side of the nose and the clustering result of the broken stripe portions on the left cheek, both affected by the nose-occlusion region, are merged into the clustering result of the stripes affected by the nose-occlusion region on the left face. Likewise, the clustering results of the broken stripe portions on the right side of the nose and on the right cheek are merged into the clustering result of the stripes affected by the nose-occlusion region on the right face. The remaining stripes are those not affected by the nose-occlusion region; they are also divided into left face stripes and right face stripes, and slope-direction-based stripe clustering is performed on each side separately. The broken stripe portions affected by the region where the ear height changes sharply are likewise clustered individually based on the slope direction.
FIG. 4 is a flow chart of the clustering technique of the present invention for stripes not affected by the nose-occlusion region. The specific implementation is as follows:
The stripes affected by the nose-occlusion region in the left and right face images are calculated through equations (5)-(23) and the related steps. These stripes are subtracted directly from the left or right face image, leaving the stripes not affected by the nose-occlusion region. As shown in FIG. 4, the clustering technique for such stripes starts from a starting point and clusters point by point along the slope direction of the stripe toward its end. The starting point of each stripe is determined by its intersection with the vertical center line through the midpoint of the nose. Suppose the starting point of a stripe is (S_x, S_y); L adjacent points are selected to the right or left, and a straight line is fitted by the following formulas:
x̄ = (1/L) · Σ_{i=1}^{L} x_i    (24)

ȳ = (1/L) · Σ_{i=1}^{L} y_i    (25)

k = Σ_{i=1}^{L} (x_i − x̄)(y_i − ȳ) / Σ_{i=1}^{L} (x_i − x̄)²    (26)

b = ȳ − k·x̄    (27)
where (x_i, y_i), i = 1, 2, …, L, are the L adjacent points. The point (x_{L+1}, y_{L+1}) is then calculated from the following formula:
x_{L+1} = x_L ± 1,  y_{L+1} = k·x_{L+1} + b    (28)
Points on the stripe whose distance to (x_{L+1}, y_{L+1}) is smaller than the threshold d_1 are selected as cluster points. The set of cluster points is updated with the selected points, L adjacent points are chosen again, and clustering continues through equations (24)-(28) until the end of the stripe is reached. For the detailed clustering process, see Z. Z. Wang, "Unsupervised recognition and characterization of the reflected laser lines for robotic gas metal arc welding," IEEE Transactions on Industrial Informatics, vol. 13, no. 4, pp. 1866-1876, 2017.
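The point-by-point slope-direction clustering of equations (24)-(28) can be sketched as follows, assuming a standard least-squares line fit for the L adjacent points. The function name, the fixed unit step along the abscissa, and the parameter defaults are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def cluster_along_slope(points, start, L=5, d1=3.0, step=1.0):
    """Greedy slope-direction clustering: fit a line to the last L cluster
    points by least squares, extrapolate the next point one step along x,
    and absorb every remaining stripe point closer than d1 to the
    prediction.  `points` is an (N, 2) array of stripe pixel coordinates;
    `start` is the starting point on the vertical center line."""
    pts = np.asarray(points, dtype=float)
    cluster = [np.asarray(start, dtype=float)]
    remaining = pts.copy()
    while remaining.size:
        ref = np.array(cluster[-L:])               # last L clustered points
        if len(ref) >= 2:
            k, b = np.polyfit(ref[:, 0], ref[:, 1], 1)   # slope and intercept
        else:
            k, b = 0.0, ref[-1, 1]                 # not enough points yet
        x_next = cluster[-1][0] + step
        pred = np.array([x_next, k * x_next + b])  # extrapolated point, eq. (28)
        dist = np.linalg.norm(remaining - pred, axis=1)
        near = dist < d1                           # threshold test with d_1
        if not near.any():
            break                                  # reached the stripe end
        cluster.extend(remaining[near])
        remaining = remaining[~near]
    return np.array(cluster)
```

For a leftward pass (as on the left ear), the same routine would be called with a negative `step`.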
The clustered stripes not affected by the nose-occlusion region and those affected by it in the left face image are re-identified in sequence from top to bottom, so that the identification numbers are 1, 2, 3, …, N, where N is the total number of clustered stripes.
The clustered stripes not affected by the nose-occlusion region and those affected by it in the right face image are re-identified in sequence from top to bottom, so that the identification numbers are 1, 2, 3, …, N, where N is the total number of clustered stripes.
FIG. 5 is a flow chart of the clustering technique of the present invention for the broken stripe portions on the ears. The specific implementation is as follows:
On the left ear, clustering proceeds point by point according to equations (24)-(28), taking the rightmost point of each broken stripe portion as the starting point and moving leftward along the slope direction of the broken portion. On the right ear, clustering proceeds point by point according to equations (24)-(28), taking the leftmost point of each broken stripe portion as the starting point and moving rightward along the slope direction of the broken portion.
The broken stripe portions on the ears, affected by the sharp change in ear height, must be matched to the broken portions on other parts of the face through calibration. After this correspondence is established, the clustering result of the broken stripe portions on the left ear is merged into the stripe clustering result of the re-identified left face stripe image, and the clustering result of the broken stripe portions on the right ear is merged into the stripe clustering result of the re-identified right face stripe image.
The stripe clustering results of the left face stripe image and the right face stripe image are merged to obtain the stripe clustering result of the whole face stripe image, and the face ordinate graph is obtained through equations (1)-(4).
Fig. 6 is a flow chart of the real-time three-dimensional face measurement technique of the present invention. The specific implementation steps are as follows:
As shown in FIG. 1, the computer controls the industrial projector to project the structured-light parallel stripes onto the face to be measured, and at the same time controls the industrial camera to capture a face stripe image.
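The projected pattern is a set of equally spaced parallel stripes generated by binary coding or cosine-function coding (the two coding options named in claim 2). A minimal sketch of generating such a pattern might be as follows; the function name, resolution, and period values are illustrative assumptions.

```python
import numpy as np

def parallel_stripe_pattern(height=768, width=1024, period=16, cosine=False):
    """Generate an equally spaced horizontal parallel-stripe pattern for
    projection: binary (black/white) bands or a cosine intensity profile.
    The stripes run horizontally, so intensity varies only with the row."""
    rows = np.arange(height)
    if cosine:
        column = 0.5 + 0.5 * np.cos(2 * np.pi * rows / period)  # values in [0, 1]
    else:
        column = ((rows // (period // 2)) % 2).astype(float)    # alternating 0/1 bands
    return np.tile(column[:, None], (1, width))                 # replicate across columns
```

The resulting array can be sent to the projector as a grayscale image; for a color pattern the same profile would be written into the chosen color channel.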
The face ordinate graph is calculated through equations (1)-(28) and the face stripe image processing method described above.
The parameters of the face measurement system are calculated from the ordinate graph of the calibration plate and the mark points on the calibration plate; these parameters comprise the camera affine transformation matrix M_WC and the ordinate affine transformation matrix M_WP.
The three-dimensional face can then be reconstructed in real time from the face ordinate graph and the calibrated system parameters:
[Equation (29): original image not reproduced; (X_W, Y_W, Z_W) is computed from (x_c, y_c, y_p) through M_WC and M_WP]    (29)
where (X_W, Y_W, Z_W) are the world coordinates of the face, and (x_c, y_c, y_p) are the corresponding values in the camera coordinates and the face ordinate graph.
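The exact form of equation (29) is not reproduced in this text, so the following is one common structured-light triangulation consistent with the described parameters, not necessarily the patent's exact formulation. It assumes projective-style models s·[x_c, y_c, 1]^T = M_WC·[X_W, Y_W, Z_W, 1]^T (a 3×4 camera matrix) and t·[y_p, 1]^T = M_WP·[X_W, Y_W, Z_W, 1]^T (a 2×4 ordinate matrix); these model shapes are an assumption.

```python
import numpy as np

def reconstruct_point(x_c, y_c, y_p, M_WC, M_WP):
    """Solve for the world point (X_W, Y_W, Z_W) from a camera pixel
    (x_c, y_c) and its ordinate-graph value y_p.  Each projective model
    yields linear constraints on (X, Y, Z); stacking three of them gives
    a 3x3 linear system solved in closed form."""
    A = np.vstack([
        x_c * M_WC[2, :3] - M_WC[0, :3],   # camera x constraint
        y_c * M_WC[2, :3] - M_WC[1, :3],   # camera y constraint
        y_p * M_WP[1, :3] - M_WP[0, :3],   # projector ordinate constraint
    ])
    b = np.array([
        M_WC[0, 3] - x_c * M_WC[2, 3],
        M_WC[1, 3] - y_c * M_WC[2, 3],
        M_WP[0, 3] - y_p * M_WP[1, 3],
    ])
    return np.linalg.solve(A, b)           # (X_W, Y_W, Z_W)
```

Applying this solve to every pixel of the ordinate graph yields the dense three-dimensional face; since each pixel is an independent 3×3 solve, the computation is trivially parallel and suited to real-time use.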

Claims (10)

1. A real-time human face three-dimensional measurement method using a single black-and-white or color structured-light stripe pattern and a real-time image processing technique, characterized in that: a structured-light projection device projects a single black-and-white or color structured-light parallel stripe pattern onto the surface of the measured face; a single camera device records the deformed structured-light stripe pattern; the stripes in the image are labeled sequentially from top to bottom by a stripe clustering technique; the sequentially labeled clustered stripes are fitted into smooth continuous curves by spline functions; the ordinates of the sampling points on all fitted curves are extracted to generate an ordinate matrix; the resolution of the ordinate matrix is raised to the image resolution of the camera device by spline interpolation to generate an ordinate graph; and the three-dimensional face is calculated in real time from the ordinate graph and the calibrated system parameters.
2. The method of claim 1, wherein the black-and-white or color structured-light parallel stripe pattern is generated by binary coding or cosine-function coding, and adjacent parallel stripes are equally spaced.
3. The method of claim 1, wherein the stripe clustering technique adopts different clustering methods depending on whether the stripes are affected by the nose-occlusion region or by the region where the ear height changes sharply.
4. The method of claim 1, wherein, for stripes affected by the nose-occlusion region, the stripe clustering technique adopts a top-to-bottom clustering method based on the minimum ordinate values of the broken stripe portions.
5. The method of claim 1, wherein, for stripes not affected by the nose-occlusion region, the stripe clustering technique adopts a top-to-bottom clustering method based on the slope direction of the stripes.
6. The method of claim 1, wherein, for the broken stripe portions on the ears, the stripe clustering technique adopts a top-to-bottom clustering method based on the slope direction of the stripes.
7. The method of claim 1, wherein, for stripes affected by the region where the ear height changes sharply, the stripe clustering technique uses a calibration-based method to establish a one-to-one correspondence between the broken stripe portions on the ears and the broken stripe portions on the cheeks.
8. The method of claim 1, wherein the ordinate matrix corresponds to the ordinates of the points on the smooth continuous fitted curves of all stripes, the ordinate graph corresponds to the ordinate of each point on the face surface, and the correspondence between the ordinate graph and the three-dimensional coordinates of the face is determined by the system parameters obtained through system calibration.
9. The method of claim 1, wherein the system parameters consist of the ordinate affine transformation matrix and the camera affine transformation matrix, and the system parameters are obtained by system calibration.
10. The method of claim 1, wherein, after system calibration, the three-dimensional face is computed in real time from the face ordinate graph calculated from a single face stripe image.
CN201811034867.3A 2018-09-06 2018-09-06 Real-time human face three-dimensional measurement method based on one-time projection structured light parallel stripe pattern Withdrawn CN110879947A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811034867.3A CN110879947A (en) 2018-09-06 2018-09-06 Real-time human face three-dimensional measurement method based on one-time projection structured light parallel stripe pattern


Publications (1)

Publication Number Publication Date
CN110879947A true CN110879947A (en) 2020-03-13

Family

ID=69727102

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811034867.3A Withdrawn CN110879947A (en) 2018-09-06 2018-09-06 Real-time human face three-dimensional measurement method based on one-time projection structured light parallel stripe pattern

Country Status (1)

Country Link
CN (1) CN110879947A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110926339A (en) * 2018-09-19 2020-03-27 山东理工大学 Real-time three-dimensional measurement method based on one-time projection structured light parallel stripe pattern
WO2022021680A1 (en) * 2020-07-28 2022-02-03 中国科学院深圳先进技术研究院 Method for reconstructing three-dimensional object by fusing structured light with photometry, and terminal device

Citations (3)

Publication number Priority date Publication date Assignee Title
CN106017339A (en) * 2016-06-06 2016-10-12 河北工业大学 Three-dimensional measurement method for projecting non-uniform stripes in non-complete constraint system
CN107133591A (en) * 2017-05-05 2017-09-05 深圳前海华夏智信数据科技有限公司 Method for detecting parking stalls and device based on structure light
CN107719361A (en) * 2017-10-10 2018-02-23 深圳市豪恩汽车电子装备股份有限公司 Automatic parking householder method and system based on image vision


Non-Patent Citations (3)

Title
ZHENZHOU WANG: "Unsupervised Recognition and Characterization of the Reflected Laser Lines for Robotic Gas Metal Arc Welding", IEEE Transactions on Industrial Informatics *
ZHENZHOU WANG et al.: "Single-shot three-dimensional reconstruction based on structured light line", Optics and Lasers in Engineering *
ZHOU Qi et al.: "Structured light measurement technology based on binocular stereo vision", Computer Engineering *


Similar Documents

Publication Publication Date Title
Koch Dynamic 3-D scene analysis through synthesis feedback control
US9171372B2 (en) Depth estimation based on global motion
US9123115B2 (en) Depth estimation based on global motion and optical flow
WO2017054589A1 (en) Multi-depth image fusion method and apparatus
CN109636732A (en) A kind of empty restorative procedure and image processing apparatus of depth image
US10803609B2 (en) Image distance calculator and computer-readable, non-transitory storage medium storing image distance calculation program
CN110926339A (en) Real-time three-dimensional measurement method based on one-time projection structured light parallel stripe pattern
CN110536142B (en) Interframe interpolation method for non-rigid image sequence
KR20110014067A (en) Method and system for transformation of stereo content
Berdnikov et al. Real-time depth map occlusion filling and scene background restoration for projected-pattern-based depth cameras
CN205451195U (en) Real -time three -dimensional some cloud system that rebuilds based on many cameras
KR101868483B1 (en) Prediction of two blur parameters of edges in varying contrast
CN107967697A (en) Method for three-dimensional measurement and system based on colored random binary coding structured illumination
CN115170619B (en) Cloud shielding prediction method based on dense optical flow method
CN110879947A (en) Real-time human face three-dimensional measurement method based on one-time projection structured light parallel stripe pattern
CN115330684A (en) Underwater structure apparent defect detection method based on binocular vision and line structured light
KR102327304B1 (en) A method of improving the quality of 3D images acquired from RGB-depth camera
CN110880186A (en) Real-time human hand three-dimensional measurement method based on one-time projection structured light parallel stripe pattern
CN108090920A (en) A kind of new light field image deep stream method of estimation
CN103438833B (en) For optical means and the system of 3-d shape measurement
GB2585197A (en) Method and system for obtaining depth data
CN114677393A (en) Depth image processing method, depth image processing device, image pickup apparatus, conference system, and medium
Schmalz et al. A graph-based approach for robust single-shot structured light
CN114155236A (en) Laser stripe center line extraction method suitable for dynamic measurement environment
CN112102347A (en) Step detection and single-stage step height estimation method based on binocular vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20200313
