CN112364777B - Face distance estimation method based on face recognition - Google Patents

Info

Publication number
CN112364777B
Authority
CN
China
Prior art keywords
distance
person
calculating
face
image
Prior art date
Legal status
Active
Application number
CN202011263555.7A
Other languages
Chinese (zh)
Other versions
CN112364777A (en)
Inventor
董其任
邹杭
章寅
张研
董黎刚
蒋献
Current Assignee
Zhejiang Gongshang University
Original Assignee
Zhejiang Gongshang University
Priority date
Filing date
Publication date
Application filed by Zhejiang Gongshang University
Priority to CN202011263555.7A
Publication of CN112364777A
Application granted
Publication of CN112364777B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G06V40/165 Detection; Localisation; Normalisation using facial parts and geometric relationships
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships

Abstract

The invention discloses a face distance estimation method based on face recognition. The distance between faces is calculated from an image of several people captured by a camera, and the method comprises the following steps: 1) acquiring the horizontal view angle of the camera; 2) establishing a spatial rectangular coordinate system; 3) acquiring the coordinates of facial landmarks in the image; 4) calculating, in the image, the distance between a person's eyes and the distance from the eyebrow center to the mouth center; 5) calculating the real distance from the person to the camera along the axis Oz and the real distance from the person to the image center point along the axis Ox; 6) calculating the estimated distance between any two persons. The coordinate information of the target's eyes, nose and mouth in the picture is obtained by a face recognition technique based on deep learning, and the face-to-face distance is derived from the detection result by a projective geometry algorithm, so that face distance estimation can be realized at low cost.

Description

Face distance estimation method based on face recognition
Technical Field
The invention relates to the field of computer vision, in particular to a face distance estimation method based on face recognition.
Background
Measuring the spacing between people in video frames is an important technology in many current smart-education and community facilities. Traditional ranging approaches based on radar, ultrasound, binocular cameras and the like suffer from high deployment difficulty, poor integration with existing monitoring equipment, and demanding requirements on the monitored environment. To meet the need for measuring the spacing between people on site during monitoring, a ranging algorithm applicable to most monitoring equipment is urgently needed.
Disclosure of Invention
To overcome the defects of the prior art, the invention provides a face distance calculation method based on face recognition. The invention requires no additional hardware, has a low deployment cost, can measure distance directly with existing monitoring equipment, and places few demands on the stability of the environment. The target detection box is obtained by a deep-learning-based face detection operation, and the face-to-face distance is derived from the detection result by a projective geometry algorithm, so that face distance measurement can be realized at low cost.
A face distance estimation method based on face recognition calculates the distance between faces from an image of several people captured by a camera, and comprises the following steps:
1) acquiring the value of the horizontal view angle α of the camera;
2) establishing a spatial rectangular coordinate system: the top-left vertex of the picture is the origin O, the direction to the right of O is the positive x-axis, the direction downward from O is the positive y-axis, and the direction perpendicular to the picture plane and pointing outward is the positive z-axis;
3) acquiring from the image the coordinates of the left eye pupil, right eye pupil, nose tip, left mouth corner and right mouth corner of a person's face: eye_left(x1, y1), eye_right(x2, y2), nose(x3, y3), mouth_left(x4, y4), mouth_right(x5, y5);
4) calculating the distance g1 = |x1 - x2| between the person's eyes along the x-axis of the image; calculating the distance g2 = (|y1 - y4| + |y2 - y5|)/2 from the eyebrow center to the mouth center along the y-axis of the image; calculating the inter-ocular distance of the person on the xOy plane of the image, f1 = sqrt((x1 - x2)^2 + (y1 - y2)^2); calculating the distance from the eyebrow center to the mouth center of the person on the xOy plane of the image, f2 = sqrt(((x1 + x2)/2 - (x4 + x5)/2)^2 + ((y1 + y2)/2 - (y4 + y5)/2)^2);
Let the triangle formed by the three points eye_left, eye_right and nose be triangle A, let the distance between eye_left and nose on the xOy plane be a, let the distance between eye_right and nose on the xOy plane be b, and let the actual real-world length corresponding to g1 be g1_real;
a) when a ≥ b and a/b < 1.4, or a ≤ b and b/a < 1.4, the face is raised, lowered, rotated clockwise, rotated counterclockwise, or faces the camera directly: g1_real = (L1/f1) × g1;
b) when a ≥ b and a/b ≥ 1.4, or a ≤ b and b/a > 1.4, the face is turned to the left, right, upper left, upper right, lower left or lower right:
when g1 < g2, g1_real = (L2/f2) × g1;
when g1 > g2, g1_real = (L1/f1) × g1;
5) calculating the ratio R1 = g1/W of g1 to the image width W; calculating the angle of the camera's horizontal view occupied by g1: β = R1 × α; calculating the real distance n = g1_real/tan β from the person to the camera; letting c be the real distance from the person to the camera along the axis Oz, estimating the value of c as c ≈ n; calculating the image distance e = |x3 - W/2| from the person's nose tip to the image center point along the axis Ox; calculating the ratio R2 = e/W of e to W; calculating the angle of the camera's horizontal view occupied by e: η = R2 × α; calculating the real distance w = c × tan η from the person to the image center point along the axis Ox;
6) performing steps 3) to 5) for persons Pi and Pj respectively, obtaining the real distances ci and cj from Pi and Pj to the camera along the axis Oz, and the real distances wi and wj from Pi and Pj to the picture center along the axis Ox;
7) calculating the real distance between Pi and Pj along the axis Ox: when Pi and Pj are both on the left side or both on the right side of the image center point, x = |wi - wj|; when Pi and Pj are on opposite sides of the image center point, x = wi + wj; the real distance between Pi and Pj along the axis Oz is z = |ci - cj|; the estimated distance between the two persons is then sqrt(x^2 + z^2) (a Python sketch of the per-person computation in steps 4) and 5) is given after this list).
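The following minimal Python sketch condenses steps 4) and 5) for a single person. It is an illustrative re-implementation rather than code taken from the patent: the function name person_to_camera, the landmarks dictionary format and the parameter names are assumptions, the ≤/< boundaries of the a/b test are collapsed into a single strict comparison, and L1 = L2 = 0.07 m as specified below.

```python
import math

L1 = 0.07  # adult left-eye-to-right-eye spacing, m (GB2428-81)
L2 = 0.07  # adult eyebrow-centre-to-mouth-centre spacing, m (GB2428-81)

def person_to_camera(landmarks, W, alpha_deg):
    """Steps 4) and 5): return (c, w) for one person.

    landmarks: dict with pixel coordinates for 'eye_left', 'eye_right',
               'nose', 'mouth_left', 'mouth_right' (assumed input format).
    W:         image width in pixels.
    alpha_deg: horizontal view angle of the camera in degrees.
    """
    x1, y1 = landmarks["eye_left"]
    x2, y2 = landmarks["eye_right"]
    x3, y3 = landmarks["nose"]
    x4, y4 = landmarks["mouth_left"]
    x5, y5 = landmarks["mouth_right"]

    # step 4): pixel distances in the image
    g1 = abs(x1 - x2)                          # eye spacing along the x-axis
    g2 = (abs(y1 - y4) + abs(y2 - y5)) / 2     # eyebrow centre to mouth centre along the y-axis
    f1 = math.hypot(x1 - x2, y1 - y2)          # eye spacing on the xOy plane
    f2 = math.hypot((x1 + x2) / 2 - (x4 + x5) / 2,
                    (y1 + y2) / 2 - (y4 + y5) / 2)  # eyebrow centre to mouth centre on xOy
    a = math.hypot(x1 - x3, y1 - y3)           # eye_left to nose
    b = math.hypot(x2 - x3, y2 - y3)           # eye_right to nose

    # head-pose case split of step 4): choose the reference length used for scaling
    if max(a, b) / min(a, b) < 1.4:            # case a): roughly frontal or in-plane rotation
        g1_real = (L1 / f1) * g1
    else:                                      # case b): head turned sideways
        g1_real = (L2 / f2) * g1 if g1 < g2 else (L1 / f1) * g1

    # step 5): from pixel ratios to angles and real-world distances
    alpha = math.radians(alpha_deg)
    beta = (g1 / W) * alpha                    # angle subtended by the eye spacing
    c = g1_real / math.tan(beta)               # distance to the camera along Oz (c ~ n)
    eta = (abs(x3 - W / 2) / W) * alpha        # angle subtended by the nose-to-centre offset
    w = c * math.tan(eta)                      # lateral distance along Ox
    return c, w
```

The returned pair (c, w) is exactly what steps 6) and 7) consume for each person.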
In step 3), the coordinate information of the left eye pupil, right eye pupil, nose tip, left mouth corner and right mouth corner of the face is obtained as follows: the facial landmark coordinates are obtained with a RetinaFace detector trained by deep learning.
In step 4), the values of L1 and L2 are taken from the average spacing between facial features of adults in GB2428-81, the head-form series of Chinese adults: the inter-ocular distance from the left eye to the right eye L1 = 0.07 m, and the distance from the eyebrow center to the mouth center L2 = 0.07 m.
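As a concrete illustration of the detection step, the sketch below obtains the five landmarks with an off-the-shelf RetinaFace implementation. The package name retinaface, the detect_faces call and the keys of the returned dictionary are assumptions about a third-party library (the retina-face distribution on PyPI) and may differ between versions; any detector that returns the five points in pixel coordinates would serve equally well.

```python
# Hedged sketch: the `retinaface` import and the structure of the result
# dictionary are assumptions about the third-party package, not part of the patent.
from retinaface import RetinaFace

def get_landmarks(image_path):
    """Return a list of landmark dicts, one per detected face."""
    faces = RetinaFace.detect_faces(image_path)  # assumed API of the package
    people = []
    for face in faces.values():
        # assumed keys: left_eye, right_eye, nose, mouth_left, mouth_right
        lm = face["landmarks"]
        people.append({
            "eye_left":    tuple(lm["left_eye"]),
            "eye_right":   tuple(lm["right_eye"]),
            "nose":        tuple(lm["nose"]),
            "mouth_left":  tuple(lm["mouth_left"]),
            "mouth_right": tuple(lm["mouth_right"]),
        })
    return people
```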
The invention has the following beneficial effects: it requires no additional hardware, has a low deployment cost, can measure distance directly with existing monitoring equipment, and places few demands on the stability of the environment. The coordinate information of the target's eyes, nose and mouth in the picture is obtained by deep-learning-based face recognition, and the face-to-face distance is derived from the detection result by a projective geometry algorithm, so that face distance estimation can be realized at low cost.
Drawings
FIG. 1 is a photograph taken by a camera at a certain time;
fig. 2 is a result diagram of a face distance calculation method based on face recognition.
Detailed description of the preferred embodiments
The present invention will be further described and illustrated below with reference to the accompanying drawings and embodiments, in order to make its objects, technical solutions and advantages clearer. The technical features of the embodiments of the invention may be combined with one another provided they do not conflict.
As shown in Fig. 1, a face distance estimation method based on face recognition calculates the distance between faces from an image of several people captured by a camera, and comprises the following steps:
1) acquiring the value of the horizontal view angle α of the camera;
2) establishing a spatial rectangular coordinate system: the top-left vertex of the picture is the origin O, the direction to the right of O is the positive x-axis, the direction downward from O is the positive y-axis, and the direction perpendicular to the picture plane and pointing outward is the positive z-axis;
3) acquiring from the image the coordinates of the left eye pupil, right eye pupil, nose tip, left mouth corner and right mouth corner of a person's face: eye_left(x1, y1), eye_right(x2, y2), nose(x3, y3), mouth_left(x4, y4), mouth_right(x5, y5);
4) calculating the distance g1 = |x1 - x2| between the person's eyes along the x-axis of the image; calculating the distance g2 = (|y1 - y4| + |y2 - y5|)/2 from the eyebrow center to the mouth center along the y-axis of the image; calculating the inter-ocular distance of the person on the xOy plane of the image, f1 = sqrt((x1 - x2)^2 + (y1 - y2)^2); calculating the distance from the eyebrow center to the mouth center of the person on the xOy plane of the image, f2 = sqrt(((x1 + x2)/2 - (x4 + x5)/2)^2 + ((y1 + y2)/2 - (y4 + y5)/2)^2);
Let the triangle formed by the three points eye_left, eye_right and nose be triangle A, let the distance between eye_left and nose on the xOy plane be a, let the distance between eye_right and nose on the xOy plane be b, and let the actual real-world length corresponding to g1 be g1_real;
a) when a ≥ b and a/b ≤ 1.4, or a ≤ b and b/a < 1.4, the face is raised, lowered, rotated clockwise, rotated counterclockwise, or faces the camera directly: g1_real = (L1/f1) × g1;
b) when a ≥ b and a/b ≥ 1.4, or a ≤ b and b/a > 1.4, the face is turned to the left, right, upper left, upper right, lower left or lower right:
when g1 < g2, g1_real = (L2/f2) × g1;
when g1 > g2, g1_real = (L1/f1) × g1;
5) calculating the ratio R1 = g1/W of g1 to the image width W; calculating the angle of the camera's horizontal view occupied by g1: β = R1 × α; calculating the real distance n = g1_real/tan β from the person to the camera; letting c be the real distance from the person to the camera along the axis Oz, estimating the value of c as c ≈ n; calculating the image distance e = |x3 - W/2| from the person's nose tip to the image center point along the axis Ox; calculating the ratio R2 = e/W of e to W; calculating the angle of the camera's horizontal view occupied by e: η = R2 × α; calculating the real distance w = c × tan η from the person to the image center point along the axis Ox;
6) performing steps 3) to 5) for persons Pi and Pj respectively, obtaining the real distances ci and cj from Pi and Pj to the camera along the axis Oz, and the real distances wi and wj from Pi and Pj to the picture center along the axis Ox;
7) calculating the real distance between Pi and Pj along the axis Ox: when Pi and Pj are both on the left side or both on the right side of the image center point, x = |wi - wj|; when Pi and Pj are on opposite sides of the image center point, x = wi + wj; the real distance between Pi and Pj along the axis Oz is z = |ci - cj|; the estimated distance between the two persons is then sqrt(x^2 + z^2) (a sketch of this pairwise computation is given at the end of this description).
In step 3), the coordinate information of the left eye pupil, right eye pupil, nose tip, left mouth corner and right mouth corner of the face is obtained as follows: the facial landmark coordinates are obtained with a RetinaFace detector trained by deep learning;
in step 4), the values of L1 and L2 are taken from the average spacing between facial features of adults in GB2428-81, the head-form series of Chinese adults: the inter-ocular distance from the left eye to the right eye L1 = 0.07 m, and the distance from the eyebrow center to the mouth center L2 = 0.07 m.
Examples
To help those of ordinary skill in the art understand and practice the invention, a specific embodiment of the method is presented below. The core idea of estimating the distance between people with face recognition and projective geometry is as follows: the coordinate information of the target's eyes, nose and mouth in the picture is obtained by deep-learning-based face recognition, and the face-to-face distance is derived from the detection result by a projective geometry algorithm, so that face distance estimation can be realized at low cost.
A Hikvision DS-2DC7423IW-A camera is installed in a room, and the iVMS-4200 client is used to obtain the camera's IP address and the URL used to access it from a program. In the program, the OpenCV library for Python is used to access the camera via this URL and acquire a picture.
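A minimal sketch of this frame-grabbing step is shown below; the RTSP URL is a placeholder, not a real address, and only the VideoCapture, read, release and imwrite calls from OpenCV are relied upon.

```python
import cv2  # OpenCV for Python

# Placeholder RTSP address; the real URL is the one obtained from the iVMS-4200 client.
url = "rtsp://<user>:<password>@<camera-ip>:554/<stream-path>"

cap = cv2.VideoCapture(url)   # open the network camera stream
ok, frame = cap.read()        # grab a single picture
cap.release()
if ok:
    cv2.imwrite("frame.jpg", frame)   # frame.shape[1] x frame.shape[0] gives W x H
```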
A picture of the site is acquired by the camera. The picture size is 1087×610, and the coordinates of its center point are Pic(x_pic, y_pic) = (543.5, 305). According to the camera specification, the horizontal view angle of the camera is α = 60°.
A spatial rectangular coordinate system is established: the top-left vertex of the picture is the origin O, the direction to the right of O is the positive x-axis, the direction downward from O is the positive y-axis, and the direction perpendicular to the picture plane and pointing outward is the positive z-axis.
The coordinates of the left eye pupil, right eye pupil, nose tip, left mouth corner and right mouth corner of person P1 in the image are obtained with a RetinaFace detector trained by deep learning: left_eye1(636, 447), right_eye1(653, 454), nose1(634, 463), left_mouth1(636, 472), right_mouth1(647, 477); the corresponding coordinates of person P2 are: left_eye2(140, 243), right_eye2(151, 240), nose2(146, 250), left_mouth2(143, 255), right_mouth2(152, 253).
According to the average spacing between facial features of adults in GB2428-81, the head-form series of Chinese adults, the inter-ocular distance is L1 = 0.07 m and the distance from the eyebrow center to the mouth center is L2 = 0.07 m.
For person P1: g1 = |x1 - x2| = 17, g2 = (|y1 - y4| + |y2 - y5|)/2 = 24, f1 = 18.38, f2 = 24.19, a = 16.12, b = 21.02. Here a ≤ b and b/a < 1.4, so g1_real = (L1/f1) × g1 = 0.065 m. R1 = g1/W = 0.016, β = R1 × α = 0.94°, ci ≈ n = g1_real/tan β = 3.95 m. e = |x3 - W/2| = 90.5, R2 = e/W = 0.083, η = R2 × α = 4.995°, wi = ci × tan η = 0.35 m.
For person P2: g1 = 11, g2 = 12.5, f1 = 11.40, f2 = 12.66, a = 9.22, b = 11.18. Here a ≤ b and b/a < 1.4, so g1_real = (L1/f1) × g1 = 0.068 m. R1 = g1/W = 0.01, β = R1 × α = 0.61°, cj ≈ n = g1_real/tan β = 6.37 m. e = |x3 - W/2| = 397.5, R2 = e/W = 0.37, η = R2 × α = 21.94°, wj = cj × tan η = 2.57 m.
Subtracting the x-coordinate of the picture center from the x-coordinate of person P1's nose tip gives k1 = 90.5, and doing the same for person P2 gives k2 = -397.5, so person P1 is on the right side of the image and person P2 on the left side. Therefore x = wi + wj = 2.92 m and z = |ci - cj| = 2.42 m, and the estimated distance between the two persons is sqrt(x^2 + z^2) = sqrt(2.92^2 + 2.42^2) ≈ 3.79 m.
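The following self-contained Python script reruns this example with the coordinates, image width and view angle given above; its outputs agree with the values in the text up to rounding (ci ≈ 3.95 m, cj ≈ 6.37 m, final distance ≈ 3.79 m). The constants come from the description; the script itself is illustrative and not part of the patent.

```python
import math

W, ALPHA, L1, L2 = 1087, math.radians(60), 0.07, 0.07  # image width, view angle, GB2428-81 spacings

P1 = {"eye_left": (636, 447), "eye_right": (653, 454), "nose": (634, 463),
      "mouth_left": (636, 472), "mouth_right": (647, 477)}
P2 = {"eye_left": (140, 243), "eye_right": (151, 240), "nose": (146, 250),
      "mouth_left": (143, 255), "mouth_right": (152, 253)}

def distance_and_offset(p):
    (x1, y1), (x2, y2) = p["eye_left"], p["eye_right"]
    (x3, y3) = p["nose"]
    (x4, y4), (x5, y5) = p["mouth_left"], p["mouth_right"]
    g1 = abs(x1 - x2)
    g2 = (abs(y1 - y4) + abs(y2 - y5)) / 2
    f1 = math.hypot(x1 - x2, y1 - y2)
    f2 = math.hypot((x1 + x2) / 2 - (x4 + x5) / 2, (y1 + y2) / 2 - (y4 + y5) / 2)
    a, b = math.hypot(x1 - x3, y1 - y3), math.hypot(x2 - x3, y2 - y3)
    if max(a, b) / min(a, b) < 1.4:
        g1_real = (L1 / f1) * g1
    else:
        g1_real = (L2 / f2) * g1 if g1 < g2 else (L1 / f1) * g1
    c = g1_real / math.tan((g1 / W) * ALPHA)          # distance to camera along Oz
    w = c * math.tan((abs(x3 - W / 2) / W) * ALPHA)   # offset from image centre along Ox
    return c, w, x3

ci, wi, xi = distance_and_offset(P1)   # ~3.95 m, ~0.35 m
cj, wj, xj = distance_and_offset(P2)   # ~6.37 m, ~2.57 m
same_side = (xi - W / 2) * (xj - W / 2) >= 0
x = abs(wi - wj) if same_side else wi + wj   # ~2.92 m
z = abs(ci - cj)                             # ~2.42 m
print(round(math.hypot(x, z), 2))            # ~3.79 m
```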

Claims (2)

1. A face distance estimation method based on face recognition, characterized by comprising the following steps:
1-1) acquiring the value of the horizontal view angle α of the camera;
1-2) establishing a spatial rectangular coordinate system: the top-left vertex of the picture is the origin O, the direction to the right of O is the positive x-axis, the direction downward from O is the positive y-axis, and the direction perpendicular to the picture plane and pointing outward is the positive z-axis;
1-3) acquiring from the image the coordinates of the left eye pupil, right eye pupil, nose tip, left mouth corner and right mouth corner of a person's face: eye_left(x1, y1), eye_right(x2, y2), nose(x3, y3), mouth_left(x4, y4), mouth_right(x5, y5);
1-4) calculating the distance g1 = |x1 - x2| between the person's eyes along the x-axis of the image; calculating the distance g2 = (|y1 - y4| + |y2 - y5|)/2 from the eyebrow center to the mouth center along the y-axis of the image; calculating the inter-ocular distance of the person on the xOy plane of the image, f1 = sqrt((x1 - x2)^2 + (y1 - y2)^2); calculating the distance from the eyebrow center to the mouth center of the person on the xOy plane of the image, f2 = sqrt(((x1 + x2)/2 - (x4 + x5)/2)^2 + ((y1 + y2)/2 - (y4 + y5)/2)^2);
letting the triangle formed by the three points eye_left, eye_right and nose be triangle A, the distance between eye_left and nose on the xOy plane be a, the distance between eye_right and nose on the xOy plane be b, and the actual real-world length corresponding to g1 be g1_real;
a) when a ≥ b and a/b ≤ 1.4, or a ≤ b and b/a < 1.4, the face is raised, lowered, rotated clockwise, rotated counterclockwise, or faces the camera directly: g1_real = (L1/f1) × g1;
b) when a ≥ b and a/b ≥ 1.4, or a ≤ b and b/a > 1.4, the face is turned to the left, right, upper left, upper right, lower left or lower right:
when g1 < g2, g1_real = (L2/f2) × g1;
when g1 > g2, g1_real = (L1/f1) × g1;
in step 1-4), the values of L1 and L2 are taken from the average spacing between facial features of adults in GB2428-81, the head-form series of Chinese adults: the inter-ocular distance from the left eye to the right eye L1 = 0.07 m, and the distance from the eyebrow center to the mouth center L2 = 0.07 m;
1-5) calculating the ratio R1 = g1/W of g1 to the image width W; calculating the angle of the camera's horizontal view occupied by g1: β = R1 × α; calculating the real distance n = g1_real/tan β from the person to the camera; letting c be the real distance from the person to the camera along the axis Oz, estimating the value of c as c ≈ n; calculating the image distance e = |x3 - W/2| from the person's nose tip to the image center point along the axis Ox; calculating the ratio R2 = e/W of e to W; calculating the angle of the camera's horizontal view occupied by e: η = R2 × α; calculating the real distance w = c × tan η from the person to the image center point along the axis Ox;
1-6) performing steps 1-3) to 1-5) for persons Pi and Pj respectively, obtaining the real distances ci and cj from Pi and Pj to the camera along the axis Oz, and the real distances wi and wj from Pi and Pj to the picture center along the axis Ox;
1-7) calculating the real distance between Pi and Pj along the axis Ox: when Pi and Pj are both on the left side or both on the right side of the image center point, x = |wi - wj|; when Pi and Pj are on opposite sides of the image center point, x = wi + wj; the real distance between Pi and Pj along the axis Oz is z = |ci - cj|; calculating the estimated distance between the two persons as sqrt(x^2 + z^2).
2. The method according to claim 1, characterized in that in step 1-3) the coordinate information of the left eye pupil, right eye pupil, nose tip, left mouth corner and right mouth corner of the face is obtained as follows: the facial landmark coordinates are obtained with a RetinaFace detector trained by deep learning.
CN202011263555.7A 2020-11-12 2020-11-12 Face distance estimation method based on face recognition Active CN112364777B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011263555.7A CN112364777B (en) 2020-11-12 2020-11-12 Face distance estimation method based on face recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011263555.7A CN112364777B (en) 2020-11-12 2020-11-12 Face distance estimation method based on face recognition

Publications (2)

Publication Number Publication Date
CN112364777A CN112364777A (en) 2021-02-12
CN112364777B (en) 2023-05-16

Family

ID=74514563

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011263555.7A Active CN112364777B (en) 2020-11-12 2020-11-12 Face distance estimation method based on face recognition

Country Status (1)

Country Link
CN (1) CN112364777B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115409663A (en) * 2022-09-02 2022-11-29 吉林农业科技学院 Training system for improving enterprise employee quality

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110781712A (en) * 2019-06-12 2020-02-11 上海荟宸信息科技有限公司 Human head space positioning method based on human face detection and recognition
CN110728215A (en) * 2019-09-26 2020-01-24 杭州艾芯智能科技有限公司 Face living body detection method and device based on infrared image
CN111914783A (en) * 2020-08-10 2020-11-10 深圳市视美泰技术股份有限公司 Method and device for determining human face deflection angle, computer equipment and medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
朱树先; 张仁杰. Human eye localization algorithm and face image normalization. Journal of University of Shanghai for Science and Technology, 2006, (03), full text. *

Also Published As

Publication number Publication date
CN112364777A (en) 2021-02-12

Similar Documents

Publication Publication Date Title
US10353465B2 (en) Iris and pupil-based gaze estimation method for head-mounted device
WO2020215961A1 (en) Personnel information detection method and system for indoor climate control
WO2022012192A1 (en) Method and apparatus for constructing three-dimensional facial model, and device and storage medium
CN105894574B (en) A kind of binocular three-dimensional reconstruction method
CN107545302B (en) Eye direction calculation method for combination of left eye image and right eye image of human eye
CN106796449A (en) Eye-controlling focus method and device
Tang et al. ESTHER: Joint camera self-calibration and automatic radial distortion correction from tracking of walking humans
CN111932678B (en) Multi-view real-time human motion, gesture, expression and texture reconstruction system
WO2019062056A1 (en) Smart projection method and system, and smart terminal
Tian et al. Computer vision-based door detection for accessibility of unfamiliar environments to blind persons
CN110807364A (en) Modeling and capturing method and system for three-dimensional face and eyeball motion
CN110096925A (en) Enhancement Method, acquisition methods and the device of Facial Expression Image
CN113077519B (en) Multi-phase external parameter automatic calibration method based on human skeleton extraction
CN110796032A (en) Video fence based on human body posture assessment and early warning method
CN112364777B (en) Face distance estimation method based on face recognition
CN111582036B (en) Cross-view-angle person identification method based on shape and posture under wearable device
CN108133189B (en) Hospital waiting information display method
CN107862713A (en) Video camera deflection for poll meeting-place detects method for early warning and module in real time
WO2019127319A1 (en) Distortion measurement method and system for head-mounted display device
CN112633217A (en) Human face recognition living body detection method for calculating sight direction based on three-dimensional eyeball model
US9940504B2 (en) Method to produce consistent face texture
CN112102504A (en) Three-dimensional scene and two-dimensional image mixing method based on mixed reality
JP2007109126A (en) Moving body distribution estimation device, moving body distribution estimation method, and moving body distribution estimation program
WO2022247230A1 (en) Distance measurement method and apparatus
Tordoff et al. Head pose estimation for wearable robot control.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant