KR101653278B1 - Face tracking system using color-based face detection method - Google Patents

Face tracking system using color-based face detection method

Info

Publication number
KR101653278B1
Authority
KR
South Korea
Prior art keywords
face
detected
tracking
color
image
Prior art date
Application number
KR1020160039961A
Other languages
Korean (ko)
Inventor
오성권
김진율
Original Assignee
수원대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 수원대학교산학협력단 filed Critical 수원대학교산학협력단
Priority to KR1020160039961A priority Critical patent/KR101653278B1/en
Application granted granted Critical
Publication of KR101653278B1 publication Critical patent/KR101653278B1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G06K9/00234
    • G06T7/408
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19665Details related to the storage of video surveillance data

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a real-time face tracking system using color-based face detection and, more specifically, to a system comprising: a face detection module (100) for detecting a face from an input image using the skin color information of the HCbCr color space; a face recognition module (200) for recognizing whether the face detected by the face detection module (100) is the tracking target; and a face tracking module (300) for tracking the face when the face recognized by the face recognition module (200) is the tracking target. The real-time face tracking system using color-based face detection according to the present invention detects the upper body of an object from the input image with an HOG feature vector and an SVM classifier, and then detects the face within the detected upper body using the skin color information and Haar-like feature of the HCbCr color space, which is insensitive to lighting changes and has a color distribution dense enough to detect skin color well, so that the face can be detected accurately while remaining robust to lighting changes. In addition, the system reduces the dimension of the data input to the pattern classifier through PCA and (2D)^2 PCA, thereby cutting the unnecessary computation caused by high-dimensional data and recognizing quickly and accurately whether the face detected from the input image is the tracking target.

Description

Technical Field

The present invention relates to a real-time face tracking system, and more particularly, to a real-time face tracking system using color-based face detection.

Recently, as many CCTV (Closed Circuit Television) systems have been installed and operated for monitoring and security, infringements of personal information and privacy caused by the collection, leakage, misuse, and abuse of surveillance images have increased, and the demand for intelligent video surveillance systems that analyze images using computer vision is also increasing.

Intelligent video surveillance systems are widely used for tracking and safety monitoring by combining object detection and object recognition technologies. To track an object, detection and recognition must be performed correctly. However, existing object detection methods are limited in accuracy by changes in object shape within the image, illumination changes, occlusions, and the like.

With regard to techniques for detecting an object in video, Korean Patent Laid-Open Publication No. 10-2012-0050342 (entitled "Video Object Detection Apparatus and Method", published on May 18, 2012) has been disclosed.

The present invention has been proposed to solve the above-mentioned problems of the previously proposed methods. An object of the present invention is to provide a real-time face tracking system using color-based face detection that detects the upper body of an object from an input image using a Histogram of Oriented Gradients (HOG) feature vector and a Support Vector Machine (SVM) classifier, and then detects the face within the detected upper body using the skin color information and Haar-like features of the HCbCr color space, which is insensitive to illumination changes and has a color distribution dense enough to detect skin color well, so that the face can be detected accurately from the input image while remaining robust to illumination changes.

Another object of the present invention is to provide a real-time face tracking system using color-based face detection that reduces the dimension of the data input to the pattern classifier using Principal Component Analysis (PCA) and (2D)^2 PCA, thereby reducing the unnecessary computation caused by high-dimensional data and recognizing more quickly and accurately whether a face detected from the input image is the tracking target.

According to an aspect of the present invention, there is provided a real-time face tracking system using color-based face detection, the system comprising:

a face detection module for detecting a face from an input image using skin color information of the HCbCr color space;

a face recognition module for recognizing whether the face detected through the face detection module is the tracking target; and

a face tracking module for tracking the face when the face recognized by the face recognition module is the tracking target.

Preferably, the face detection module includes:

A HOG feature vector extractor for extracting a HOG feature vector of the object from the input image;

an SVM classifier for detecting the upper body of the object in the input image based on the HOG feature vector extracted by the HOG feature vector extraction unit; and

a skin region detection unit for detecting a skin region using skin color information of the HCbCr color space within the upper body of the object detected through the SVM classifier,

wherein the eyes may be detected using the Haar-like feature in the skin region detected by the skin region detection unit, and the face may be detected based on the distance between the two detected eyes.

Preferably, the face recognition module includes:

a preprocessing unit for reducing the dimension of the face data detected by the face detection module using PCA and (2D)^2 PCA.

Preferably, the face recognition module

recognizes whether the face detected by the face detection module is the tracking target by using a fuzzy C-means (FCM)-based RBF neural network pattern classifier trained on a training image database,

wherein the FCM-based RBF neural network pattern classifier detects the upper body of the object in each training image input from the training image database through the HOG feature vector extraction unit and the SVM classifier, detects the face using the skin color information and Haar-like feature of the HCbCr color space, preprocesses the detected face data using PCA and (2D)^2 PCA, and can be trained by receiving the preprocessed data of each training image.

Preferably,

the face tracking module may track the face by combining a mean-shift algorithm and histogram back-projection when the face recognized by the face recognition module is the tracking target, and

may move on to the next input frame to detect and recognize a face again when the face recognized by the face recognition module is not the tracking target.

According to the real-time face tracking system using color-based face detection proposed in the present invention, the upper body of the object is detected from the input image using the HOG feature vector and the SVM classifier, and the face is then detected within the detected upper body using the skin color information and Haar-like features of the HCbCr color space, which is insensitive to illumination changes and has a dense color distribution suited to detecting skin color. The face can therefore be detected more accurately from the input image while remaining robust to illumination changes.

Also, according to the present invention, reducing the dimension of the data input to the pattern classifier using PCA and (2D)^2 PCA reduces the unnecessary computation caused by high-dimensional data, so that whether the detected face is the tracking target can be recognized more quickly and accurately.

BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of a real-time face tracking system using color-based face detection according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating the configuration of the face detection module in a real-time face tracking system using color-based face detection according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating the training process of the HOG feature vector extraction unit and the SVM classifier in a real-time face tracking system using color-based face detection according to an embodiment of the present invention.
FIG. 4 is a diagram showing the upper body of an object detected in an input image by the trained HOG feature vector extraction unit and SVM classifier in a real-time face tracking system using color-based face detection according to an embodiment of the present invention.
FIG. 5 is a diagram illustrating the process of detecting a skin region using skin color information of the HCbCr color space within the detected upper body in a real-time face tracking system using color-based face detection according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating the process of detecting the eyes using the Haar-like feature in the detected skin region and detecting the face based on the distance between the detected eyes in a real-time face tracking system using color-based face detection according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating the process of tracking a face in real time by combining the mean-shift algorithm and histogram back-projection in a real-time face tracking system using color-based face detection according to an embodiment of the present invention.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry out the present invention. In the following detailed description, descriptions of known functions and configurations will be omitted when they may obscure the subject matter of the present invention. The same or similar reference numerals are used throughout the drawings for parts having similar functions.

In addition, throughout the specification, when a part is referred to as being 'connected' to another part, this includes being 'indirectly connected' as well as 'directly connected'. Also, for a part to 'include' an element means that it may include other elements rather than excluding them, unless specifically stated otherwise.

FIG. 1 is a block diagram of a real-time face tracking system using color-based face detection according to an embodiment of the present invention. As shown in FIG. 1, a real-time face tracking system 10 using color-based face detection according to an embodiment of the present invention may include a face detection module 100, a face recognition module 200, and a face tracking module 300. Hereinafter, each component of the real-time face tracking system 10 using color-based face detection according to an embodiment of the present invention will be described in detail with reference to the drawings.

The face detection module 100 detects the face from the input image using the skin color information of the HCbCr color space. Each component of the face detection module 100 will be described with reference to FIG. 2.

FIG. 2 is a diagram illustrating the configuration of the face detection module in a real-time face tracking system using color-based face detection according to an embodiment of the present invention. As shown in FIG. 2, the face detection module 100 may include a HOG feature vector extraction unit 110, an SVM classifier 120, and a skin region detection unit 130.

The HOG feature vector extraction unit 110 extracts the HOG feature vector of the object from the input image, and the SVM classifier 120 detects the upper body of the object in the input image based on the HOG feature vector extracted by the HOG feature vector extraction unit 110. FIG. 3 is a diagram illustrating the training process of the HOG feature vector extraction unit and the SVM classifier in a real-time face tracking system using color-based face detection according to an embodiment of the present invention, and FIG. 4 is a diagram showing the upper body of an object detected in an input image by the trained HOG feature vector extraction unit and SVM classifier. As shown in FIG. 3, the HOG feature vector extraction unit 110 extracts HOG feature vectors from a large set of upper-body images and non-upper-body (background) images, and the SVM classifier 120 is trained to classify an input as an upper-body image or a non-upper-body (background) image based on the HOG feature vector extracted by the HOG feature vector extraction unit 110, so that upper-body regions can be detected in the input image. According to the embodiment, the trained HOG feature vector extraction unit 110 and SVM classifier 120 can detect the upper body of the object in the input image, as shown in FIG. 4.
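
To make this pipeline concrete, the following is a minimal Python/OpenCV sketch of a HOG-plus-linear-SVM upper-body detector of the kind described above; the window size, HOG parameters, sliding-window stride, and training crops are illustrative assumptions, not values disclosed in the patent.

```python
# Minimal sketch: train a linear SVM on HOG features of upper-body vs. background
# crops, then slide a detection window over the frame (assumed parameters throughout).
import cv2
import numpy as np
from sklearn.svm import LinearSVC

# 64x64 detection window with a standard HOG cell/block layout (assumed values).
hog = cv2.HOGDescriptor(_winSize=(64, 64), _blockSize=(16, 16),
                        _blockStride=(8, 8), _cellSize=(8, 8), _nbins=9)

def extract_hog(img):
    """Resize a crop to the detection window and return its HOG feature vector."""
    img = cv2.resize(img, (64, 64))
    return hog.compute(img).flatten()

def train_upper_body_svm(upper_body_crops, background_crops):
    """Train a linear SVM to separate upper-body crops from background crops."""
    X = [extract_hog(c) for c in upper_body_crops] + \
        [extract_hog(c) for c in background_crops]
    y = [1] * len(upper_body_crops) + [0] * len(background_crops)
    clf = LinearSVC(C=1.0)
    clf.fit(np.array(X), np.array(y))
    return clf

def detect_upper_body(frame, clf, win=64, stride=16):
    """Slide the detection window over the frame and keep the boxes the SVM accepts."""
    boxes = []
    for y in range(0, frame.shape[0] - win, stride):
        for x in range(0, frame.shape[1] - win, stride):
            feat = extract_hog(frame[y:y + win, x:x + win])
            if clf.predict(feat.reshape(1, -1))[0] == 1:
                boxes.append((x, y, win, win))
    return boxes
```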

The skin region detection unit 130 detects the skin region using the skin color information of the HCbCr color space within the upper body of the object detected by the SVM classifier 120. The HCbCr color space combines the H component of the HSV color space with the Cb and Cr components of the YCbCr color space in order to increase the accuracy of skin color detection. Hereinafter, the process by which the skin region detection unit 130 detects the skin region using the skin color information of the HCbCr color space within the upper body detected by the SVM classifier 120 will be described with reference to FIG. 5.

FIG. 5 is a diagram illustrating the process of detecting a skin region using skin color information of the HCbCr color space within the detected upper body in a real-time face tracking system using color-based face detection according to an embodiment of the present invention. As shown in FIG. 5 (b), the skin region detection unit 130 detects skin-colored pixels using the skin color information of the HCbCr color space within the upper body detected from the input image by the SVM classifier 120, and then removes noise by applying an opening operation, one of the morphological operations, so that the skin region can be detected as shown in FIG. 5 (d).
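
A minimal sketch of this skin-detection step is shown below, assuming illustrative threshold ranges for the H, Cb, and Cr channels; the patent does not disclose the exact ranges used.

```python
# Minimal sketch: build the HCbCr skin mask (H from HSV, Cb/Cr from YCrCb),
# threshold skin pixels, and clean the mask with a morphological opening.
import cv2
import numpy as np

def detect_skin_region(upper_body_bgr):
    hsv = cv2.cvtColor(upper_body_bgr, cv2.COLOR_BGR2HSV)
    ycrcb = cv2.cvtColor(upper_body_bgr, cv2.COLOR_BGR2YCrCb)
    h = hsv[:, :, 0]     # hue channel from HSV
    cr = ycrcb[:, :, 1]  # Cr channel from YCrCb (note OpenCV's Y, Cr, Cb order)
    cb = ycrcb[:, :, 2]  # Cb channel from YCrCb

    # Assumed skin ranges in the combined H/Cb/Cr space (illustrative only).
    skin = ((h <= 25) | (h >= 160)) & \
           (cr >= 133) & (cr <= 173) & \
           (cb >= 77) & (cb <= 127)
    mask = skin.astype(np.uint8) * 255

    # Morphological opening (erosion followed by dilation) removes small noise.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    return mask
```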

However, the skin region detected by the skin region detection unit 130 within the upper body detected through the SVM classifier 120 may include, in addition to the face, regions such as the neck or hands whose color is similar to the face's skin color. The face detection module 100 therefore detects the eyes using the Haar-like feature within the skin region detected by the skin region detection unit 130, and detects the face more accurately based on the distance between the two detected eyes. Hereinafter, the method of detecting the eyes using the Haar-like feature in the skin region detected by the skin region detection unit 130 and detecting the face based on the distance between the two detected eyes will be described with reference to FIG. 6.

FIG. 6 is a diagram illustrating the process of detecting the eyes using the Haar-like feature in the detected skin region and detecting the face based on the distance between the detected eyes in a real-time face tracking system using color-based face detection according to an embodiment of the present invention. Since the skin region detected by the skin region detection unit 130 may include not only the face but also the neck or hands, whose color is similar to the face's skin color, the eyes are detected using the Haar-like feature within the detected skin region, as shown in FIG. 6, and the face is detected based on the distance between the two detected eyes.
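
The following is a minimal sketch of this eye-based face localization, using OpenCV's stock Haar eye cascade inside the skin mask; the face-box proportions derived from the eye distance are assumptions for illustration only.

```python
# Minimal sketch: find eye candidates with a Haar cascade inside the skin mask,
# then estimate a face box from the distance between the two detected eyes.
import cv2
import numpy as np

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_face_from_eyes(gray, skin_mask):
    # Restrict the eye search to the detected skin region.
    masked = cv2.bitwise_and(gray, gray, mask=skin_mask)
    eyes = eye_cascade.detectMultiScale(masked, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) < 2:
        return None
    # Take the two largest candidates as the eye pair.
    eyes = sorted(eyes, key=lambda e: e[2] * e[3], reverse=True)[:2]
    (x1, y1, w1, h1), (x2, y2, w2, h2) = eyes
    c1 = (x1 + w1 / 2, y1 + h1 / 2)
    c2 = (x2 + w2 / 2, y2 + h2 / 2)
    d = np.hypot(c1[0] - c2[0], c1[1] - c2[1])   # distance between the eyes
    # Assumed proportions: face width ~2.5x the eye distance, height ~3x.
    cx, cy = (c1[0] + c2[0]) / 2, (c1[1] + c2[1]) / 2
    w, h = 2.5 * d, 3.0 * d
    x, y = int(cx - w / 2), int(cy - 0.8 * d)
    return x, y, int(w), int(h)
```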

The face recognition module 200 recognizes whether the face detected by the face detection module 100 is the tracking target. Here, the face recognition module 200 may be designed using an FCM-based RBF neural network pattern classifier trained on a large training image database, and it recognizes whether the face detected by the face detection module 100 is the tracking target. More specifically, the FCM-based RBF neural network pattern classifier detects the upper body of the object from each of the many training images through the HOG feature vector extraction unit 110 and the SVM classifier 120, detects the face using the skin color information and Haar-like feature of the HCbCr color space, preprocesses the detected face data using PCA and (2D)^2 PCA, and is trained with the preprocessed data of each training image. Based on the learned information, the classifier can recognize whether the face detected by the face detection module 100 is the tracking target. In addition, through the training process described above, optimal parameters including the fuzzification coefficient of the FCM-based RBF neural network pattern classifier, the polynomial form of the connection weights, and the number of nodes can be determined.
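
As a rough illustration of this kind of classifier, the sketch below uses fuzzy C-means to place the RBF centers and fits the output weights by least squares; it omits the polynomial connection weights and the parameter optimization described above, so it should be read as a simplified stand-in rather than the patent's classifier.

```python
# Simplified sketch of an FCM-based RBF network: FCM supplies the RBF centers,
# and the output weights are fitted by least squares on one-hot class targets.
import numpy as np

def fcm(X, n_clusters, m=2.0, n_iter=100):
    """Plain fuzzy C-means: returns cluster centers and the membership matrix U."""
    n = X.shape[0]
    U = np.random.dirichlet(np.ones(n_clusters), size=n)   # random initial memberships
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-10
        # Standard FCM membership update: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1)).
        U = 1.0 / (d ** (2 / (m - 1)) *
                   (1.0 / d ** (2 / (m - 1))).sum(axis=1, keepdims=True))
    return centers, U

def rbf_features(X, centers, sigma):
    """Gaussian RBF activations of every sample with respect to every center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma ** 2))

def train_fcm_rbf(X, y, n_clusters=10, sigma=1.0):
    """Fit the RBF output weights by least squares on one-hot targets."""
    centers, _ = fcm(X, n_clusters)
    Phi = rbf_features(X, centers, sigma)
    T = np.eye(int(y.max()) + 1)[y]                 # one-hot class targets
    W, *_ = np.linalg.lstsq(Phi, T, rcond=None)
    return centers, W

def predict_fcm_rbf(X, centers, W, sigma=1.0):
    return rbf_features(X, centers, sigma) @ W      # class scores; argmax gives label
```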

Important factors in real-time face tracking such as the present invention are a high face detection rate and a fast face recognition speed. However, because the face detected by the skin-color-based face detection method of the present invention uses the intensity values of the image as the classifier input, the size of the image fed to the classifier affects the classifier's training speed and recognition performance. According to the embodiment, when a face is detected through eye detection using the skin color information and Haar-like feature of the HCbCr color space in the face detection module 100 and the image is resized to 90x90, the face data produced by the face detection module 100 has 8,100 dimensions. If such high-dimensional data is used directly as the classifier input, the training speed and recognition performance of the classifier may suffer. To solve this problem, the face recognition module 200 of the present invention includes a preprocessing unit 210 that reduces the dimension of the face data detected by the face detection module 100 using PCA and (2D)^2 PCA. PCA is a representative linear-transformation feature extraction method that represents the input with fewer features than its original dimensionality by projecting the data onto eigenvectors obtained from its covariance matrix. (2D)^2 PCA is an extension of PCA: whereas conventional PCA flattens a two-dimensional image into a one-dimensional vector, (2D)^2 PCA reduces the dimension while preserving the two-dimensional structure of the image.
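
The sketch below illustrates the two preprocessing options with NumPy: standard PCA on flattened face vectors, and a (2D)^2 PCA variant that projects the face image from both the row and column directions without flattening it; the number of retained components is an assumed parameter.

```python
# Minimal sketch of the dimensionality-reduction preprocessing (assumed component counts).
import numpy as np

def pca_fit_transform(X, k):
    """X: (n_samples, n_features) flattened faces. Returns the k-D projections."""
    mean = X.mean(axis=0)
    Xc = X - mean
    # Principal directions via SVD of the centered data matrix.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def two_d_squared_pca_fit(images, k_rows, k_cols):
    """images: (n, H, W) face images. Returns the mean and row/column projections."""
    mean = images.mean(axis=0)
    A = images - mean
    # Column-direction covariance (as in 2DPCA) and row-direction covariance.
    G_col = np.einsum('nij,nik->jk', A, A) / len(A)   # (W, W): sum of A_n^T A_n
    G_row = np.einsum('nji,nki->jk', A, A) / len(A)   # (H, H): sum of A_n A_n^T
    _, vec_c = np.linalg.eigh(G_col)
    _, vec_r = np.linalg.eigh(G_row)
    X_proj = vec_c[:, -k_cols:]                        # top column eigenvectors
    Z_proj = vec_r[:, -k_rows:]                        # top row eigenvectors
    return mean, Z_proj, X_proj

def two_d_squared_pca_transform(img, mean, Z_proj, X_proj):
    # Reduced representation Z^T (A - mean) X: a (k_rows x k_cols) matrix.
    return Z_proj.T @ (img - mean) @ X_proj
```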

The face tracking module 300 tracks the face by combining a mean-shift algorithm and histogram back-projection when the face recognized by the face recognition module 200 is the tracking target.

In general, the mean-shift algorithm searches for and tracks an object using two probability density functions: the target model and the target candidate. Specifically, the similarity of the two probability density functions (the Bhattacharyya coefficient) is calculated, a vector is generated in the direction of increasing similarity, and the object is tracked by repeating this calculation and moving along the generated vector.

In addition, histogram back-projection quantifies how strongly each pixel color value in the current input image is represented in the object to be tracked. Given the histogram H_m of the tracking target and the color value I(x) at pixel x, the back-projection value W(x) is obtained by Equation (1); the mean-shift algorithm is then applied to the distribution of W(x), i.e., the probability that each pixel color value in the input image belongs to the tracking target, in order to track the target.

W(x) = H_m(I(x))        (1)

Here, H_m denotes the histogram of the tracking target, and I(x) denotes the color value at pixel x of the input image I.

More specifically, with the position of the object in the previous image frame as the initial position, the weighted average position of the pixel coordinates within the current search window is calculated by Equation (2), using the W(x) values obtained from Equation (1) as weights. The search window is then moved so that x_new obtained from Equation (2) becomes the center of the new search window, and the object can be tracked by repeating this process until convergence.

x_new = Σ_i x_i W(x_i) K(r_i) / Σ_i W(x_i) K(r_i)        (2)

Here, K is the kernel function, and r_i is the distance from the center of the current search window to the pixel x_i.

Hereinafter, the process of tracking a face by combining the mean-shift algorithm and histogram back-projection will be described with reference to FIG. 7.

FIG. 7 is a diagram illustrating the process of tracking a face in real time by combining the mean-shift algorithm and histogram back-projection in a real-time face tracking system using color-based face detection according to an embodiment of the present invention. According to the embodiment, when the target object to be tracked is determined in the input image, its histogram is stored as the target model, as shown in FIG. 7. Histogram back-projection is then applied to each input image to obtain W(x), which indicates how strongly each pixel color value is represented in the target model; the similarity (the Bhattacharyya coefficient) is computed from these probability values W(x), a vector is generated in the direction of increasing similarity, and the target object is tracked while moving along the generated vector.
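
A minimal OpenCV sketch of this tracking loop follows: it uses a hue histogram as the target model, cv2.calcBackProject for W(x), and cv2.meanShift for the window update of Equation (2); the histogram size and termination criteria are illustrative assumptions.

```python
# Minimal sketch: histogram back-projection plus mean-shift face tracking.
import cv2
import numpy as np

def build_target_model(frame_bgr, face_box):
    """Store the hue histogram of the face ROI as the target model H_m."""
    x, y, w, h = face_box
    roi = frame_bgr[y:y + h, x:x + w]
    hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    return hist

def track_face(frame_bgr, hist, window):
    """Advance the search window by one mean-shift update on the back-projection."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # W(x): how strongly each pixel's hue belongs to the target histogram.
    back_proj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    # Mean-shift moves the window toward the mode of the back-projection.
    term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    _, window = cv2.meanShift(back_proj, window, term)
    return window
```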

According to the embodiment, when the face recognized by the face recognition module 200 is not the tracking target, the system moves to the next input frame, detects a face with the face detection module 100, recognizes whether the detected face is the tracking target, tracks the face if it is the tracking target, and moves on to the next frame if it is not.

Whether tracking in the face tracking module 300 has succeeded can be determined based on the Euclidean-distance error. If the tracking error is less than or equal to a preset error threshold, tracking continues; if the error exceeds the threshold, tracking is stopped and the system moves to the next frame to detect the face again.
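
A small sketch of such a check follows, assuming an illustrative threshold value.

```python
# Minimal sketch: compare the Euclidean distance between the tracked window center
# and the detected face center against a preset threshold (assumed value).
import numpy as np

def tracking_ok(tracked_box, detected_box, threshold=30.0):
    tx, ty, tw, th = tracked_box
    dx, dy, dw, dh = detected_box
    tracked_center = np.array([tx + tw / 2, ty + th / 2])
    detected_center = np.array([dx + dw / 2, dy + dh / 2])
    error = np.linalg.norm(tracked_center - detected_center)   # Euclidean distance
    return error <= threshold
```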

As described above, according to the real-time face tracking system using color-based face detection proposed in the present invention, the upper body of the object is detected from the input image using the HOG feature vector and the SVM classifier, and the face is then detected within the detected upper body using the skin color information and Haar-like features of the HCbCr color space, which is insensitive to illumination changes and has a dense color distribution. The face can therefore be detected more accurately from the input image while being less affected by illumination changes.

Also, according to the present invention, reducing the dimension of the data input to the pattern classifier using PCA and (2D)^2 PCA reduces the unnecessary computation caused by high-dimensional data, so that whether the detected face is the tracking target can be recognized more quickly and accurately.

The present invention may be embodied in many other specific forms without departing from the spirit or essential characteristics of the invention.

10: Real-time face tracking system based on color-based face detection according to an embodiment of the present invention
100: Face detection module 110: HOG feature vector extraction unit
120: SVM classifier 130: skin area detection unit
200: face recognition module 300: face tracking module

Claims (5)

A real-time face tracking system, comprising:
a face detection module (100) for detecting a face from an input image using skin color information of the HCbCr color space;
a face recognition module (200) for recognizing whether the face detected through the face detection module (100) is the tracking target; and
a face tracking module (300) for tracking the face when the face recognized by the face recognition module (200) is the tracking target,
wherein the face detection module (100) comprises:
a HOG feature vector extraction unit 110 for extracting a Histogram of Oriented Gradients (hereinafter referred to as 'HOG') feature vector of the object from the input image;
a Support Vector Machine (hereinafter referred to as 'SVM') classifier 120 for detecting the upper body of the object in the input image based on the HOG feature vector extracted by the HOG feature vector extraction unit 110; and
a skin region detection unit 130 for detecting a skin region using skin color information of the HCbCr color space within the upper body of the object detected through the SVM classifier 120,
wherein the eyes are detected using the Haar-like feature in the skin region detected by the skin region detection unit 130, and the face is detected based on the distance between the two detected eyes,
wherein the HOG feature vector extraction unit 110 is trained to extract HOG feature vectors from a large set of upper-body images and non-upper-body (background) images,
wherein the SVM classifier 120 classifies an input image into an upper-body image or a non-upper-body (background) image based on the HOG feature vector extracted by the HOG feature vector extraction unit 110,
wherein the face recognition module 200
recognizes whether the face detected by the face detection module 100 is the tracking target by using a fuzzy C-means (hereinafter referred to as 'FCM')-based RBF neural network pattern classifier trained on a training image database, and
wherein the FCM-based RBF neural network pattern classifier
detects the upper body of the object in each training image input from the training image database through the HOG feature vector extraction unit 110 and the SVM classifier 120, detects the face within the detected upper body using the skin color information and Haar-like feature of the HCbCr color space, preprocesses the detected face data using PCA and (2D)^2 PCA, and is trained by receiving the preprocessed data of each training image.
delete
The real-time face tracking system of claim 1, wherein the face recognition module (200)
comprises a preprocessing unit (210) for reducing the dimension of the face data detected by the face detection module (100) using Principal Component Analysis (hereinafter referred to as 'PCA') and (2D)^2 PCA.
delete
The real-time face tracking system of claim 1, wherein
the face tracking module (300) tracks the face by combining a mean-shift algorithm and histogram back-projection when the face recognized by the face recognition module (200) is the tracking target, and
moves to the next input frame to detect and recognize a face when the face recognized by the face recognition module (200) is not the tracking target.
KR1020160039961A 2016-04-01 2016-04-01 Face tracking system using colar-based face detection method KR101653278B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160039961A KR101653278B1 (en) 2016-04-01 2016-04-01 Face tracking system using colar-based face detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160039961A KR101653278B1 (en) 2016-04-01 2016-04-01 Face tracking system using colar-based face detection method

Publications (1)

Publication Number Publication Date
KR101653278B1 true KR101653278B1 (en) 2016-09-01

Family

ID=56942757

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160039961A KR101653278B1 (en) 2016-04-01 2016-04-01 Face tracking system using colar-based face detection method

Country Status (1)

Country Link
KR (1) KR101653278B1 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107392187A (en) * 2017-08-30 2017-11-24 西安建筑科技大学 A kind of human face in-vivo detection method based on gradient orientation histogram
CN107943077A (en) * 2017-11-24 2018-04-20 歌尔股份有限公司 A kind of method for tracing, device and the unmanned plane of unmanned plane drop target
CN108520527A (en) * 2018-03-28 2018-09-11 东北大学 A kind of space-time context fast track method based on color attribute and PCA
CN108520261A (en) * 2018-03-01 2018-09-11 中国农业大学 A kind of recognition methods of peanut kernels quantity and device
KR101908481B1 (en) * 2017-07-24 2018-12-10 동국대학교 산학협력단 Device and method for pedestraian detection
CN109711414A (en) * 2018-12-19 2019-05-03 国网四川省电力公司信息通信公司 Equipment indicating lamp color identification method and system based on camera image acquisition
CN110032915A (en) * 2018-01-12 2019-07-19 杭州海康威视数字技术股份有限公司 A kind of human face in-vivo detection method, device and electronic equipment
CN110188624A (en) * 2019-05-10 2019-08-30 国网福建省电力有限公司龙岩供电公司 A kind of substation's wind drift recognition methods and system based on deep learning
CN110807402A (en) * 2019-10-29 2020-02-18 深圳市梦网百科信息技术有限公司 Facial features positioning method, system and terminal equipment based on skin color detection
CN112149578A (en) * 2020-09-24 2020-12-29 四川川大智胜软件股份有限公司 Face skin material calculation method, device and equipment based on face three-dimensional model
US11776239B2 (en) 2019-12-12 2023-10-03 Samsung Electronics Co., Ltd. Liveness test method and liveness test apparatus

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000048184A (en) * 1998-05-29 2000-02-18 Canon Inc Method for processing image, and method for extracting facial area and device therefor
KR101449744B1 (en) * 2013-09-06 2014-10-15 한국과학기술원 Face detection device and method using region-based feature
KR101589149B1 (en) * 2015-05-27 2016-02-03 수원대학교산학협력단 Face recognition and face tracking method using radial basis function neural networks pattern classifier and object tracking algorithm and system for executing the same

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000048184A (en) * 1998-05-29 2000-02-18 Canon Inc Method for processing image, and method for extracting facial area and device therefor
KR101449744B1 (en) * 2013-09-06 2014-10-15 한국과학기술원 Face detection device and method using region-based feature
KR101589149B1 (en) * 2015-05-27 2016-02-03 수원대학교산학협력단 Face recognition and face tracking method using radial basis function neural networks pattern classifier and object tracking algorithm and system for executing the same

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101908481B1 (en) * 2017-07-24 2018-12-10 동국대학교 산학협력단 Device and method for pedestraian detection
CN107392187B (en) * 2017-08-30 2020-08-11 西安建筑科技大学 Face in-vivo detection method based on gradient direction histogram
CN107392187A (en) * 2017-08-30 2017-11-24 西安建筑科技大学 A kind of human face in-vivo detection method based on gradient orientation histogram
CN107943077A (en) * 2017-11-24 2018-04-20 歌尔股份有限公司 A kind of method for tracing, device and the unmanned plane of unmanned plane drop target
CN110032915A (en) * 2018-01-12 2019-07-19 杭州海康威视数字技术股份有限公司 A kind of human face in-vivo detection method, device and electronic equipment
CN108520261A (en) * 2018-03-01 2018-09-11 中国农业大学 A kind of recognition methods of peanut kernels quantity and device
CN108520261B (en) * 2018-03-01 2021-06-18 中国农业大学 Method and device for identifying peanut kernel number
CN108520527A (en) * 2018-03-28 2018-09-11 东北大学 A kind of space-time context fast track method based on color attribute and PCA
CN109711414A (en) * 2018-12-19 2019-05-03 国网四川省电力公司信息通信公司 Equipment indicating lamp color identification method and system based on camera image acquisition
CN110188624A (en) * 2019-05-10 2019-08-30 国网福建省电力有限公司龙岩供电公司 A kind of substation's wind drift recognition methods and system based on deep learning
CN110807402A (en) * 2019-10-29 2020-02-18 深圳市梦网百科信息技术有限公司 Facial features positioning method, system and terminal equipment based on skin color detection
CN110807402B (en) * 2019-10-29 2023-08-08 深圳市梦网视讯有限公司 Facial feature positioning method, system and terminal equipment based on skin color detection
US11776239B2 (en) 2019-12-12 2023-10-03 Samsung Electronics Co., Ltd. Liveness test method and liveness test apparatus
CN112149578A (en) * 2020-09-24 2020-12-29 四川川大智胜软件股份有限公司 Face skin material calculation method, device and equipment based on face three-dimensional model
CN112149578B (en) * 2020-09-24 2024-05-24 四川川大智胜软件股份有限公司 Face skin material calculation method, device and equipment based on face three-dimensional model

Similar Documents

Publication Publication Date Title
KR101653278B1 (en) Face tracking system using colar-based face detection method
CN107330920B (en) Monitoring video multi-target tracking method based on deep learning
Ge et al. Real-time pedestrian detection and tracking at nighttime for driver-assistance systems
US7957560B2 (en) Unusual action detector and abnormal action detecting method
Davis et al. A two-stage template approach to person detection in thermal imagery
Zhu et al. Robust real-time eye detection and tracking under variable lighting conditions and various face orientations
US20180341803A1 (en) Information processing apparatus, information processing method, and storage medium
KR101764845B1 (en) A video surveillance apparatus for removing overlap and tracking multiple moving objects and method thereof
US6661907B2 (en) Face detection in digital images
Ogale A survey of techniques for human detection from video
CN109145742A (en) A kind of pedestrian recognition method and system
Tsintotas et al. DOSeqSLAM: Dynamic on-line sequence based loop closure detection algorithm for SLAM
Dewangan et al. Real time object tracking for intelligent vehicle
Liu et al. Smoke-detection framework for high-definition video using fused spatial-and frequency-domain features
Campos et al. Discrimination of abandoned and stolen object based on active contours
Abedin et al. Traffic sign recognition using surf: Speeded up robust feature descriptor and artificial neural network classifier
Mao et al. Training a scene-specific pedestrian detector using tracklets
Ramezani et al. A new DSWTS algorithm for real-time pedestrian detection in autonomous agricultural tractors as a computer vision system
Hsiao et al. EfficientNet based iris biometric recognition methods with pupil positioning by U-net
CN108596057B (en) Information security management system based on face recognition
Wang et al. A two-layer night-time vehicle detector
El-Said Shadow aware license plate recognition system
Zhang et al. A novel efficient method for abnormal face detection in ATM
Xu et al. Efficient eye states detection in real-time for drowsy driving monitoring system
Kang et al. Real-time pedestrian detection using support vector machines

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant