CN109766838A - Gait cycle detection method based on a convolutional neural network - Google Patents

Gait cycle detection method based on a convolutional neural network

Info

Publication number
CN109766838A
Authority
CN
China
Prior art keywords
gait
frame
neural networks
convolutional neural
gait cycle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910026947.2A
Other languages
Chinese (zh)
Other versions
CN109766838B (en)
Inventor
王科俊
丁欣楠
李伊龙
周石冰
徐怡博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Engineering University
Original Assignee
Harbin Engineering University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Engineering University filed Critical Harbin Engineering University
Priority to CN201910026947.2A
Publication of CN109766838A
Application granted
Publication of CN109766838B
Legal status: Expired - Fee Related

Abstract

The present invention provides a gait cycle detection method based on a convolutional neural network. A gait video is preprocessed with image preprocessing operations including video decoding, pedestrian silhouette extraction, and centroid normalization; a convolutional neural network for extracting the gait cycle feature is trained; the gait video frame sequence to be measured is fed into the convolutional neural network, the output waveform is filtered, and a gait cycle is obtained by determining the positions of adjacent wave crests and troughs. The method is highly robust to changes in view angle, clothing, and carried objects, and solves the problem that the gait cycle is difficult to detect under front and back views. The method of the present invention is significant for improving gait recognition accuracy in complex environments and can serve as the front end of gait recognition, for identification in security monitoring, human-computer interaction, medical diagnosis, access control systems, and the like.

Description

Gait cycle detection method based on a convolutional neural network
Technical field
The invention belongs to the field of computer vision, and in particular relates to a gait cycle detection method based on a convolutional neural network.
Background art
Compared with other biometric recognition modalities, gait recognition can complete data collection and remote identity recognition without the subject's awareness. Gait cycle detection is an unavoidable step in gait recognition: a gait recognition algorithm with good discrimination is built upon accurately segmented, complete gait cycles. Moreover, because gait recognition is covert, data acquisition is highly random; the pedestrian's direction relative to the camera and the pedestrian's clothing can both be arbitrary, which increases the difficulty of cycle detection.
The development of gait cycle detection techniques has accompanied the development of gait recognition. Most existing methods perform gait cycle detection using pedestrian width as the feature. The document (Silhouette-Based Human Identification from Body Shape and Gait. IEEE International Conference on Automatic Face and Gesture Recognition. 2002, 366-372) was an early proposal to perform period detection using body width and height features, but this method is strongly affected by the distance between the pedestrian and the camera. The document (Gait recognition with transient binary patterns. Visual Communication and Image Representation. 2015, 33(C), 69-77) proposed on this basis to use a normalized single silhouette width feature for gait cycle detection, but this method has difficulty working under front and back views. The document (Silhouette Analysis-Based Gait Recognition for Human Identification. Pattern Analysis & Machine Intelligence, IEEE Transactions on, 2003, 25(12): 1505-1518) proposed using the aspect ratio of the gait silhouette for gait cycle detection, avoiding the influence of height normalization on pedestrian width. The document (Human Identification Using Temporal Information Preserving Gait Template. IEEE Transactions on Pattern Analysis & Machine Intelligence, 2012, 34(11): 2164-76) locates the representative frame positions in a complete gait sequence by extracting the mean lower-limb width of each frame image, avoiding the influence of carried objects on pedestrian width. On the whole, methods based on body width features can effectively detect the gait cycle at the 90° side view, but the error is very large at front and back views, where they may even fail to work. The document (The humanID gait challenge problem: data sets, performance, and analysis. IEEE Transactions on Pattern Analysis & Machine Intelligence, 2005, 27(2): 162-177) selects the two-dimensional area of the human body as the feature for cycle detection, but it has the same problem of very large error at front and back views. The document (Dual-ellipse fitting approach for robust gait periodicity detection. Neurocomputing, 2012, 79(3): 173-178) proposed a method based on dual-ellipse fitting: taking the centroid as the central point, the pedestrian silhouette is divided into left and right halves along the vertical direction of the image, and each half is fitted with an ellipse so that the pedestrian silhouette is tangent to the ellipse; the eccentricity of each ellipse is computed, and the sum of the eccentricities of the two fitted ellipses is used as the feature representing the periodicity of that frame. This method achieves a higher recognition rate at the 90° side view and at front and back views, but the error is large at oblique views such as 18° and 36°.
Deep learning has developed rapidly in recent years, and convolutional neural networks, as an effective image feature extraction tool, have been widely applied in the field of computer vision. Inspired by this, a gait cycle detection method based on a convolutional neural network is proposed herein: a convolutional neural network is trained to extract the gait cycle feature of each frame, and this feature is used to locate the position of the current frame within the gait cycle, completing gait cycle detection. The method has strong robustness and can detect an accurate gait cycle under different view angles, different clothing, and carried objects.
Summary of the invention
The object of the present invention is to provide a gait cycle detection method based on a convolutional neural network that has strong robustness and can detect an accurate gait cycle under different view angles, different clothing, and carried objects.
The object of the present invention is achieved as follows:
A gait cycle detection method based on a convolutional neural network, with the following concrete implementation steps:
Step 1. Preprocess the gait video with image preprocessing operations including video decoding, pedestrian silhouette extraction, and centroid normalization;
Step 2. Train a convolutional neural network for extracting the gait cycle feature;
Step 3. Feed the gait video frame sequence to be measured into the convolutional neural network; after filtering the output waveform, a gait cycle is obtained by determining the positions of adjacent wave crests and troughs.
Step 2 specifically comprises:
Step 2.1. Quantitatively represent the position of each video frame of the training set within a gait cycle, and mark it as the frame's label; the label value is calculated as
where L_i is the label value of the i-th frame of the gait video, N indicates that the gait cycle containing the i-th frame comprises N frames in total, and n indicates that the i-th frame is the n-th frame within its gait cycle;
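The formula for the label value is not reproduced in this text. A plausible reconstruction, assuming the period-1, amplitude-1 sinusoid described in the embodiment evaluated at the relative position n/N, is:

```latex
% Assumed reconstruction of the label formula (not verbatim from the source text):
L_i = \sin\!\left(\frac{2\pi n}{N}\right), \qquad n = 1, 2, \ldots, N
```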
Step 2.2. Feed the labeled video frames into the convolutional neural network to obtain output values;
Step 2.3. Compute the error between the output values and the labels, and train the network through multiple iterations of error back-propagation and stochastic gradient descent until the error no longer decreases after successive iterations; the error is calculated as
where m is the batch size of the network input, i.e. each batch contains m images, and ŷ is the network's estimate for the corresponding video frame label;
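The error formula is likewise not reproduced here. Since the embodiment names the mean squared error (MSE), a plausible reconstruction over a batch of m labeled frames is:

```latex
% Assumed MSE reconstruction; indexing over the batch is an assumption.
E = \frac{1}{m}\sum_{i=1}^{m}\left(\hat{y}_i - L_i\right)^{2}
```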
Step 2.4. Save and duplicate the trained convolutional neural network.
The convolutional neural network structure described in step 2 comprises multiple convolutional layers and at least one fully connected layer connected to the last convolutional layer; the last fully connected layer is connected to an output layer consisting of a single neuron.
Step 1 specifically includes:
Step 1.1. Split the video sequence into frames; the sequence after framing is an image sequence arranged in temporal order;
Step 1.2. Apply a grayscale transformation to the image sequence containing the pedestrian and to the background sequence, estimate the background of the entire sequence using the median method, and perform binarization to obtain the gait silhouette image,
D_k(x, y) = |I_k(x, y) - M_k(x, y)|
where I_k(x, y) is the gray value at pixel (x, y) of the k-th frame of the video sequence, M_k(x, y) is the background gray value at that pixel, D_k(x, y) is the background-difference image, and T is the selected binarization threshold;
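The binarization formula itself is not reproduced in this text. A plausible reconstruction of the thresholding step, assuming the usual convention of 1 for foreground pixels, is:

```latex
% Assumed binarization of the background-difference image with threshold T:
B_k(x, y) = \begin{cases} 1, & D_k(x, y) \geq T \\ 0, & D_k(x, y) < T \end{cases}
```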
Step 1.3. Silhouette normalization: uniformly scale all silhouettes to a consistent height. The input to pedestrian silhouette normalization is the content of the rectangle tangent to the pedestrian silhouette in each video frame. For all images of cropped silhouettes in the training set, traverse all image heights and take their ratio to a standard height. Let the standard height at a given view angle be H and let there be K frame images at that view; the height of each frame image in temporal order is h_k, k = 1, 2, ..., K, so the scaling factor of each frame image is a_k = h_k / H. Apply the corresponding scaling factor a_k to each frame image obtained at that view, using the bilinear interpolation algorithm,
f_a = f(x, y) + (f(x+1, y) - f(x, y)) × p
f_b = f(x, y+1) + (f(x+1, y+1) - f(x, y+1)) × p
where f(x, y) is the gray value at coordinate (x, y) before interpolation, and p and q are weights; a second linear interpolation is then performed, and the interpolation result at (x, y) is
g(x, y) = f_a + (f_b - f_a) × q
        = (1-p)(1-q) f(x, y) + p(1-q) f(x+1, y) + q(1-p) f(x, y+1) + pq f(x+1, y+1)
where g(x, y) is the gray value at coordinate (x, y) after interpolation.
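For illustration only, a minimal Python sketch of this silhouette normalization step is given below; the function name, the canvas width of 88 (taken from the embodiment's 128 × 88 frames), and the use of OpenCV's bilinear resize in place of a hand-written interpolation loop are assumptions, not part of the claimed method.

```python
# Hypothetical sketch of step 1.3: crop the rectangle tangent to the silhouette
# and scale it to a standard height H using bilinear interpolation.
import cv2
import numpy as np

def normalize_silhouette(binary_frame: np.ndarray, H: int = 128, W: int = 88) -> np.ndarray:
    ys, xs = np.nonzero(binary_frame)                  # foreground pixel coordinates
    if ys.size == 0:
        return np.zeros((H, W), dtype=np.uint8)        # empty frame: return blank canvas
    crop = binary_frame[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    h, w = crop.shape
    scale = H / h                                      # i.e. 1 / a_k, with a_k = h_k / H as in the text
    new_w = max(1, int(round(w * scale)))
    resized = cv2.resize(crop, (new_w, H), interpolation=cv2.INTER_LINEAR)  # bilinear interpolation
    canvas = np.zeros((H, W), dtype=np.uint8)          # fixed-size output frame
    rw = min(W, new_w)
    x0 = (W - rw) // 2                                 # center the silhouette horizontally
    canvas[:, x0:x0 + rw] = resized[:, :rw]
    return canvas
```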
The beneficial effects of the present invention are: the method is highly robust to changes in view angle, clothing, and carried objects, and solves the problem that the gait cycle is difficult to detect under front and back views. Since existing gait recognition techniques are mostly built on the basis of an accurately segmented gait cycle, the method of the present invention is significant for improving gait recognition accuracy in complex environments; it can serve as the front end of gait recognition and is suitable for identification in security monitoring, human-computer interaction, medical diagnosis, access control systems, and the like.
Brief description of the drawings
Fig. 1 is the flow chart of the invention.
Fig. 2 shows the effect of foreground-background separation.
Fig. 3 shows a gait cycle comprising 24 frames and its label values.
Fig. 4 shows an example of the output waveform and the filtered waveform.
Specific embodiment
The present invention will be further described below with reference to the accompanying drawings:
Embodiment 1
The method is illustrated using a large-scale gait recognition database as an example; the database contains the gait video sequences of 124 people, with 110 video segments per person, covering different view angles, clothing, and carried objects.
Step 1. Preprocess the gait video with image preprocessing operations including video decoding, pedestrian silhouette extraction, and centroid normalization. First, split the video sequence into frames; the sequence after framing is an ordered image sequence arranged in temporal order. Apply a grayscale transformation to the image sequence containing the pedestrian and to the background sequence. For an indoor environment with essentially constant illumination, estimate the background of the entire sequence using the median method, then subtract the background image from the foreground image and binarize the result to obtain the gait silhouette image. Let the gray value at pixel (x, y) of the k-th frame of the video sequence be I_k(x, y) and the background gray value at that pixel be M_k(x, y); then the background-difference image D_k(x, y) and the binarization result B_k(x, y) are respectively:
D_k(x, y) = |I_k(x, y) - M_k(x, y)|
where T is the selected binarization threshold; the process is shown in Fig. 2. Silhouette normalization uniformly scales all silhouettes to a consistent height, so that changes in the size of the gait silhouette sequence caused by the depth of field are avoided when the pedestrian's direction and distance relative to the camera change. For each video frame, a rectangular frame is made tangent to the four sides of the pedestrian silhouette, and the variously sized region enclosed by this frame serves as the input to pedestrian silhouette normalization. For all images of cropped silhouettes in the training set, traverse all image heights and take their ratio to a standard height. Let the standard height at a given view angle be H and let there be K frame images at that angle; the height of each frame image in temporal order is h_k, k = 1, 2, ..., K; then the scaling factor of each frame image is
a_k = h_k / H
Each frame image obtained at that view is scaled by its corresponding factor a_k, using the bilinear interpolation algorithm,
f_a = f(x, y) + (f(x+1, y) - f(x, y)) × p
f_b = f(x, y+1) + (f(x+1, y+1) - f(x, y+1)) × p
where f(x, y) is the gray value at coordinate (x, y) before interpolation; similarly, f(x, y+1) is the gray value at coordinate (x, y+1) before interpolation and f(x+1, y+1) is the gray value at coordinate (x+1, y+1) before interpolation; p and q are weights. A second linear interpolation is then applied to f_a and f_b to compute the interpolation result at (x, y):
g(x, y) = f_a + (f_b - f_a) × q
        = (1-p)(1-q) f(x, y) + p(1-q) f(x+1, y) + q(1-p) f(x, y+1) + pq f(x+1, y+1)
where g(x, y) is the gray value at coordinate (x, y) after interpolation. This completes the normalization operation, yielding normalized grayscale video frames of size 128 × 88.
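For illustration only, a minimal Python sketch of this preprocessing stage (median background estimation, background subtraction, thresholding, and normalization to 128 × 88) is given below. The threshold value is illustrative, and `normalize_silhouette` refers to the hypothetical helper sketched earlier.

```python
# Hypothetical sketch of the embodiment's preprocessing for one video sequence.
import numpy as np

def preprocess_sequence(gray_frames: np.ndarray, T: float = 30.0) -> np.ndarray:
    # gray_frames: array of shape (K, height, width) holding the grayscale frames
    background = np.median(gray_frames, axis=0)                  # median-method background M
    diff = np.abs(gray_frames.astype(np.float32) - background)   # D_k = |I_k - M_k|
    binary = (diff >= T).astype(np.uint8) * 255                  # threshold with T (assumed form)
    return np.stack([normalize_silhouette(b, H=128, W=88) for b in binary])
```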
Step 2. Train the convolutional neural network for extracting the gait cycle feature.
Step 2.1. Quantify each video frame according to its position within the gait cycle, and mark the result as its label;
Specifically, a sinusoidal signal with period 1 and amplitude 1 is chosen as the low-dimensional signal used to characterize the periodicity of the gait sequence. Within a gait silhouette sequence, the video frame in which the two legs are together and the right foot is about to step forward is defined as the starting position; after one full cycle of the sequence, the frame in which the two legs come together again and the right foot again tends to step forward is the end position of the cycle. This frame image can serve both as the end of one cycle and as the beginning of the next. Once the starting and ending positions have been located, the label values of the intermediate positions are calculated from the sine function; the label value of each frame in the training set is then:
where L_i is the label value of the i-th frame of the gait video, N indicates that the gait cycle containing this frame comprises N frames in total, and n indicates that this frame is the n-th frame within its gait cycle. Fig. 3 shows an example of the label values for a gait cycle comprising 24 frames.
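For illustration, a short Python snippet generating the label values of a 24-frame cycle is given below; it assumes the reconstructed label formula L_n = sin(2πn/N), since the exact formula is not reproduced in this text.

```python
# Hypothetical label generation for one N-frame gait cycle, assuming the
# period-1, amplitude-1 sinusoid L_n = sin(2*pi*n/N) as the label function.
import numpy as np

def cycle_labels(N: int = 24) -> np.ndarray:
    n = np.arange(1, N + 1)                 # frame index within the cycle
    return np.sin(2 * np.pi * n / N)        # one full sine period over the cycle

print(np.round(cycle_labels(24), 3))        # 24 label values, cf. Fig. 3
```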
Step 2.2. Feed the video frames into the convolutional neural network to obtain output values;
The labeled video frames in the training set are fed into the convolutional neural network. The structure of the convolutional neural network may, for example, be: the input is a 128 × 88 grayscale image; the first 6 layers are 3 combinations of a convolutional layer and a pooling layer. The first layer is a convolutional layer with 64 5 × 5 convolution kernels and stride 1; the second layer is a pooling layer with a 3 × 3 kernel and stride 2. The third layer is a convolutional layer with 64 3 × 3 convolution kernels and stride 1; the fourth layer is a pooling layer with a 3 × 3 kernel and stride 2. The fifth layer is a convolutional layer with 64 3 × 3 convolution kernels and stride 1; the sixth layer is a pooling layer with a 3 × 3 kernel and stride 2. The seventh, eighth, and ninth layers are fully connected layers with 1024, 256, and 1 nodes, respectively.
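For illustration only, a minimal PyTorch sketch of this example architecture is given below. The padding values, ReLU activations, pooling type, and class name are assumptions, since the text specifies only kernel sizes, strides, channel counts, and node counts.

```python
# Hypothetical PyTorch sketch of the 9-layer network described above.
import torch
import torch.nn as nn

class GaitPhaseCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 64, kernel_size=5, stride=1, padding=2),   # layer 1: 64 kernels, 5x5, stride 1
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),                  # layer 2: 3x3 pooling, stride 2
            nn.Conv2d(64, 64, kernel_size=3, stride=1, padding=1),  # layer 3: 64 kernels, 3x3, stride 1
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),                  # layer 4
            nn.Conv2d(64, 64, kernel_size=3, stride=1, padding=1),  # layer 5
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),                  # layer 6
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(1024),   # layer 7: 1024 nodes (input size inferred at first call)
            nn.ReLU(inplace=True),
            nn.Linear(1024, 256),  # layer 8: 256 nodes
            nn.ReLU(inplace=True),
            nn.Linear(256, 1),     # layer 9: single output neuron
        )

    def forward(self, x):  # x: (batch, 1, 128, 88) grayscale silhouette frames
        return self.regressor(self.features(x)).squeeze(-1)

if __name__ == "__main__":
    net = GaitPhaseCNN()
    out = net(torch.rand(4, 1, 128, 88))
    print(out.shape)  # torch.Size([4])
```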
Step 2.3. Compute the error between the output values and the labels, and train the network through multiple iterations of error back-propagation and stochastic gradient descent; the error is computed with the mean squared error (MSE)
where m is the batch size of the network input, i.e. each batch contains m images (video frames), and ŷ is the network's estimate for the corresponding video frame label, i.e. the network output. Training continues until the error no longer decreases after successive iterations.
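A minimal training-loop sketch for this step is given below, assuming the GaitPhaseCNN sketch above and a hypothetical DataLoader yielding (frame, label) pairs; the learning rate and number of epochs are illustrative and not taken from the text.

```python
# Hypothetical training loop: MSE error, back-propagation, stochastic gradient descent.
import torch
import torch.nn as nn

def train(net, train_loader, epochs=50, lr=0.01, device="cpu"):
    net.to(device)
    criterion = nn.MSELoss()                                   # mean squared error
    optimizer = torch.optim.SGD(net.parameters(), lr=lr)       # stochastic gradient descent
    for epoch in range(epochs):
        total = 0.0
        for frames, labels in train_loader:                    # frames: (m, 1, 128, 88), labels: (m,)
            frames, labels = frames.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(net(frames), labels)              # error between output and label
            loss.backward()                                    # error back-propagation
            optimizer.step()
            total += loss.item()
        print(f"epoch {epoch}: mean loss {total / len(train_loader):.4f}")
```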
Step 2.4. Save and duplicate the convolutional neural network trained in step 2.3; this yields a convolutional neural network for gait periodicity feature extraction and sine-function regression modeling;
Step 3. After the preprocessing of step 1, feed the gait video frame sequence to be measured into the convolutional neural network. Plot the waveform with the frame sequence as the horizontal axis and the network output as the vertical axis. After filtering the output waveform, a gait cycle is obtained by determining the positions of adjacent wave crests or wave troughs. Fig. 4 shows the waveform output by the network and the filtered waveform; the adjacent crests and troughs are easily obtained from the filtered waveform, and the gait cycle can thus be obtained.
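For illustration only, a minimal sketch of this step is given below. The smoothing filter (Savitzky-Golay) and its parameters are assumptions, since the text states only that the output waveform is filtered without naming the filter; `GaitPhaseCNN` and `detect_gait_cycle` are hypothetical names.

```python
# Hypothetical sketch of step 3: per-frame network output, filtering, peak/trough detection.
import numpy as np
import torch
from scipy.signal import savgol_filter, find_peaks

@torch.no_grad()
def detect_gait_cycle(net, frames):
    # frames: preprocessed tensor of shape (num_frames, 1, 128, 88)
    net.eval()
    wave = net(frames).cpu().numpy()                             # raw per-frame phase estimate
    smooth = savgol_filter(wave, window_length=9, polyorder=2)   # filtered waveform (assumed filter)
    peaks, _ = find_peaks(smooth)                                # wave crests
    troughs, _ = find_peaks(-smooth)                             # wave troughs
    # The frames between two adjacent crests (or troughs) form one gait cycle.
    if len(peaks) >= 2:
        return int(peaks[0]), int(peaks[1])
    if len(troughs) >= 2:
        return int(troughs[0]), int(troughs[1])
    return None
```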

Claims (4)

1. A gait cycle detection method based on a convolutional neural network, characterized in that the concrete implementation steps are as follows:
Step 1. Preprocessing the gait video with image preprocessing operations including video decoding, pedestrian silhouette extraction, and centroid normalization;
Step 2. Training a convolutional neural network for extracting the gait cycle feature;
Step 3. Feeding the gait video frame sequence to be measured into the convolutional neural network; after filtering the output waveform, obtaining a gait cycle by determining the positions of adjacent wave crests and troughs.
2. The gait cycle detection method based on a convolutional neural network according to claim 1, characterized in that step 2 specifically comprises:
Step 2.1. Quantitatively representing the position of each video frame of the training set within a gait cycle, and marking it as the frame's label, the label value being calculated as
where L_i is the label value of the i-th frame of the gait video, N indicates that the gait cycle containing the i-th frame comprises N frames in total, and n indicates that the i-th frame is the n-th frame within its gait cycle;
Step 2.2. Feeding the labeled video frames into the convolutional neural network to obtain output values;
Step 2.3. Computing the error between the output values and the labels, and training the network through multiple iterations of error back-propagation and stochastic gradient descent until the error no longer decreases after successive iterations, the error being calculated as
where m is the batch size of the network input, i.e. each batch contains m images, and ŷ is the network's estimate for the corresponding video frame label;
Step 2.4. Saving and duplicating the trained convolutional neural network.
3. The gait cycle detection method based on a convolutional neural network according to claim 1, characterized in that the convolutional neural network structure described in step 2 comprises multiple convolutional layers and at least one fully connected layer connected to the last convolutional layer, the last fully connected layer being connected to an output layer consisting of a single neuron.
4. The gait cycle detection method based on a convolutional neural network according to claim 1, characterized in that step 1 specifically includes:
Step 1.1. Splitting the video sequence into frames, the sequence after framing being an image sequence arranged in temporal order;
Step 1.2. Applying a grayscale transformation to the image sequence containing the pedestrian and to the background sequence, estimating the background of the entire sequence using the median method, and performing binarization to obtain the gait silhouette image,
D_k(x, y) = |I_k(x, y) - M_k(x, y)|
where I_k(x, y) is the gray value at pixel (x, y) of the k-th frame of the video sequence, M_k(x, y) is the background gray value at that pixel, D_k(x, y) is the background-difference image, and T is the selected binarization threshold;
Step 1.3. Silhouette normalization: uniformly scaling all silhouettes to a consistent height, the input to pedestrian silhouette normalization being the content of the rectangle tangent to the pedestrian silhouette in each video frame; for all images of cropped silhouettes in the training set, traversing all image heights and taking their ratio to a standard height; the standard height at a given view angle being H, with K frame images at that view, the height of each frame image in temporal order being h_k, k = 1, 2, ..., K, the scaling factor of each frame image being a_k = h_k / H; applying the corresponding scaling factor a_k to each frame image obtained at that view, using the bilinear interpolation algorithm,
f_a = f(x, y) + (f(x+1, y) - f(x, y)) × p
f_b = f(x, y+1) + (f(x+1, y+1) - f(x, y+1)) × p
where f(x, y) is the gray value at coordinate (x, y) before interpolation, and p and q are weights; performing a second linear interpolation, the interpolation result at (x, y) being
g(x, y) = f_a + (f_b - f_a) × q
        = (1-p)(1-q) f(x, y) + p(1-q) f(x+1, y) + q(1-p) f(x, y+1) + pq f(x+1, y+1)
where g(x, y) is the gray value at coordinate (x, y) after interpolation.
CN201910026947.2A 2019-01-11 2019-01-11 Gait cycle detection method based on convolutional neural network Expired - Fee Related CN109766838B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910026947.2A CN109766838B (en) 2019-01-11 2019-01-11 Gait cycle detection method based on convolutional neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910026947.2A CN109766838B (en) 2019-01-11 2019-01-11 Gait cycle detection method based on convolutional neural network

Publications (2)

Publication Number Publication Date
CN109766838A (en) 2019-05-17
CN109766838B CN109766838B (en) 2022-04-12

Family

ID=66453724

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910026947.2A Expired - Fee Related CN109766838B (en) 2019-01-11 2019-01-11 Gait cycle detection method based on convolutional neural network

Country Status (1)

Country Link
CN (1) CN109766838B (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180351775A1 (en) * 2012-12-05 2018-12-06 Origin Wireless, Inc. Method, apparatus, and system for wireless motion monitoring
CN107122707A (en) * 2017-03-17 2017-09-01 山东大学 Video pedestrian based on macroscopic features compact representation recognition methods and system again
CN108460340A (en) * 2018-02-05 2018-08-28 北京工业大学 A kind of gait recognition method based on the dense convolutional neural networks of 3D

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
URIEL MARTINEZ-HERNANDEZ ET AL.: "Recognition of Walking Activity and Prediction of Gait Periods with a CNN and First-Order MC Strategy", 《2018 7TH IEEE INTERNATIONAL CONFERENCE ON BIOMEDICAL ROBOTICS AND BIOMECHATRONICS (BIOROB)》 *
汤荣山 et al.: "Gait recognition method based on a convolutional neural network and incomplete gait cycles", 《通信技术》 (Communication Technology) *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110598540A (en) * 2019-08-05 2019-12-20 华中科技大学 Method and system for extracting gait contour map in monitoring video
CN110598540B (en) * 2019-08-05 2021-12-03 华中科技大学 Method and system for extracting gait contour map in monitoring video
CN110765925A (en) * 2019-10-18 2020-02-07 河南大学 Carrier detection and gait recognition method based on improved twin neural network
CN110765925B (en) * 2019-10-18 2023-05-09 河南大学 Method for detecting carrying object and identifying gait based on improved twin neural network
CN110796100A (en) * 2019-10-31 2020-02-14 浙江大华技术股份有限公司 Gait recognition method and device, terminal and storage device
CN110796100B (en) * 2019-10-31 2022-06-07 浙江大华技术股份有限公司 Gait recognition method and device, terminal and storage device
CN112989889A (en) * 2019-12-17 2021-06-18 中南大学 Gait recognition method based on posture guidance
CN112989889B (en) * 2019-12-17 2023-09-12 中南大学 Gait recognition method based on gesture guidance
CN112329716A (en) * 2020-11-26 2021-02-05 重庆能源职业学院 Pedestrian age group identification method based on gait characteristics

Also Published As

Publication number Publication date
CN109766838B (en) 2022-04-12

Similar Documents

Publication Publication Date Title
CN109766838A Gait cycle detection method based on a convolutional neural network
Niu et al. Unsupervised saliency detection of rail surface defects using stereoscopic images
CN110084156B (en) Gait feature extraction method and pedestrian identity recognition method based on gait features
CN105809693B (en) SAR image registration method based on deep neural network
CN108520216B (en) Gait image-based identity recognition method
CN101398886B (en) Rapid three-dimensional face identification method based on bi-eye passiveness stereo vision
CN108537191B (en) Three-dimensional face recognition method based on structured light camera
CN104063702B (en) Three-dimensional gait recognition based on shielding recovery and partial similarity matching
CN109409190A (en) Pedestrian detection method based on histogram of gradients and Canny edge detector
CN105277567B (en) A kind of fabric defects detection method
CN108764186A (en) Personage based on rotation deep learning blocks profile testing method
CN107451609A (en) Lung neoplasm image identification system based on depth convolutional neural networks
CN106875395A (en) Super-pixel level SAR image change detection based on deep neural network
CN107967695A (en) A kind of moving target detecting method based on depth light stream and morphological method
CN107633272B (en) DCNN texture defect identification method based on compressed sensing under small sample
CN103971329A (en) Cellular nerve network with genetic algorithm (GACNN)-based multisource image fusion method
Chunli et al. A behavior classification based on enhanced gait energy image
CN102880870B (en) The extracting method of face characteristic and system
Xie et al. Fabric defect detection method combing image pyramid and direction template
Han et al. An improved corner detection algorithm based on harris
CN111858997B (en) Cross-domain matching-based clothing template generation method
CN105550703A (en) Image similarity calculating method suitable for human body re-recognition
CN103824057A (en) Pig respiratory rate detection method based on area feature operator
CN109544575A (en) One kind being based on the matched method for reconstructing 3 D contour of ISAR sequence polygon
CN104574400A (en) Remote sensing image segmenting method based on local difference box dimension algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
Granted publication date: 20220412