CN107563359B - Face recognition heat analysis generation method for dense crowds - Google Patents

Face recognition heat analysis generation method for dense crowds

Info

Publication number
CN107563359B
CN107563359B
Authority
CN
China
Prior art keywords
image
face
human body
axis
factor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710912520.3A
Other languages
Chinese (zh)
Other versions
CN107563359A (en)
Inventor
Yang Xiaofan (杨晓凡)
Liu Yurong (刘玉蓉)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xinjiang Rongkun Technology Co.,Ltd.
Original Assignee
Chongqing City Intellectual Property Road Science And Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing City Intellectual Property Road Science And Technology Co Ltd filed Critical Chongqing City Intellectual Property Road Science And Technology Co Ltd
Priority to CN201710912520.3A priority Critical patent/CN107563359B/en
Publication of CN107563359A publication Critical patent/CN107563359A/en
Application granted granted Critical
Publication of CN107563359B publication Critical patent/CN107563359B/en
Current legal status: Active


Abstract

The present invention proposes a face recognition heat analysis generation method for dense crowds, comprising: S1, acquiring human-body feature images and facial feature images in a dense crowd through an image acquisition module, and establishing an initial screening judgment model so as to extract the attributes of persons entering the crowded region.

Description

Face recognition heat analysis generation method for dense crowds
Technical field
The present invention relates to the field of big data analysis, and more particularly to a face recognition heat analysis generation method for dense crowds.
Background technology
In today's society people move about frequently, and crowded areas such as shopping malls, stations and airports are equipped with large numbers of video surveillance devices. These devices, however, perform only simple image acquisition in the crowded area; the images are not subsequently classified or differentiated. Because crowding in social life is complex, the persons entering and leaving a crowded area, and the area itself, need to be rationally planned and correspondingly managed and configured, so that the catering, connecting transport and entrances of such areas can be reasonably arranged. After a large amount of image feature information has been obtained, the prior art cannot classify it, or classifies it inaccurately, so that no data-sample reference can be provided when the crowded area is later partitioned. Those skilled in the art therefore urgently need to solve this technical problem.
Invention content
The present invention aims at least to solve the technical problems existing in the prior art, and in particular innovatively proposes a face recognition heat analysis generation method for dense crowds.
In order to achieve the above object of the present invention, the present invention provides a face recognition heat analysis generation method for dense crowds, comprising:
S1: acquiring human-body feature images and facial feature images in a dense crowd through an image acquisition module, and establishing an initial screening judgment model so as to extract the attributes of persons entering the crowded region.
In the above face recognition heat analysis generation method for dense crowds, preferably, S1 comprises:
S1-1: assume that each user entering the crowded region is a new user; service staff and persons who pass in and out frequently are not considered in this model, because once a sufficient sample has been acquired their numbers are negligible. When a person leaves the crowded region, the corresponding person's authentication is set as ended. The human-body feature image and facial feature image in the image information obtained by the image acquisition module are judged: an image-data coordinate [x, y] is set for image acquisition, the coordinate [x, y] is taken as the base point (origin) of the image, and scanning weights are respectively set according to the coordinate [x, y],
where p is the person-count acquisition factor of the image, a square-root operation is carried out on the four directions of the [x, y] coordinate, n is a positive integer, n_valid is the decision threshold for the number of validly acquired individuals, and h(i, j) is the number of human-body feature images i and facial feature images j obtained in one direction;
S1-2: let the human-body feature image weight vector acquired in one direction be b_i = A(c−w)×(cw), where A is the probability-of-occurrence value of the body base feature c and the hand-carried-article feature w, and cw is the definition value of the joint appearance of the body base feature and the hand-carried-article feature; the facial feature image weight vector obtained in one direction is f_j = B×(CT), where B is the probability value of obtaining the facial feature, C is the facial expression feature set, and T is the statistic coefficient of the unit area in which the face is successfully recognized; where C = {smile, openmouth, downhead, uphead, weeping, halfface};
S1-3: to ensure the stability of the obtained information, multi-region samples are selected and calculated according to the vector values of b_i and f_j, and the image is then preliminarily screened by means of the preliminary screening formula, where λ_4 is the calculation parameter of the j-th facial expression set of the i-th combined human-body feature image in the image, β_4 is the matching parameter of the j-th facial expression set of the i-th combined human-body feature image in the image, L_i,j is the total number of appearances of the persons in the image, Q_i,j is the conditional probability value of the crowded region in the image during preliminary screening, σ²(i, j) is the extreme judgment parameter of the crowd density of the crowded region, and P_i,j is the historical people-count value of the crowded region;
S1-4: after the image features have passed the above preliminary screening, classification judgment is carried out, and model judgment is performed on the image data of the different facial expression feature sets C; the histogram of the valid human-body feature images is extracted and texture information is constructed to obtain each attribute value in the facial expression feature set:
smile attribute value C_smile = Σ_j λ_j·δ_xj·δ_yj, where δ_xj and δ_yj are the X-axis and Y-axis smile feature factors respectively;
open-mouth attribute value C_openmouth = Σ_j λ_j·τ_xj·τ_yj, where τ_xj and τ_yj are the X-axis and Y-axis open-mouth feature factors respectively;
head-down attribute value C_downhead = Σ_j λ_j·β_xj·β_yj, where β_xj and β_yj are the X-axis and Y-axis head-down feature factors respectively;
head-up attribute value C_uphead = Σ_j λ_j·ε_xj·ε_yj, where ε_xj and ε_yj are the X-axis and Y-axis head-up feature factors respectively;
weeping attribute value C_weeping is computed in the same way from the X-axis and Y-axis weeping feature factors;
side-face attribute value C_halfface = Σ_j λ_j·μ_xj·μ_yj, where μ_xj and μ_yj are the X-axis and Y-axis side-face feature factors respectively.
The preliminary screening is repeated until the generated repetition rate rises, whereupon steps S1-1 to S1-3 are ended.
In conclusion, by adopting the above technical solution, the beneficial effects of the invention are as follows:
After acquiring the images, the present invention classifies the persons entering and leaving the crowded region according to their facial information and the differences in their build and clothing, so that the corresponding auxiliary facilities of the crowded region can be perfected. Classifying with this classifier model consumes few system resources and saves time overhead, thereby providing a reasonable configuration scheme for crowded regions and facilitating the dispersal and redistribution of persons.
Additional aspects and advantages of the present invention will be set forth in part in the following description, and in part will become apparent from the following description or be learned through practice of the invention.
Description of the drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily understood from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a general schematic diagram of the present invention.
Specific implementation mode
Embodiments of the present invention are described in detail below; examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary, are intended only to explain the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, it is to be understood that the orientations or positional relationships indicated by terms such as "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" are based on the orientations or positional relationships shown in the drawings, are used merely for convenience and simplicity of description, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; they are therefore not to be understood as limiting the present invention.
In the description of the present invention, unless otherwise specified and limited, it should be noted that the terms "mounted", "connected" and "coupled" are to be understood broadly: the connection may, for example, be mechanical or electrical, may be an internal connection between two elements, may be direct, or may be indirect via an intermediary; for those of ordinary skill in the art, the specific meaning of these terms can be understood according to the specific circumstances.
As shown in Fig. 1, the method of the present invention comprises the following steps:
S1: acquiring human-body feature images and facial feature images in a dense crowd through an image acquisition module, and establishing an initial screening judgment model so as to extract the attributes of persons entering the crowded region;
S2: after the human-body feature images and facial feature images have been judged according to the initial screening judgment model, matching and acquiring again the persons leaving the crowded region, and distinguishing with a classifier the regions reached, or the corresponding nodes left, by the corresponding dense crowd, so as to push the result to a terminal.
S1-1: assume that each user entering the crowded region is a new user; service staff and persons who pass in and out frequently are not considered in this model, because once a sufficient sample has been acquired their numbers are negligible. When a person leaves the crowded region, the corresponding person's authentication is set as ended. The human-body feature image and facial feature image in the image information obtained by the image acquisition module are judged: an image-data coordinate [x, y] is set for image acquisition, the coordinate [x, y] is taken as the base point (origin) of the image, and scanning weights are respectively set according to the coordinate [x, y],
where p is the person-count acquisition factor of the image, a square-root operation is carried out on the four directions of the [x, y] coordinate, n is a positive integer, n_valid is the decision threshold for the number of validly acquired individuals, and h(i, j) is the number of human-body feature images i and facial feature images j obtained in one direction;
S1-2: let the human-body feature image weight vector acquired in one direction be b_i = A(c−w)×(cw), where A is the probability-of-occurrence value of the body base feature c and the hand-carried-article feature w, and cw is the definition value of the joint appearance of the body base feature and the hand-carried-article feature; the facial feature image weight vector obtained in one direction is f_j = B×(CT), where B is the probability value of obtaining the facial feature, C is the facial expression feature set, and T is the statistic coefficient of the unit area in which the face is successfully recognized; where C = {smile, openmouth, downhead, uphead, weeping, halfface};
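The per-orientation weight vectors of step S1-2 can be sketched as follows. This is an illustrative reading only, not the claimed implementation: the interpretation of "(cw)" as the product c·w, and the reduction of the expression set C to a count of detected members, are assumptions where the source is ambiguous.

```python
# Illustrative sketch of S1-2 (assumptions noted inline; all numeric values are examples).

EXPRESSIONS = ["smile", "openmouth", "downhead", "uphead", "weeping", "halfface"]

def body_weight(A: float, c: float, w: float) -> float:
    """b_i = A*(c - w)*(c*w): A is the probability-of-occurrence value of the
    body base feature c with the hand-carried-article feature w; the source's
    '(cw)' is read here as the joint-appearance value c*w (assumption)."""
    return A * (c - w) * (c * w)

def face_weight(B: float, T: float, expression_hits: dict) -> float:
    """f_j = B*(C*T): B is the probability of obtaining the facial feature,
    T the per-unit-area statistic of successful recognition; C is reduced to
    the count of detected expression-set members (assumption)."""
    C = sum(1 for e in EXPRESSIONS if expression_hits.get(e))
    return B * C * T

b = body_weight(A=0.8, c=0.9, w=0.3)
f = face_weight(B=0.7, T=0.5, expression_hits={"smile": True, "uphead": True})
```

One weight pair would be computed per scanning direction, matching the per-orientation acquisition described in S1-1.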
S1-3: to ensure the stability of the obtained information, multi-region samples are selected and calculated according to the vector values of b_i and f_j, and the image is then preliminarily screened by means of the preliminary screening formula, where λ_4 is the calculation parameter of the j-th facial expression set of the i-th combined human-body feature image in the image, β_4 is the matching parameter of the j-th facial expression set of the i-th combined human-body feature image in the image, L_i,j is the total number of appearances of the persons in the image, Q_i,j is the conditional probability value of the crowded region in the image during preliminary screening, σ²(i, j) is the extreme judgment parameter of the crowd density of the crowded region, and P_i,j is the historical people-count value of the crowded region;
S1-4: after the image features have passed the above preliminary screening, classification judgment is carried out, and model judgment is performed on the image data of the different facial expression feature sets C; the histogram of the valid human-body feature images is extracted and texture information is constructed to obtain each attribute value in the facial expression feature set:
smile attribute value C_smile = Σ_j λ_j·δ_xj·δ_yj, where δ_xj and δ_yj are the X-axis and Y-axis smile feature factors respectively;
open-mouth attribute value C_openmouth = Σ_j λ_j·τ_xj·τ_yj, where τ_xj and τ_yj are the X-axis and Y-axis open-mouth feature factors respectively;
head-down attribute value C_downhead = Σ_j λ_j·β_xj·β_yj, where β_xj and β_yj are the X-axis and Y-axis head-down feature factors respectively;
head-up attribute value C_uphead = Σ_j λ_j·ε_xj·ε_yj, where ε_xj and ε_yj are the X-axis and Y-axis head-up feature factors respectively;
weeping attribute value C_weeping is computed in the same way from the X-axis and Y-axis weeping feature factors;
side-face attribute value C_halfface = Σ_j λ_j·μ_xj·μ_yj, where μ_xj and μ_yj are the X-axis and Y-axis side-face feature factors respectively.
The preliminary screening is repeated until the generated repetition rate rises, whereupon steps S1-1 to S1-3 are ended;
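The S1-4 attribute values all share the form C_attr = Σ_j λ_j · (X-axis factor)_j · (Y-axis factor)_j. A minimal sketch, with the per-face coefficient λ_j taken as an assumption where the source formula is garbled:

```python
# Illustrative sketch of the S1-4 expression attribute values:
# C_attr = sum over faces j of coef_j * x_factor_j * y_factor_j.

def attribute_value(coefs, x_factors, y_factors):
    """C_attr = sum_j coefs[j] * x_factors[j] * y_factors[j]."""
    return sum(l * x * y for l, x, y in zip(coefs, x_factors, y_factors))

# e.g. smile attribute over three detected faces (example values)
c_smile = attribute_value([1.0, 0.5, 0.2], [0.9, 0.4, 0.7], [0.8, 0.6, 0.5])
```

The same function serves every attribute (smile, open-mouth, head-down, head-up, weeping, side-face) by substituting the corresponding axis factors.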
S2-1: the image data of the entire crowded region is divided to form the frame sequence pairs (M_1, M_2), (M_2, M_3), ..., (M_{n-1}, M_n); the hand-held object boundary of the human-body feature image is located, starting from the head portion of the initial frame of the video image; the entry/exit boundary of a certain human-body feature image is located, the corresponding position of the crowded region in which the human-body feature image appears is searched from the tail of the video image, and the position where the human-body feature image appears, its dwell time, and whether the person is shopping or holding an article are judged;
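The frame-pair sequence of step S2-1 is simply the sequence of consecutive frames paired with their successors, which can be sketched as:

```python
# Minimal sketch of the S2-1 frame-pair sequence (M1,M2),(M2,M3),...,(Mn-1,Mn)
# built from the ordered frames of the region's video.

def frame_pairs(frames):
    """Return consecutive frame pairs for change-degree comparison."""
    return list(zip(frames, frames[1:]))

pairs = frame_pairs(["M1", "M2", "M3", "M4"])
# -> [("M1", "M2"), ("M2", "M3"), ("M3", "M4")]
```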
S2-2: by comparative capture of the frame pairs, the change degree of the human-body feature image and facial feature image between the preceding and following video frames is judged,
where |E_i,j L_n + E_i,j M_n| is the similarity between the query feature L_n to be matched and the frame image M_n, E represents the number of matched images in the crowded region, S represents the interference set affecting the human-body feature images and facial feature images, s and t are positive integers taking different values, with minimum value 1 and maximum value the number of matched human-body feature images and facial feature images in the matched image-feature map; ω_i,j is the weight of the total number of correlation matches of the facial expression feature set C, K_i is the penalty factor for erroneous matching of human-body feature images in the crowded region, and z and d respectively represent the collection set of the human-body feature images and the collection set of the frames below the human-body feature images;
The change degree is matched with the position of the crowded region in which the corresponding image acquisition module is located, to obtain the positive-correlation conditional function of the crowded-region position and the change degree,
where Y(x, y) and Z(x, y) respectively denote the missing interaction relationship between the human-body feature image and the facial feature image at coordinate point (x, y), η_i and σ_j respectively denote the human-body feature image judgment threshold and the facial feature image judgment threshold, both positive numbers in the open interval (0, 1), and r_x,y denotes the similarity judgment factor of the human-body feature image and facial feature image at coordinate position (x, y);
S2-3: according to the defined association relationship between the human-body feature image and facial feature image of each individual, the query relevance and the data relevance are ranked according to the association relationship to generate non-dominated individual sets of different relevance grades, ordered from small to large serial-number grade according to the number of non-dominated individuals in the human-body feature image and facial feature image grades; if no correlated image matching any feature of the human-body feature image and facial feature image is obtained at the exit of each crowded region, step S2-1 is executed; if a correlated image is obtained at the corresponding crowded-region position and a feature mark is made at the corresponding position, step S2-4 is executed;
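The "non-dominated individual sets of different relevance grades" in S2-3 can be read as standard Pareto ranking over the (query relevance, data relevance) pair. A hedged sketch under that reading, which is an assumption about the source's intent:

```python
# Illustrative sketch of S2-3 relevance grading as successive non-dominated
# fronts over (query_relevance, data_relevance) pairs (Pareto ranking,
# maximization in both coordinates -- an assumed reading of the source).

def pareto_grades(points):
    """Split (q, d) relevance pairs into successive non-dominated fronts."""
    remaining = list(points)
    grades = []
    while remaining:
        # a point is in the current front if no other point weakly dominates
        # it while being strictly better in at least one coordinate
        front = [p for p in remaining
                 if not any(o != p and o[0] >= p[0] and o[1] >= p[1]
                            and (o[0] > p[0] or o[1] > p[1]) for o in remaining)]
        grades.append(front)
        remaining = [p for p in remaining if p not in front]
    return grades

grades = pareto_grades([(1, 1), (2, 2), (1, 2)])
# grade 1 holds the most relevant (non-dominated) individuals
```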
S2-4: a crowded-region log is set up, the attribute information of the crowded region is extracted according to the user's demand, and similarity calculation is carried out: the query similarity is calculated using human-body feature image similarity and using facial feature image similarity, until the log similarity and the query similarity converge; the matching weight α is used to balance the default human-body feature image and facial feature image correlation against the user-defined relevance, giving the weighing result value D[i, j] = maxF_i,j (1−α)·P(i, j) + α·P(i, j, r_x,y) + minF_i,j, where maxF_i,j is the maximum value of the change degree of the human-body feature image and facial feature image, minF_i,j is the minimum value of that change degree, P(i, j) is the initial-decision value of the crowded region, P(i, j, r_x,y) is the result-decision value of the crowded region, and r_x,y denotes the similarity judgment factor of the human-body feature image and facial feature image at coordinate position (x, y); the initial-decision value is the initial judgment of the crowded region made from historical feature image data, and the result-decision value is the judgment value optimized after the judgment of S2-1 to S2-4.
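The weighing result value of S2-4 can be sketched as follows, reading the run-on source formula as a sum of its four printed terms (an assumption; the source does not make the operator between maxF_i,j and the α-weighted terms explicit):

```python
# Illustrative sketch of the S2-4 weighted relevance score
# D[i,j] = maxF_ij + (1-alpha)*P(i,j) + alpha*P(i,j,r_xy) + minF_ij
# (sum-of-terms reading is an assumption); alpha is the matching weight
# balancing the default correlation against the user-defined relevance.

def relevance(max_f: float, min_f: float,
              p_initial: float, p_result: float, alpha: float) -> float:
    """Blend initial-decision and result-decision values, bracketed by the
    extremes of the observed change degree."""
    return max_f + (1.0 - alpha) * p_initial + alpha * p_result + min_f

d = relevance(max_f=0.9, min_f=0.1, p_initial=0.6, p_result=0.8, alpha=0.3)
```

Raising α shifts the score toward the user-defined (result) decision value, which matches the text's description of α as the balancing weight.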
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "an example", "a specific example" or "some examples" means that specific features, structures, materials or characteristics described in connection with the embodiment or example are included in at least one embodiment or example of the present invention. In this specification, schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the specific features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and variations can be made to these embodiments without departing from the principle and purpose of the present invention; the scope of the invention is defined by the claims and their equivalents.

Claims (1)

1. A face recognition heat analysis generation method for dense crowds, characterized by comprising:
S1: acquiring human-body feature images and facial feature images in a dense crowd through an image acquisition module, and establishing an initial screening judgment model so as to extract the attributes of persons entering the crowded region;
wherein S1 comprises:
S1-1: assuming that each user entering the crowded region is a new user, wherein service staff and persons who pass in and out frequently are not considered in this model, because once a sufficient sample has been acquired their numbers are negligible; when a person leaves the crowded region, the corresponding person's authentication is set as ended; judging the human-body feature image and facial feature image in the image information obtained by the image acquisition module; setting an image-data coordinate [x, y] for image acquisition, taking the coordinate [x, y] as the base point (origin) of the image, and respectively setting scanning weights according to the coordinate [x, y],
where p is the person-count acquisition factor of the image, a square-root operation is carried out on the four directions of the [x, y] coordinate, n is a positive integer, n_valid is the decision threshold for the number of validly acquired individuals, and h(i, j) is the number of human-body feature images i and facial feature images j obtained in one direction;
S1-2: letting the human-body feature image weight vector acquired in one direction be b_i = A(c−w)×(cw), where A is the probability-of-occurrence value of the body base feature c and the hand-carried-article feature w, and cw is the definition value of the joint appearance of the body base feature and the hand-carried-article feature; the facial feature image weight vector obtained in one direction is f_j = B×(CT), where B is the probability value of obtaining the facial feature, C is the facial expression feature set, and T is the statistic coefficient of the unit area in which the face is successfully recognized; where C = {smile, openmouth, downhead, uphead, weeping, halfface};
S1-3: to ensure the stability of the obtained information, selecting and calculating multi-region samples according to the vector values of b_i and f_j, and then preliminarily screening the image by means of the preliminary screening formula, where λ_4 is the calculation parameter of the j-th facial expression set of the i-th combined human-body feature image in the image, β_4 is the matching parameter of the j-th facial expression set of the i-th orientation human-body feature image in the image, L_i,j is the total number of appearances of the persons in the image, Q_i,j is the conditional probability value of the crowded region in the image during preliminary screening, σ²(i, j) is the extreme judgment parameter of the crowd density of the crowded region, and P_i,j is the historical people-count value of the crowded region;
S1-4: carrying out classification judgment on the image features after the preliminary screening of S1-3, and performing model judgment on the image data of the different facial expression feature sets C; extracting the histogram of the valid human-body feature images and constructing texture information to obtain each attribute value in the facial expression feature set:
smile attribute value C_smile = Σ_j λ_j·δ_xj·δ_yj, where δ_xj and δ_yj are the X-axis and Y-axis smile feature factors respectively;
open-mouth attribute value C_openmouth = Σ_j λ_j·τ_xj·τ_yj, where τ_xj and τ_yj are the X-axis and Y-axis open-mouth feature factors respectively;
head-down attribute value C_downhead = Σ_j λ_j·β_xj·β_yj, where β_xj and β_yj are the X-axis and Y-axis head-down feature factors respectively;
head-up attribute value C_uphead = Σ_j λ_j·ε_xj·ε_yj, where ε_xj and ε_yj are the X-axis and Y-axis head-up feature factors respectively;
weeping attribute value C_weeping is computed in the same way from the X-axis and Y-axis weeping feature factors;
side-face attribute value C_halfface = Σ_j λ_j·μ_xj·μ_yj, where μ_xj and μ_yj are the X-axis and Y-axis side-face feature factors respectively;
repeating the preliminary screening until the generated repetition rate rises, whereupon steps S1-1 to S1-3 are ended.
CN201710912520.3A 2017-09-29 2017-09-29 Face recognition heat analysis generation method for dense crowds Active CN107563359B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710912520.3A CN107563359B (en) 2017-09-29 2017-09-29 Face recognition heat analysis generation method for dense crowds

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710912520.3A CN107563359B (en) 2017-09-29 2017-09-29 Face recognition heat analysis generation method for dense crowds

Publications (2)

Publication Number Publication Date
CN107563359A CN107563359A (en) 2018-01-09
CN107563359B true CN107563359B (en) 2018-09-11

Family

ID=60984621

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710912520.3A Active CN107563359B (en) 2017-09-29 2017-09-29 Face recognition heat analysis generation method for dense crowds

Country Status (1)

Country Link
CN (1) CN107563359B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110889315B (en) * 2018-09-10 2023-04-28 北京市商汤科技开发有限公司 Image processing method, device, electronic equipment and system
CN111666439B (en) * 2020-05-28 2021-07-13 广东唯仁医疗科技有限公司 Working method for rapidly extracting and dividing medical image big data aiming at cloud environment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101021899A (en) * 2007-03-16 2007-08-22 南京搜拍信息技术有限公司 Interactive human face identificiating system and method of comprehensive utilizing human face and humanbody auxiliary information
CN106127173A (en) * 2016-06-30 2016-11-16 北京小白世纪网络科技有限公司 A kind of human body attribute recognition approach based on degree of depth study
CN106599785A (en) * 2016-11-14 2017-04-26 深圳奥比中光科技有限公司 Method and device for building human body 3D feature identity information database
CN107093171A (en) * 2016-02-18 2017-08-25 腾讯科技(深圳)有限公司 A kind of image processing method and device, system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110102553A1 (en) * 2007-02-28 2011-05-05 Tessera Technologies Ireland Limited Enhanced real-time face models from stereo imaging

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101021899A (en) * 2007-03-16 2007-08-22 南京搜拍信息技术有限公司 Interactive human face identificiating system and method of comprehensive utilizing human face and humanbody auxiliary information
CN107093171A (en) * 2016-02-18 2017-08-25 腾讯科技(深圳)有限公司 A kind of image processing method and device, system
CN106127173A (en) * 2016-06-30 2016-11-16 北京小白世纪网络科技有限公司 A kind of human body attribute recognition approach based on degree of depth study
CN106599785A (en) * 2016-11-14 2017-04-26 深圳奥比中光科技有限公司 Method and device for building human body 3D feature identity information database

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on Several Problems in Human Target Tracking and Expression Recognition; Li Yuanzheng; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2013-11-15; pp. I138-22 *
Research and Implementation of Adaptive Detection and Tracking Algorithms Based on Human Appearance Features; Sun Wei; China Master's Theses Full-text Database, Information Science and Technology; 2013-06-15; pp. I138-1477 *

Also Published As

Publication number Publication date
CN107563359A (en) 2018-01-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201225

Address after: Room 103, No.108 Mingzhu Avenue, yong'anzhou Town, Gaogang District, Taizhou City, Jiangsu Province

Patentee after: Yang Jianxin

Address before: 402160 27-6 6 Xinglong Avenue, Yongchuan District, Chongqing, 27-6.

Patentee before: CHONGQING ZHIQUAN ZHILU TECHNOLOGY Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20210318

Address after: 518116 701-3, building F, Longjing Science Park, 335 Bulong Road, Ma'antang community, Bantian street, Longgang District, Shenzhen City, Guangdong Province

Patentee after: Shenzhen Zhaohua Investment Development Co.,Ltd.

Address before: Room 103, No.108 Mingzhu Avenue, yong'anzhou Town, Gaogang District, Taizhou City, Jiangsu Province

Patentee before: Yang Jianxin

TR01 Transfer of patent right

Effective date of registration: 20210615

Address after: 506, 5th floor, No. 95, menmen Road, Baijiantan District, Karamay City, Xinjiang Uygur Autonomous Region 834000

Patentee after: Karamay ZHONGSHEN Energy Co.,Ltd.

Address before: 518116 701-3, building F, Longjing Science Park, 335 Bulong Road, Ma'antang community, Bantian street, Longgang District, Shenzhen City, Guangdong Province

Patentee before: Shenzhen Zhaohua Investment Development Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20211231

Address after: 834000 room 409, 4th floor, comprehensive training center building, No. 68, Tuanjie South Road, Shaya County, Aksu Prefecture, Karamay City, Xinjiang Uygur Autonomous Region

Patentee after: Xinjiang Rongkun Technology Co.,Ltd.

Address before: 506, 5th floor, No. 95, menmen Road, Baijiantan District, Karamay City, Xinjiang Uygur Autonomous Region 834000

Patentee before: Karamay ZHONGSHEN Energy Co.,Ltd.