CN103309450A - Method for identifying facial expression of user to operate tablet personal computer - Google Patents

Method for identifying facial expression of user to operate tablet personal computer

Info

Publication number
CN103309450A
CN103309450A
Authority
CN
China
Prior art keywords
profile
canthus
corners
mouth
calibration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2013102294503A
Other languages
Chinese (zh)
Inventor
徐凯
徐再
赵云龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZHANGJIAGANG HONGJIA DIGITAL TECHNOLOGY Co Ltd
Original Assignee
ZHANGJIAGANG HONGJIA DIGITAL TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZHANGJIAGANG HONGJIA DIGITAL TECHNOLOGY Co Ltd filed Critical ZHANGJIAGANG HONGJIA DIGITAL TECHNOLOGY Co Ltd
Priority to CN2013102294503A priority Critical patent/CN103309450A/en
Publication of CN103309450A publication Critical patent/CN103309450A/en
Pending legal-status Critical Current

Abstract

The invention relates to a method for identifying the facial expression of a user to operate a tablet personal computer, which comprises the following steps: A, responding to power-on of the tablet personal computer; B, detecting at least one piece of facial information; C, calibrating a reference position for operating and controlling the tablet personal computer; and D, operating and controlling the tablet personal computer through facial expressions. With the technical scheme of the invention, the most notable characteristic of the tablet operation and control is that a method for identifying the user's facial expression to operate the tablet is realized, so that in the field of tablet operation and control the tablet can be controlled without touch. The tablet personal computer can be operated and controlled, without any touch, solely by capturing the user's facial expressions.

Description

Method for identifying a user's facial expression to operate a tablet personal computer
Technical field
The present invention relates to the field of tablet personal computer operation and control, and in particular to a method for identifying a user's facial expression to operate a tablet personal computer.
Background art
In current practice, mobile communication electronic products such as mobile phones and tablet personal computers are increasingly sought after and liked by consumers. However, the rapid progress of electronic products has produced operating modes suited only to ordinary consumers and has brought certain drawbacks in use: modern electronic products do not benefit people who cannot perform touch control; in other words, they are inconvenient for users with disabilities who are unable to operate a touch screen. Given this limitation of touch control, technological improvements in the way tablet personal computers are operated and controlled are urgently needed.
The prior art therefore has defects and needs to be improved.
Summary of the invention
The object of the invention is to overcome the above deficiency by providing a method for identifying a user's facial expression to operate a tablet personal computer.
The technical measures for achieving the above object are as follows:
A method for identifying a user's facial expression to operate a tablet personal computer comprises the following steps (a code sketch follows these steps):
A, responding to power-on of the tablet personal computer;
B, detecting at least one piece of facial information;
C, calibrating a reference position for operating and controlling the tablet personal computer;
D, operating and controlling the tablet personal computer through facial expressions.
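For illustration only, the A to D flow above can be summarised by the following minimal Python sketch. All helper functions (power_on_response, detect_faces, calibrate_reference, run_expression_control) are hypothetical placeholders introduced here for clarity; none of them is defined in the patent.

# Minimal sketch of steps A-D; every helper below is a hypothetical stub.

def power_on_response() -> bool:
    # Step A: respond to power-on of the tablet (always succeeds in this sketch).
    return True

def detect_faces() -> list:
    # Step B: detect at least one piece of facial information (stubbed).
    return [{"face_id": 0}]

def calibrate_reference(faces: list) -> dict:
    # Step C: calibrate a reference position for controlling the tablet (stubbed).
    return {"reference": "screen-center"}

def run_expression_control(calibration: dict) -> None:
    # Step D: map facial expressions to control commands (stubbed).
    print("controlling tablet with calibration:", calibration)

def main() -> None:
    if not power_on_response():               # A
        return
    faces = detect_faces()                    # B
    if not faces:
        return
    calibration = calibrate_reference(faces)  # C
    run_expression_control(calibration)       # D

if __name__ == "__main__":
    main()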
In the above method, step B specifically comprises the following steps (see the sketch after these steps):
B11, displaying a detection position on the tablet personal computer;
B12, detecting the face contour and the contours of the facial features;
B13, determining the number of faces;
B14, determining whether the number of faces is greater than 4; if greater than 4, exiting; if not greater than 4, recording the count and collecting the facial expression information of the corresponding number of faces one by one;
B15, providing a predetermined detection position for the face contour and the contours of the facial features;
B16, collecting and locating the facial information;
B17, collecting the facial expression information;
B18, confirming that collection is finished and saving the information;
B19, determining whether the number of saved records equals the count recorded in step B14; if so, exiting; otherwise, returning to step B15.
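The following is a minimal sketch of the B11 to B19 loop, assuming hypothetical helpers detect_face_count() and collect_expression(); it only illustrates the exit-if-more-than-4 check of B14 and the saved-count comparison of B19.

MAX_FACES = 4  # B14: exit if more than 4 faces are detected

def detect_face_count() -> int:
    # B12/B13: detect the face contours and return the number of faces (stubbed).
    return 2  # placeholder value

def collect_expression(face_index: int) -> dict:
    # B15-B18: locate one face, collect its expression information and save it (stubbed).
    return {"face": face_index, "expressions": "..."}

def detect_facial_information() -> list:
    count = detect_face_count()
    if count > MAX_FACES:              # B14: too many faces, exit
        return []
    saved = []
    while len(saved) != count:         # B19: repeat B15-B18 until the saved count equals the B14 count
        saved.append(collect_expression(len(saved)))
    return saved

if __name__ == "__main__":
    print(detect_facial_information())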
In the above method, step B17 specifically comprises the following steps (see the sketch after these steps):
B171, collecting contour information of upturned corners of the mouth;
B172, collecting contour information of narrowed (contracted) corners of the mouth;
B173, collecting contour information of widened eye corners (canthi);
B174, collecting contour information of narrowed eye corners;
B175, collecting information of at least one blink.
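The five items collected in B171 to B175 can be grouped into a simple per-face record; the dataclass below is an illustrative assumption introduced by the editor, not a data structure defined in the patent.

from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) landmark coordinate

@dataclass
class ExpressionTemplate:
    # Per-face expression information collected in step B17 (illustrative only).
    mouth_corners_up: List[Point] = field(default_factory=list)        # B171: upturned mouth corners
    mouth_corners_narrowed: List[Point] = field(default_factory=list)  # B172: narrowed mouth corners
    eye_corners_widened: List[Point] = field(default_factory=list)     # B173: widened eye corners
    eye_corners_narrowed: List[Point] = field(default_factory=list)    # B174: narrowed eye corners
    blink_timestamps: List[float] = field(default_factory=list)        # B175: at least one blink

    def is_complete(self) -> bool:
        # B18: collection is finished when every item has at least one sample.
        return all([self.mouth_corners_up, self.mouth_corners_narrowed,
                    self.eye_corners_widened, self.eye_corners_narrowed,
                    self.blink_timestamps])

if __name__ == "__main__":
    t = ExpressionTemplate(blink_timestamps=[0.5])
    print(t.is_complete())  # False until all five kinds of information are collected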
In the above method, step C specifically comprises the following steps (see the sketch after these steps):
C11, calibrating the eye-corner contour and the mouth-corner contour at the upper left of the tablet screen;
C12, calibrating the eye-corner contour and the mouth-corner contour at the lower left of the tablet screen;
C13, calibrating the eye-corner contour and the mouth-corner contour at the upper right of the tablet screen;
C14, calibrating the eye-corner contour and the mouth-corner contour at the lower right of the tablet screen;
C15, calibrating the eye-corner contour and the mouth-corner contour at the center of the tablet screen.
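A sketch of the five calibration targets C11 to C15, assuming screen coordinates in pixels; capture_contours() is a hypothetical stand-in for the actual contour measurement, and the default screen size is an assumed example value.

from typing import Dict, Tuple

def calibration_targets(width: int, height: int) -> Dict[str, Tuple[int, int]]:
    # C11-C15: the five on-screen positions at which the eye-corner and
    # mouth-corner contours are calibrated (the four corners and the center).
    return {
        "upper_left":  (0, 0),                     # C11
        "lower_left":  (0, height - 1),            # C12
        "upper_right": (width - 1, 0),             # C13
        "lower_right": (width - 1, height - 1),    # C14
        "center":      (width // 2, height // 2),  # C15
    }

def capture_contours(target: Tuple[int, int]) -> dict:
    # Hypothetical: measure the eye-corner and mouth-corner contours while the
    # user directs the face toward the given on-screen target.
    return {"eye_corners": target, "mouth_corners": target}  # placeholder

def run_calibration(width: int = 1280, height: int = 800) -> Dict[str, dict]:
    return {name: capture_contours(pos)
            for name, pos in calibration_targets(width, height).items()}

if __name__ == "__main__":
    print(run_calibration())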
In the above method, after step C15, the following steps are also performed:
C16, calibrating the size of the eye-corner contour and the mouth-corner contour at each position;
C17, calibrating the static offset of the eye-corner contour and the mouth-corner contour at each position;
C18, calibrating the dynamic offset of the eye-corner contour and the mouth-corner contour at each position.
In the above method, in step C17, the static offset does not exceed 5 cm above or below the position.
In the above method, in step C18, the dynamic offset does not exceed one third of the tablet display screen above or below the position (a sketch of these offset limits follows below).
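The limits stated above (a static offset of at most 5 cm and a dynamic offset of at most one third of the screen) can be checked as in the following sketch; the pixels-per-centimetre value and screen height are assumed device parameters used only for illustration.

STATIC_LIMIT_CM = 5.0               # C17 limit: static offset no more than 5 cm up or down
DYNAMIC_LIMIT_FRACTION = 1.0 / 3.0  # C18 limit: no more than one third of the screen height

def static_offset_ok(offset_px: float, pixels_per_cm: float) -> bool:
    # C17: the static vertical offset of the calibrated contours must stay within 5 cm.
    return abs(offset_px) / pixels_per_cm <= STATIC_LIMIT_CM

def dynamic_offset_ok(offset_px: float, screen_height_px: int) -> bool:
    # C18: the dynamic vertical offset must stay within one third of the display height.
    return abs(offset_px) <= DYNAMIC_LIMIT_FRACTION * screen_height_px

if __name__ == "__main__":
    # Assumed device parameters: an 800-pixel-tall screen at 40 pixels per centimetre.
    print(static_offset_ok(150, pixels_per_cm=40))       # 3.75 cm  -> True
    print(dynamic_offset_ok(300, screen_height_px=800))  # 300 > 800/3 -> False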
In the above method, step D further comprises the following specific steps; these steps are in an and/or relationship and may be performed in any order:
D11, when the eye-corner contour is enlarged or reduced, zooming the current picture in or out correspondingly;
D12, when two blinks are detected, pausing the current program;
D13, when three blinks are detected, exiting the current program;
D14, when the mouth-corner contour moves, making the designated module of the current program follow the movement;
D15, when a narrowed mouth-corner contour is detected, executing the launch instruction of the current program.
In the above method, step D further comprises the following specific steps; these steps are in an and/or relationship and may be performed in any order:
D16, when the eye-corner contour moves, making the designated module of the current program follow the movement;
D17, when a narrowed eye-corner contour is detected, executing the launch instruction of the current program.
In the above method, step D further comprises the following specific steps; these steps are in an and/or relationship and may be performed in any order (a dispatch sketch follows these steps):
D18, when the mouth corner moves while the eye-corner contour is narrowed, simultaneously performing movement tracking and the launch instruction of the current program;
D19, when the eye corner moves while the mouth-corner contour is narrowed, likewise simultaneously performing movement tracking and the launch instruction of the current program.
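The D11 to D17 rules can be organised as an event-to-command dispatch table, as in the sketch below; the event names and command strings are illustrative assumptions by the editor, and only the D11 to D15 rule set is filled in, with D16/D17 following the same pattern for the eye corners.

from typing import Callable, Dict, List

def zoom_picture(direction: str) -> str:
    return "zoom current picture " + direction              # D11

# Illustrative dispatch table for the D11-D15 rule set; D16/D17 would map the
# corresponding eye-corner events to the same follow/launch commands.
DISPATCH: Dict[str, Callable[[], str]] = {
    "eye_corners_widened":    lambda: zoom_picture("in"),                # D11
    "eye_corners_narrowed":   lambda: zoom_picture("out"),               # D11
    "blink_x2":               lambda: "pause current program",           # D12
    "blink_x3":               lambda: "exit current program",            # D13
    "mouth_corners_moved":    lambda: "designated module follows move",  # D14
    "mouth_corners_narrowed": lambda: "execute launch instruction",      # D15
}

def handle(events: List[str]) -> List[str]:
    # D18/D19: several events detected in one frame (for example movement plus a
    # narrowed contour) are handled together, in no particular order.
    return [DISPATCH[event]() for event in events if event in DISPATCH]

if __name__ == "__main__":
    print(handle(["mouth_corners_moved"]))
    print(handle(["mouth_corners_moved", "mouth_corners_narrowed"]))  # combined handling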
With the technical scheme of the present invention, the most notable characteristic of the tablet operation and control is that a method for identifying the user's facial expression to operate the tablet personal computer is realized, so that in the field of tablet operation and control the tablet can be controlled without touch. The tablet personal computer can be controlled, without any touch, solely by capturing the user's facial expressions.
Brief description of the drawings
Fig. 1 is a flow chart of the method according to the invention.
Detailed description of the embodiments
The preferred embodiments of the present invention are described in detail below with reference to the accompanying drawing, so that the advantages and features of the invention can be more easily understood by those skilled in the art and the protection scope of the invention can be defined more clearly.
Embodiment 1
As shown in Fig. 1, a method for identifying a user's facial expression to operate a tablet personal computer comprises the following steps: first, the tablet responds to power-on and at least one piece of facial information is detected; then a reference position for operating and controlling the tablet is calibrated; finally, the tablet is operated and controlled through facial expressions. Detecting the facial information specifically comprises the following steps:
A detection position is displayed on the tablet; the face contour and the contours of the facial features are detected; the number of faces is determined; if the number of faces is greater than 4, the process exits, otherwise the count is recorded and the facial expression information of the corresponding number of faces is collected one by one. A predetermined detection position for the face contour and the contours of the facial features is provided, and the face and expression information is located and collected. The collected information comprises:
contour information of upturned corners of the mouth;
contour information of narrowed corners of the mouth;
contour information of widened eye corners;
contour information of narrowed eye corners;
information of at least one blink.
The information is saved automatically after collection is finished. The eye-corner contour and the mouth-corner contour are then calibrated at the upper left, lower left, upper right, lower right and center of the tablet screen, and the static offset and dynamic offset of the calibrated contours are calculated automatically, where the static offset does not exceed 5 cm above or below the position and the dynamic offset does not exceed one third of the tablet display screen above or below the position. After the position calibration is finished, the system displays a positioning arrow on the screen, initially placed at the center position according to the located distance between the two eye corners; the arrow moves with the facial movement, so the positioning arrow or the screen can be controlled through facial expressions. The relevant actions comprise (see the sketch after this list):
when two consecutive blinks are detected, the selected program is launched;
when three consecutive blinks are detected, the current program exits;
when the eye-corner contour is enlarged or reduced, the current picture is zoomed in or out correspondingly;
when the mouth-corner contour moves, the designated module of the current program follows the movement;
when a narrowed mouth-corner contour is detected, the launch instruction of the current program is executed;
when the mouth corner moves while the eye-corner contour is narrowed, movement tracking and the launch instruction of the current program can be performed simultaneously;
when the eye corner moves while the mouth-corner contour is narrowed, movement tracking and the launch instruction of the current program are likewise performed simultaneously.
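Distinguishing the two-blink and three-blink actions above requires counting consecutive blinks within a short time window. The sketch below assumes a hypothetical per-frame eyes-closed signal and a one-second grouping window; neither value is specified in the patent, and a real implementation would also have to wait briefly before treating two blinks as final.

import time
from typing import List, Optional

class BlinkCounter:
    # Groups blinks that occur within window_s seconds and reports the
    # two-blink (launch) and three-blink (exit) gestures.

    def __init__(self, window_s: float = 1.0) -> None:
        self.window_s = window_s           # assumed grouping window
        self.timestamps: List[float] = []
        self.eyes_were_closed = False

    def update(self, eyes_closed: bool, now: Optional[float] = None) -> Optional[str]:
        now = time.monotonic() if now is None else now
        # A blink is counted on the closed-to-open transition.
        if self.eyes_were_closed and not eyes_closed:
            self.timestamps.append(now)
        self.eyes_were_closed = eyes_closed
        # Keep only blinks inside the grouping window.
        self.timestamps = [t for t in self.timestamps if now - t <= self.window_s]
        if len(self.timestamps) == 3:
            self.timestamps.clear()
            return "exit current program"     # three consecutive blinks
        if len(self.timestamps) == 2:
            return "launch selected program"  # two consecutive blinks (simplified)
        return None

if __name__ == "__main__":
    counter = BlinkCounter()
    action = None
    # Simulated frames: two quick blinks (closed/open, closed/open) within the window.
    for closed, t in [(True, 0.0), (False, 0.1), (True, 0.3), (False, 0.4)]:
        result = counter.update(closed, now=t)
        action = result if result is not None else action
    print(action)  # -> "launch selected program"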
For example, to view a picture folder, the user moves the face so that the positioning arrow moves onto the picture folder, blinks twice in a row to open the folder, selects a target picture in the same way to browse it, zooms the picture in or out by widening the eyes, and returns to the parent directory by blinking three times in a row.
As another example, to play the game Angry Birds, the user moves the positioning arrow onto the Angry Birds launch icon through facial movement and starts the program by blinking twice in a row; the screen is scaled to a suitable range through eye-corner movement, the positioning arrow is then moved to the bird on the slingshot, and the bird is selected by blinking twice in a row; by moving the corners of the mouth to the left, the bird is put into the drawn, ready-to-launch state, and after the launch trajectory is determined by moving the corners of the mouth up and down, the lips are closed to narrow the mouth-corner contour, the current launch instruction is executed and the bird is launched; the next action is completed in the same manner.
The above embodiment only illustrates the technical concept and features of the present invention; its purpose is to enable those familiar with the art to understand and implement the content of the invention, and it does not limit the protection scope of the invention. All equivalent changes or modifications made according to the spirit of the present invention shall fall within the protection scope of the invention.

Claims (10)

1. A method for identifying a user's facial expression to operate a tablet personal computer, comprising the following steps:
A, responding to power-on of the tablet personal computer;
B, detecting at least one piece of facial information;
C, calibrating a reference position for operating and controlling the tablet personal computer;
D, operating and controlling the tablet personal computer through facial expressions.
2. The method of claim 1, wherein step B specifically comprises the following steps:
B11, displaying a detection position on the tablet personal computer;
B12, detecting the face contour and the contours of the facial features;
B13, determining the number of faces;
B14, determining whether the number of faces is greater than 4; if greater than 4, exiting; if not greater than 4, recording the count and collecting the facial expression information of the corresponding number of faces one by one;
B15, providing a predetermined detection position for the face contour and the contours of the facial features;
B16, collecting and locating the facial information;
B17, collecting the facial expression information;
B18, confirming that collection is finished and saving the information;
B19, determining whether the number of saved records equals the count recorded in step B14; if so, exiting; otherwise, returning to step B15.
3. The method of claim 2, wherein step B17 specifically comprises the following steps:
B171, collecting contour information of upturned corners of the mouth;
B172, collecting contour information of narrowed corners of the mouth;
B173, collecting contour information of widened eye corners;
B174, collecting contour information of narrowed eye corners;
B175, collecting information of at least one blink.
4. The method of claim 1, wherein step C specifically comprises the following steps:
C11, calibrating the eye-corner contour and the mouth-corner contour at the upper left of the tablet screen;
C12, calibrating the eye-corner contour and the mouth-corner contour at the lower left of the tablet screen;
C13, calibrating the eye-corner contour and the mouth-corner contour at the upper right of the tablet screen;
C14, calibrating the eye-corner contour and the mouth-corner contour at the lower right of the tablet screen;
C15, calibrating the eye-corner contour and the mouth-corner contour at the center of the tablet screen.
5. The method of claim 4, wherein after step C15 the following steps are also performed:
C16, calibrating the size of the eye-corner contour and the mouth-corner contour at each position;
C17, calibrating the static offset of the eye-corner contour and the mouth-corner contour at each position;
C18, calibrating the dynamic offset of the eye-corner contour and the mouth-corner contour at each position.
6. The method of claim 5, wherein in step C17 the static offset does not exceed 5 cm above or below the position.
7. The method of claim 5, wherein in step C18 the dynamic offset does not exceed one third of the tablet display screen above or below the position.
8. the method for claim 1 is characterized in that, also comprises concrete following steps among the described step D, and following steps are and/or concern, and order is in no particular order:
D11, judge that picture is corresponding when processing corresponding picture zoomed in or out when the canthus profile zoomed in or out;
Present procedure suspends when D12, judgement secondary nictation;
D13, present procedure withdraws from when judging nictation three times;
D14, judge that the present procedure designated module was followed movement when corners of the mouth profile moved;
When D15, judgement corners of the mouth reduced profile, carry out the present procedure firing order.
9. the method for claim 1 is characterized in that, also comprises concrete following steps among the described step D, and following steps are and/or concern, and order is in no particular order:
D16, judge that the present procedure designated module was followed movement when the canthus profile moved;
When D17, judgement canthus reduced profile, carry out the present procedure firing order.
10. the method for claim 1 is characterized in that, also comprises concrete following steps among the described step D, and following steps are and/or concern, and order is in no particular order:
D18, judge that the corners of the mouth moves and during the canthus reduced profile, can carry out simultaneously and follow the tracks of mobile and the current firing order of program;
D19, judge that the canthus moves and during corners of the mouth reduced profile, or carry out simultaneously and follow the tracks of mobile and the current firing order of program.
CN2013102294503A 2013-06-09 2013-06-09 Method for identifying facial expression of user to operate tablet personal computer Pending CN103309450A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2013102294503A CN103309450A (en) 2013-06-09 2013-06-09 Method for identifying facial expression of user to operate tablet personal computer

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2013102294503A CN103309450A (en) 2013-06-09 2013-06-09 Method for identifying facial expression of user to operate tablet personal computer

Publications (1)

Publication Number Publication Date
CN103309450A true CN103309450A (en) 2013-09-18

Family

ID=49134751

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2013102294503A Pending CN103309450A (en) 2013-06-09 2013-06-09 Method for identifying facial expression of user to operate tablet personal computer

Country Status (1)

Country Link
CN (1) CN103309450A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101311882A (en) * 2007-05-23 2008-11-26 华为技术有限公司 Eye tracking human-machine interaction method and apparatus
TW201122905A (en) * 2009-12-25 2011-07-01 Primax Electronics Ltd System and method for generating control instruction by identifying user posture captured by image pickup device
JP2011243108A (en) * 2010-05-20 2011-12-01 Nec Corp Electronic book device and electronic book operation method
US20120139830A1 (en) * 2010-12-01 2012-06-07 Samsung Electronics Co., Ltd. Apparatus and method for controlling avatar using expression control point
CN102955565A (en) * 2011-08-31 2013-03-06 德信互动科技(北京)有限公司 Man-machine interaction system and method

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104571858A (en) * 2014-12-22 2015-04-29 深圳市金立通信设备有限公司 Terminal

Similar Documents

Publication Publication Date Title
CN103294401B (en) A kind of icon disposal route and device with the electronic equipment of touch-screen
CN104166553B (en) A kind of display methods and electronic equipment
CN103995660B (en) The method and device of touch screen browser switch window
CN102955658B (en) Device and method for page switching in interaction interface
CN102662558B (en) Word chooses method and apparatus and electronic equipment
CN103473013B (en) The method of a kind of application interface regulation and mobile terminal
CN104123095B (en) A kind of suspension touch control method and device based on vector calculus
CN105183314B (en) A kind of bright screen duration adjusting method and mobile terminal
US20140085223A1 (en) Apparatus and method capable of switching displayed pictures
CN103793178A (en) Vector graph editing method of touch screen of mobile device
CN103902174B (en) A kind of display methods and equipment
CN108376030B (en) Electronic equipment control method and device and electronic equipment
CN103414829A (en) Method, device and terminal device for controlling screen contents
JP2013042360A5 (en)
CN104536661A (en) Terminal screen shot method
CN103135903A (en) Chart gallery display method and device
CN102929527A (en) Device with picture switching function and picture switching method
CN103440191B (en) The lookup method and device of application program
CN103309450A (en) Method for identifying facial expression of user to operate tablet personal computer
CN104166454B (en) A kind of method, apparatus of screen power saving and mobile equipment
CN104267812B (en) A kind of information processing method and electronic equipment
US9665260B2 (en) Method and apparatus for controlling screen of mobile device
CN106980433A (en) The group technology and device of icon
CN104049867B (en) A kind of information processing method and electronic equipment
CN104536564A (en) Terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20130918