CN108509037A - Information display method and mobile terminal - Google Patents

Information display method and mobile terminal

Info

Publication number
CN108509037A
Authority
CN
China
Prior art keywords
screen
user
sight
mobile terminal
towards
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810252200.4A
Other languages
Chinese (zh)
Other versions
CN108509037B (en)
Inventor
刘林瑞
代祥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201810252200.4A priority Critical patent/CN108509037B/en
Publication of CN108509037A publication Critical patent/CN108509037A/en
Application granted granted Critical
Publication of CN108509037B publication Critical patent/CN108509037B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/26 Power supply means, e.g. regulation thereof
    • G06F 1/32 Means for saving power
    • G06F 1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G06F 1/3234 Power saving characterised by the action undertaken
    • G06F 1/325 Power saving in peripheral device
    • G06F 1/3265 Power saving in display device
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 Eye characteristics, e.g. of the iris

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Ophthalmology & Optometry (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides an information display method and a mobile terminal. The method includes: if the screen of the mobile terminal is in an off state, capturing an image through a camera; if a face image is present in the image, obtaining an eye image feature in the face image; judging, according to the eye image feature, whether the user's line of sight is directed toward the screen; and if the user's line of sight is directed toward the screen, displaying preset content on the screen. In this way, when the mobile terminal performs an off-screen display, the preset content is shown only when the user's line of sight is directed toward the screen, which avoids keeping a screen region permanently lit and thus reduces the power consumption of the mobile terminal.

Description

Information display method and mobile terminal
Technical field
The present invention relates to the field of communication technologies, and in particular to an information display method and a mobile terminal.
Background
With the rapid development of mobile terminal technology, more and more conveniences have been provided. One example is the off-screen display (always-on display) function, which allows a mobile terminal to keep a sub-region of the screen lit while the screen is otherwise off, so as to show the user certain specific information or content. In this way, the user can see frequently needed information, such as the time, weather, or battery level, without actively lighting up the screen, which is very convenient. However, this approach requires a sub-region of the screen to stay lit continuously, which increases the power consumption of the mobile terminal. Consequently, when a current mobile terminal performs an off-screen display, a screen region must remain permanently lit while the screen is off, so the power consumption of the mobile terminal is relatively high.
Summary of the invention
Embodiments of the present invention provide an information display method and a mobile terminal, to solve the problem that, when a mobile terminal performs an off-screen display, a screen region must remain permanently lit while the screen is off, resulting in relatively high power consumption of the mobile terminal.
To solve the above technical problem, the present invention is implemented as follows:
In a first aspect, an embodiment of the present invention provides an information display method, applied to a mobile terminal, including:
if the screen of the mobile terminal is in an off state, capturing an image through a camera;
if a face image is present in the image, obtaining an eye image feature in the face image;
judging, according to the eye image feature, whether the user's line of sight is directed toward the screen;
if the user's line of sight is directed toward the screen, displaying preset content on the screen.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, including:
a capture module, configured to capture an image through a camera if the screen of the mobile terminal is in an off state;
an obtaining module, configured to obtain an eye image feature in a face image if the face image is present in the image;
a judging module, configured to judge, according to the eye image feature, whether the user's line of sight is directed toward the screen;
a display module, configured to display preset content on the screen if the user's line of sight is directed toward the screen.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the above information display method.
In the embodiments of the present invention, if the screen of the mobile terminal is in an off state, an image is captured through a camera; if a face image is present in the image, an eye image feature in the face image is obtained; whether the user's line of sight is directed toward the screen is judged according to the eye image feature; and if the user's line of sight is directed toward the screen, preset content is displayed on the screen. In this way, when the mobile terminal performs an off-screen display, the preset content is shown only when the user's line of sight is directed toward the screen, which avoids keeping a screen region permanently lit and thus reduces the power consumption of the mobile terminal.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments. Apparently, the accompanying drawings in the following description show merely some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from these drawings without creative efforts.
Fig. 1 is a flowchart of an information display method according to an embodiment of the present invention;
Fig. 2 is a flowchart of another information display method according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of an eye image feature according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of another eye image feature according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of a face in a frontal pose according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of a face in a first side-face pose according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of a face in a second side-face pose according to an embodiment of the present invention;
Fig. 8 is a schematic diagram of a face in a looking-up pose according to an embodiment of the present invention;
Fig. 9 is a schematic diagram of a face in a looking-down pose according to an embodiment of the present invention;
Figure 10 is a structural diagram of a mobile terminal according to an embodiment of the present invention;
Figure 11 is a structural diagram of another mobile terminal according to an embodiment of the present invention;
Figure 12 is a structural diagram of another mobile terminal according to an embodiment of the present invention;
Figure 13 is a structural diagram of another mobile terminal according to an embodiment of the present invention;
Figure 14 is a structural diagram of another mobile terminal according to an embodiment of the present invention.
Detailed description of embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the described embodiments are some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative efforts shall fall within the protection scope of the present invention.
Referring to Fig. 1, Fig. 1 is a flowchart of an information display method according to an embodiment of the present invention. As shown in Fig. 1, the method includes the following steps:
Step 101: if the screen of the mobile terminal is in an off state, capture an image through a camera.
The screen of a mobile terminal generally has two states: a lit state, in which the mobile terminal can display various contents by lighting up the screen, and an off state, in which the mobile terminal does not light up any element of the screen. Therefore, the screen of the mobile terminal being in the off state can be understood as the state of the screen when it is not lit.
The camera may be a camera on the mobile terminal facing the same direction as the screen, such as the front camera of a mobile phone. Capturing an image through the camera while the screen of the mobile terminal is in the off state can be understood as follows: if the mobile terminal detects that the screen is in the off state, it can control the camera in the background to capture images in real time. For example, when a user takes a mobile phone out of a pocket and picks it up, the screen of the phone is in the off state; the phone can then turn on the camera in the background to capture image data.
Step 102: if a face image is present in the image, obtain an eye image feature in the face image.
The face image may be a partial image, present in the image captured by the camera, that matches the features of human facial organs, and the eye image feature may be a partial image, present in the face image, that matches the features of the human eye. For example, after a user picks up a mobile phone and opens his or her eyes toward the phone, the phone recognizes the captured image and determines that a face image, that is, the image of the user's face, is present; it then identifies the eye image feature in the face image.
Step 103: judge, according to the eye image feature, whether the user's line of sight is directed toward the screen.
Judging whether the user's line of sight is directed toward the screen according to the eye image feature can be understood as the mobile terminal analyzing the eye image feature and judging, based on the analysis, whether the user's line of sight is directed toward the screen.
For example, the eye image feature may be matched against pre-saved eye templates, and the eye template with the greatest similarity to the eye image feature is determined as the match, where each eye template is pre-assigned a result indicating whether the user's line of sight is directed toward the screen. However, this method requires a relatively large amount of computation and its accuracy is not very high.
Alternatively, the gaze direction of the eye image feature may be attributed to ten categories: up, down, left, right, upper-left, lower-left, upper-right, lower-right, center, and eyes closed. A large amount of sample data of eye image features of these ten categories is collected, a machine learning algorithm is trained on these samples to obtain a classifier, and the classifier then classifies the eye image feature recognized by the mobile terminal, so as to decide whether the user's line of sight is directed toward the screen.
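As an illustration of this classifier-based approach, the following is a minimal sketch. It assumes eye crops have already been extracted and labeled with one of the ten categories, uses scikit-learn's SVC purely as a stand-in for "a certain machine learning algorithm" (the patent does not name one), and treats a "center" prediction as the line of sight being directed at the screen, which is an assumption about how the categories map to the final decision.

```python
# Sketch of the classifier-based gaze-direction approach (illustrative assumptions noted above).
import numpy as np
from sklearn.svm import SVC

LABELS = ["up", "down", "left", "right", "upper_left", "lower_left",
          "upper_right", "lower_right", "center", "eyes_closed"]

def extract_feature(eye_image: np.ndarray) -> np.ndarray:
    """Flatten a normalized grayscale eye crop into a feature vector (placeholder feature)."""
    return (eye_image.astype(np.float32) / 255.0).ravel()

def train_gaze_classifier(eye_images, labels):
    """Train a classifier on labeled eye-image samples covering the ten categories."""
    X = np.stack([extract_feature(img) for img in eye_images])
    clf = SVC(kernel="rbf")   # any supervised learner could stand in here
    clf.fit(X, labels)
    return clf

def sight_towards_screen(clf, eye_image) -> bool:
    """Assumed mapping: a 'center' prediction is taken as gaze directed at the screen."""
    pred = clf.predict(extract_feature(eye_image)[None, :])[0]
    return pred == "center"
```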
In addition, the user's line of sight may also be judged by other methods, for example according to the distance from the eyeball center to the eye socket in the eye image feature; any method that can judge whether the user's line of sight is directed toward the screen may be used, and this is not limited here.
Step 104: if the user's line of sight is directed toward the screen, display preset content on the screen.
Displaying preset content on the screen if the user's line of sight is directed toward the screen may mean that, if the mobile terminal determines that the user's line of sight is directed toward the screen, the mobile terminal lights up the screen to display the preset content. The preset content includes information such as a clock, the weather, notification prompts, short messages, and the time. For example, after a user picks up a mobile phone with his or her face toward the screen, the phone detects a face image and identifies the eye image feature in it; by analyzing the user's eye image feature, the phone determines that the user's line of sight is directed toward the screen, and then lights up the screen to display clock information.
It should be noted that, if the user's line of sight is not directed toward the screen, the judgment may continue, or the procedure may end; Fig. 1 in this embodiment of the present invention is illustrated with the procedure ending.
In this embodiment of the present invention, the mobile terminal may be a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, or the like.
In this embodiment of the present invention, if the screen of the mobile terminal is in an off state, an image is captured through a camera; if a face image is present in the image, an eye image feature in the face image is obtained; whether the user's line of sight is directed toward the screen is judged according to the eye image feature; and if the user's line of sight is directed toward the screen, preset content is displayed on the screen. In this way, when the mobile terminal performs an off-screen display, the preset content is shown only when the user's line of sight is directed toward the screen, which avoids keeping a screen region permanently lit and thus reduces the power consumption of the mobile terminal.
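The overall flow of steps 101 to 104 can be summarized in the following sketch. The terminal object and the helper functions (detect_face, extract_eye_feature, sight_towards_screen) are hypothetical placeholders for whatever platform APIs and recognition routines are used; only the control flow mirrors the method described above.

```python
# Illustrative control flow for steps 101-104 (platform calls are hypothetical placeholders).
import time

def off_screen_display_loop(terminal):
    while True:
        time.sleep(0.1)                            # poll periodically
        if not terminal.screen_is_off():           # step 101 precondition: screen is off
            continue
        image = terminal.capture_front_camera()    # step 101: capture an image in the background
        face = detect_face(image)                  # step 102: is a face image present?
        if face is None:
            continue
        eye_feature = extract_eye_feature(face)    # step 102: obtain the eye image feature
        if sight_towards_screen(eye_feature):      # step 103: judge the gaze direction
            terminal.show_preset_content()         # step 104: light the region and show preset content
```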
Referring to Fig. 2, Fig. 2 is a flowchart of another information display method according to an embodiment of the present invention. The main difference between this embodiment and the previous one is that this embodiment obtains an eyeball distance parameter of the eye image feature and judges, according to the eyeball distance parameter, whether the user's line of sight is directed toward the screen. As shown in Fig. 2, the method includes the following steps:
Step 201: if the screen of the mobile terminal is in an off state, capture an image through a camera.
The related description of step 201 has been elaborated in the previous embodiment and is not repeated here.
Optionally, the step of capturing an image through a camera if the screen of the mobile terminal is in an off state includes:
if the screen of the mobile terminal is in the off state, performing detection by a distance sensor;
if the presence of an object is detected within a preset distance range centered on the mobile terminal, capturing an image through the camera.
The distance sensor may be an element preset to sense the distance between the mobile terminal and an object so as to perform a certain function. For example, the distance sensor may be an optical distance sensor, an infrared distance sensor, an ultrasonic distance sensor, or the like; it should be understood that this embodiment does not limit the type of the distance sensor, as long as the mobile terminal is able to sense the distance to an object.
For example, assume the preset distance range is set to 2 to 40 centimeters. After the user picks up the phone and brings his or her face close to and facing the screen, the phone detects through the distance sensor that the distance from the user's face to the phone is 20 centimeters, which is within the preset range, and therefore turns on the camera to capture an image. Further, the preset distance range may be 2 centimeters to 1 meter; of course, it may also be another distance range. This embodiment does not limit the preset distance range, which may be set according to the performance of the mobile terminal, the needs of the user, or the like.
In this way, the mobile terminal captures an image through the camera only when the presence of an object is detected within the preset distance range, so the camera is turned on only when the user needs the off-screen display, which saves the power consumption of the mobile terminal.
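As a sketch of this optional refinement, the capture step can be gated on a distance-sensor reading. The 2 to 40 cm range below is the example value given above, and the sensor and camera calls are hypothetical placeholders.

```python
# Gate the camera on a distance-sensor reading (sketch; sensor/camera APIs are hypothetical).
MIN_DISTANCE_CM = 2
MAX_DISTANCE_CM = 40   # example preset range from the text; could also extend up to 1 m

def capture_if_object_nearby(terminal):
    if not terminal.screen_is_off():
        return None
    distance_cm = terminal.read_distance_sensor()          # e.g. infrared or ultrasonic reading
    if MIN_DISTANCE_CM <= distance_cm <= MAX_DISTANCE_CM:  # object present within the preset range
        return terminal.capture_front_camera()             # only now power up the camera
    return None                                            # otherwise keep the camera off to save power
```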
Step 202: if a face image is present in the image, obtain an eye image feature in the face image.
The related description of step 202 has been elaborated in the previous embodiment and is not repeated here.
Step 203: obtain an eyeball distance parameter of the eye image feature, where the eyeball distance parameter includes: a horizontal distance value from the eyeball center of the eye image feature to a canthus, and/or a vertical distance value from the eyeball center to an eye socket vertex of the eye image feature, the eye socket vertex being the vertex of the upper eye socket and/or the vertex of the lower eye socket.
Obtaining the eyeball distance parameter of the eye image feature may mean that the mobile terminal measures and calculates the distance from a location point on the eyeball of the eye image feature to another location point. Referring specifically to Fig. 3, observed from a planar viewpoint, the eyeball center of the eye image feature may be the point o located at the center of the eyeball in the eye image feature; a canthus is a corner where the upper eye socket curve meets the lower eye socket curve, that is, the points a and b at the two canthi on the left and right sides of the eye socket boundary curves; the vertex of the upper eye socket can be understood as the highest point c of the upper eye socket boundary curve, and the vertex of the lower eye socket can be understood as the lowest point d of the lower eye socket boundary curve.
The horizontal distance value from the eyeball center of the eye image feature to a canthus may be the distance, in the horizontal direction, from the eyeball center of the eye image feature to the canthus. Specifically, as shown in Fig. 3, the direction of the line between c and d may be set as the vertical direction, so the direction perpendicular to the line between c and d is the horizontal direction. A first straight line parallel to the line between c and d is then drawn through canthus a, and a second straight line parallel to the line between c and d is drawn through canthus b; the distance in the horizontal direction from o to the first straight line or the second straight line is the horizontal distance value from the eyeball center of the eye image feature to the corresponding canthus.
The vertical distance value from the eyeball center to an eye socket vertex of the eye image feature may be the distance, in the vertical direction, from the eyeball center of the eye image feature to the upper eye socket vertex or the lower eye socket vertex. Specifically, as shown in Fig. 4, the direction of the line between c and d may be set as the vertical direction, so the direction of the line between a and b is the horizontal direction. A third straight line parallel to the line between a and b is then drawn through the upper eye socket vertex c, and a fourth straight line parallel to the line between a and b is drawn through the lower eye socket vertex d; the distance in the vertical direction from o to the third straight line or the fourth straight line is the vertical distance value from the eyeball center of the eye image feature to the corresponding eye socket vertex.
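To make the geometry concrete, the following sketch computes the horizontal and vertical distance values from the five landmark points o, a, b, c, d described above, assuming they are available as 2-D pixel coordinates. The axis convention (the c-d line is vertical, the a-b line horizontal) follows Figs. 3 and 4; the function and variable names are illustrative.

```python
# Compute X1, X2 (horizontal distances to the canthi) and Y1, Y2 (vertical distances
# to the eye socket vertices) from 2-D landmark coordinates (sketch).
import numpy as np

def eyeball_distance_parameters(o, a, b, c, d):
    """o: eyeball center, a/b: the two canthi, c/d: upper/lower eye socket vertex."""
    o, a, b, c, d = (np.asarray(p, dtype=float) for p in (o, a, b, c, d))
    vertical = c - d
    vertical /= np.linalg.norm(vertical)                 # unit vector along the c-d line
    horizontal = np.array([-vertical[1], vertical[0]])   # perpendicular to c-d, i.e. horizontal

    x1 = abs(np.dot(o - a, horizontal))   # first horizontal distance value (to canthus a)
    x2 = abs(np.dot(o - b, horizontal))   # second horizontal distance value (to canthus b)
    y1 = abs(np.dot(o - c, vertical))     # first vertical distance value (to upper vertex c)
    y2 = abs(np.dot(o - d, vertical))     # second vertical distance value (to lower vertex d)
    return x1, x2, y1, y2
```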
Optionally, after the step of capturing an image through a camera if the screen of the mobile terminal is in an off state, the method further includes:
if a face image is present in the image, identifying a face pose from the face image;
and the step of judging, according to the eyeball distance parameter, whether the user's line of sight is directed toward the screen includes:
judging, according to the eyeball distance parameter and the face pose, whether the user's line of sight is directed toward the screen.
The face pose can be understood as the posture presented by the face at different angles; for example, when the face turns left or right in the horizontal direction, or the head is raised or lowered in the vertical direction, the face presents different types of face poses.
Identifying a face pose from the face image may be identifying which face pose the currently captured face image is in. It should be noted that there are many methods for identifying the face pose from the face image, mainly including model-based methods, face-appearance-based methods, classification-based methods, and the like. Taking the model-based method as an example, this method usually predefines the geometry of facial feature points, and determines the pose parameters by extracting the feature points and using the 3D-to-2D mapping relationship of the model.
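One common realization of such a model-based method is to fit a small set of predefined 3-D facial landmarks to their detected 2-D positions with a perspective-n-point solver. The sketch below uses OpenCV's solvePnP with generic landmark coordinates as an illustration only; the patent does not prescribe a particular model, landmark set, or solver.

```python
# Model-based face pose estimation via 3D-to-2D landmark fitting (illustrative sketch).
import cv2
import numpy as np

# Generic 3-D positions (in millimetres) of a few facial landmarks; illustrative values only.
MODEL_POINTS = np.array([
    (0.0,    0.0,    0.0),    # nose tip
    (0.0,  -63.6,  -12.5),    # chin
    (-43.3,  32.7, -26.0),    # left eye outer corner
    (43.3,   32.7, -26.0),    # right eye outer corner
    (-28.9, -28.9, -24.1),    # left mouth corner
    (28.9,  -28.9, -24.1),    # right mouth corner
], dtype=np.float64)

def estimate_face_pose(image_points, camera_matrix):
    """Recover head rotation/translation from the detected 2-D landmark positions
    (image_points: 6x2 float array ordered as MODEL_POINTS)."""
    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS, image_points, camera_matrix, None,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return None
    rotation_matrix, _ = cv2.Rodrigues(rvec)   # yaw/pitch can be read off this matrix
    return rotation_matrix, tvec
```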
In addition, judging whether the user's line of sight is directed toward the screen according to the eyeball distance parameter and the face pose may mean that the mobile terminal judges, under the identified face pose, whether the user's line of sight is directed toward the screen according to the eyeball distance parameter.
In this way, by judging the user's line of sight according to both the eyeball distance parameter and the face pose, the mobile terminal can improve the accuracy of the judgment.
Optionally, the horizontal distance value includes a first horizontal distance value from the eyeball center to a first-side canthus of the eye image feature, and/or a second horizontal distance value from the eyeball center to a second-side canthus of the eye image feature;
the step of judging, according to the eyeball distance parameter and the face pose, whether the user's line of sight is directed toward the screen includes:
if the face pose is a frontal pose, the first horizontal distance value is greater than a first distance threshold, and the second horizontal distance value is greater than a second distance threshold, determining that the user's line of sight is directed toward the screen, where, with the face image in the frontal pose, the difference between the first-side face image area and the second-side face image area of the face image is less than a first preset area threshold; or,
if the face pose is a first side-face pose and the first horizontal distance value is less than the first distance threshold, determining that the user's line of sight is directed toward the screen, where, in the first side-face pose, the first-side face image area of the face image is larger than the second-side face image area, and the difference between the first-side face image area and the second-side face image area is greater than a second preset area threshold; or,
if the face pose is a second side-face pose and the second horizontal distance value is less than the first distance threshold, determining that the user's line of sight is directed toward the screen, where, in the second side-face pose, the second-side face image area of the face image is larger than the first-side face image area, and the difference between the second-side face image area and the first-side face image area is greater than the second preset area threshold.
In this embodiment, the first-side canthus of the eye image feature may be the left canthus of the eye image feature, and the second-side canthus of the eye image feature may be the right canthus of the eye image feature.
Accordingly, the first horizontal distance value from the eyeball center to the first-side canthus of the eye image feature can be understood as the first horizontal distance value from the eyeball center to the left canthus, and the second horizontal distance value from the eyeball center to the second-side canthus can be understood as the second horizontal distance value from the eyeball center to the right canthus.
The frontal pose, as shown in Fig. 5, can be understood as the face pose presented when the user's head directly faces the screen; the first side-face pose, as shown in Fig. 6, can be understood as the face pose presented when the user's head faces the screen after turning to the right; the second side-face pose, as shown in Fig. 7, can be understood as the face pose presented when the user's head faces the screen after turning to the left.
Furthermore, the nose of the face may be taken as a dividing line to divide the face into two sides, where the left side of the dividing line is the first side and the right side of the dividing line is the second side; the first-side face image area can be understood as the area of the face image located on the left side of the dividing line, and the area of the face image located on the right side of the dividing line is the second-side face image area.
The first preset area threshold and the second preset area threshold may both be percentage parameters, and the second preset area threshold (as well as the third preset area threshold, where used) is greater than the first preset area threshold. Preferably, the first preset area threshold may be 5% to 20% and the second preset area threshold may be 20%; of course, the first preset area threshold and the second preset area threshold may also be other values, which is not limited here.
The first distance threshold and the second distance threshold are preset parameters; the logic for setting the first distance threshold and the second distance threshold is explained below.
Since the default position of the eyeball is usually at the middle of the eye socket, that is, at half of the spacing between the two canthi, when the user directs his or her line of sight toward the screen from different angles with different face poses in the horizontal direction, the eyeball only shifts to the left or right, its shifting range is confined within the eye socket, and the maximum shift usually does not exceed half of the spacing between the two canthi.
Therefore, preferably, the first distance threshold and the second distance threshold may be set with reference to half of the canthus spacing of the eye image feature.
Taking Fig. 3 as an example, the canthus spacing of the eye image feature is X, X1 is the first horizontal distance value, and X2 is the second horizontal distance value; the first distance threshold may be X/2 - α1 and the second distance threshold may be X/2 - α2, where α1 and α2 are preset parameters whose values can be adjusted according to experimental measurements.
If a user gazes at the phone screen from the front, the phone identifies that the user is in the frontal pose, X1 is greater than X/2 - α1, and X2 is greater than X/2 - α2, so the phone determines that the user's line of sight is directed toward the screen. If a user gazes at the phone screen after turning the face to the right, the phone identifies that the user is in the first side-face pose and X1 is less than X/2 - α1, so the phone determines that the user's line of sight is directed toward the screen. If a user gazes at the phone screen after turning the face to the left, the phone identifies that the user is in the second side-face pose and X2 is less than X/2 - α2, so the phone determines that the user's line of sight is directed toward the screen.
In this way, the mobile terminal judges the user's line of sight by combining the first horizontal distance value and the second horizontal distance value under the frontal pose, the first side-face pose, and the second side-face pose respectively; when the user's face presents different face poses in the horizontal direction, the judgment of the user's line of sight can be made more accurate.
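Under the threshold convention X/2 - α1 and X/2 - α2 introduced above, the horizontal-direction decision can be sketched as follows. The pose labels and parameter names are illustrative, and the side-face branches follow the worked phone example above (X1 compared against the first threshold, X2 against the second).

```python
def sight_towards_screen_horizontal(pose, x1, x2, canthus_spacing, alpha1, alpha2):
    """Horizontal gaze decision (sketch). pose is 'frontal', 'first_side' or 'second_side'."""
    t1 = canthus_spacing / 2 - alpha1   # first distance threshold  (X/2 - α1)
    t2 = canthus_spacing / 2 - alpha2   # second distance threshold (X/2 - α2)
    if pose == "frontal":
        return x1 > t1 and x2 > t2      # eyeball near the middle: looking straight at the screen
    if pose == "first_side":            # head turned to the right
        return x1 < t1                  # eyeball shifted toward the first-side canthus
    if pose == "second_side":           # head turned to the left
        return x2 < t2                  # eyeball shifted toward the second-side canthus
    return False
```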
Optionally, the vertical distance value includes a first vertical distance value from the eyeball center to the vertex of the upper eye socket, and/or a second vertical distance value from the eyeball center to the vertex of the lower eye socket;
the step of judging, according to the eyeball distance parameter and the face pose, whether the user's line of sight is directed toward the screen includes:
if the face pose is a looking-up pose and the first vertical distance value is less than a third distance threshold, determining that the user's line of sight is directed toward the screen; or,
if the face pose is a looking-down pose and the second vertical distance value is less than the third distance threshold, determining that the user's line of sight is directed toward the screen.
In this embodiment, the first vertical distance value may be the distance from the eyeball center to the vertex of the upper eye socket boundary curve of the eye image feature, and the second vertical distance value may be the distance from the eyeball center to the vertex of the lower eye socket boundary curve of the eye image feature.
As shown in Fig. 8, the looking-up pose may be the face pose presented when the user's head faces the screen after tilting upward; as shown in Fig. 9, the looking-down pose can be understood as the face pose presented when the user's head faces the screen after bending downward.
The third distance threshold and the fourth distance threshold may both be preset parameters; the logic for setting the third distance threshold and the fourth distance threshold is explained below.
Since the default position of the eyeball is usually at the middle of the eye socket, that is, at half of the spacing between the two canthi, when the user directs his or her line of sight toward the screen from different angles with different face poses in the vertical direction, the eyeball only shifts upward or downward, its shifting range is usually between the upper eye socket and the lower eye socket, and the maximum shift usually does not exceed half of the maximum spacing between the upper eye socket and the lower eye socket.
Therefore, preferably, the third distance threshold and the fourth distance threshold may be set with reference to half of the maximum spacing between the upper eye socket and the lower eye socket of the eye image feature. Taking the eye image feature shown in Fig. 4 as an example, if a user gazes at the phone screen after raising the head toward the screen, the phone identifies that the user is in the looking-up pose and Y1 is less than Y/2 - α1, so the phone determines that the user's line of sight is directed toward the screen. If a user gazes at the phone screen after lowering the head toward the screen, the phone identifies that the user is in the looking-down pose and Y2 is less than Y/2 - α1, so the phone determines that the user's line of sight is directed toward the screen.
In this way, the mobile terminal judges the user's line of sight by combining the first vertical distance value and the second vertical distance value under the looking-up pose and the looking-down pose respectively; when the user's face presents different face poses in the vertical direction, the judgment of the user's line of sight can be made more accurate.
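The vertical-direction decision can be sketched the same way, with the third threshold taken as half the maximum upper-to-lower eye socket spacing minus a tunable margin, as in the Fig. 4 example above; the names below are illustrative.

```python
def sight_towards_screen_vertical(pose, y1, y2, socket_spacing, alpha):
    """Vertical gaze decision (sketch). pose is 'looking_up' or 'looking_down'."""
    t3 = socket_spacing / 2 - alpha     # third distance threshold (Y/2 - α1 in the example)
    if pose == "looking_up":
        return y1 < t3                  # eyeball shifted toward the upper eye socket vertex
    if pose == "looking_down":
        return y2 < t3                  # eyeball shifted toward the lower eye socket vertex
    return False
```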
Step 204: judge, according to the eyeball distance parameter, whether the user's line of sight is directed toward the screen.
Judging, according to the eyeball distance parameter, whether the user's line of sight is directed toward the screen may mean that the mobile terminal judges whether the user's line of sight is directed toward the screen according to the measured eyeball distance parameter.
Step 205: if the user's line of sight is directed toward the screen, display preset content on the screen.
If the user's line of sight is not directed toward the screen, the judgment may continue, or the procedure may end; Fig. 2 in this embodiment of the present invention is illustrated with the procedure ending. Other related descriptions of step 205 have been elaborated in the first embodiment and are not repeated here.
Optionally, the screen of the mobile terminal is provided with a fingerprint recognition control, and the preset content includes a predetermined pattern for indicating the position of the fingerprint recognition control;
the step of displaying the preset content on the screen includes:
displaying the predetermined pattern in a preset region of the screen, the preset region being the screen region where the fingerprint recognition control is located.
Displaying the predetermined pattern in the preset region of the screen may mean that the mobile terminal lights up the preset region of the screen to display the predetermined pattern.
For example, after a user takes out a full-screen mobile phone and directs his or her line of sight toward the screen, a certain region of the screen displays a fingerprint pattern, and the fingerprint pattern is located at the position of the under-screen fingerprint recognition control of the phone.
Of course, the predetermined pattern may be another pattern, which is not specifically limited here, as long as it can be used to indicate the position of the fingerprint recognition control.
In this way, the mobile terminal can display on the screen a predetermined pattern for indicating the position of the fingerprint recognition control, so that the user can quickly find the position of the fingerprint recognition control, improving the efficiency of fingerprint unlocking.
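A minimal sketch of this optional display behaviour: when the gaze check passes, light only the screen region above the in-display fingerprint control and draw the prompt pattern there. The region coordinates and the display calls are hypothetical placeholders.

```python
# Show the fingerprint-prompt pattern only in the preset region of the fingerprint control
# (sketch; the display API and region coordinates are hypothetical).
FINGERPRINT_REGION = (440, 1900, 200, 200)   # x, y, width, height of the control's screen area

def show_fingerprint_prompt(display, pattern_bitmap):
    x, y, w, h = FINGERPRINT_REGION
    display.light_region(x, y, w, h)                  # light up only the preset region
    display.draw_bitmap(pattern_bitmap, x, y, w, h)   # draw the pattern indicating the control position
```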
In this embodiment of the present invention, the mobile terminal judges whether the user's line of sight is directed toward the screen according to the eyeball distance parameter, which can improve the accuracy of the judgment.
Referring to Figure 10, Figure 10 is a structural diagram of a mobile terminal according to an embodiment of the present invention. As shown in Figure 10, the mobile terminal 1000 includes a capture module 1001, an obtaining module 1002, a judging module 1003, and a display module 1004, where:
the capture module 1001 is configured to capture an image through a camera if the screen of the mobile terminal is in an off state;
the obtaining module 1002 is configured to obtain an eye image feature in a face image if the face image is present in the image;
the judging module 1003 is configured to judge, according to the eye image feature, whether the user's line of sight is directed toward the screen;
the display module 1004 is configured to display preset content on the screen if the user's line of sight is directed toward the screen.
Optionally, as shown in Figure 11, the judging module 1003 includes:
an obtaining unit 10031, configured to obtain an eyeball distance parameter of the eye image feature, where the eyeball distance parameter includes: a horizontal distance value from the eyeball center of the eye image feature to a canthus, and/or a vertical distance value from the eyeball center to an eye socket vertex of the eye image feature, the eye socket vertex being the vertex of the upper eye socket and/or the vertex of the lower eye socket;
a judging unit 10032, configured to judge, according to the eyeball distance parameter, whether the user's line of sight is directed toward the screen.
Optionally, as shown in Figure 12, the mobile terminal 1000 further includes:
an identification module 1005, configured to identify a face pose from the face image if the face image is present in the image;
the judging unit 10032 is configured to judge, according to the eyeball distance parameter and the face pose, whether the user's line of sight is directed toward the screen.
Optionally, the horizontal distance value includes a first horizontal distance value from the eyeball center to a first-side canthus of the eye image feature, and/or a second horizontal distance value from the eyeball center to a second-side canthus of the eye image feature;
the judging unit 10032 is configured to: if the face pose is a frontal pose, the first horizontal distance value is greater than a first distance threshold, and the second horizontal distance value is greater than a second distance threshold, determine that the user's line of sight is directed toward the screen, where, with the face image in the frontal pose, the difference between the first-side face image area and the second-side face image area of the face image is less than a first preset area threshold; or,
if the face pose is a first side-face pose and the first horizontal distance value is less than the first distance threshold, determine that the user's line of sight is directed toward the screen, where, in the first side-face pose, the first-side face image area of the face image is larger than the second-side face image area, and the difference between the first-side face image area and the second-side face image area is greater than a second preset area threshold; or,
if the face pose is a second side-face pose and the second horizontal distance value is less than the first distance threshold, determine that the user's line of sight is directed toward the screen, where, in the second side-face pose, the second-side face image area of the face image is larger than the first-side face image area, and the difference between the second-side face image area and the first-side face image area is greater than the second preset area threshold.
Optionally, the vertical distance value includes a first vertical distance value from the eyeball center to the vertex of the upper eye socket, and/or a second vertical distance value from the eyeball center to the vertex of the lower eye socket;
the judging unit 10032 is configured to: if the face pose is a looking-up pose and the first vertical distance value is less than a third distance threshold, determine that the user's line of sight is directed toward the screen; or,
if the face pose is a looking-down pose and the second vertical distance value is less than the third distance threshold, determine that the user's line of sight is directed toward the screen.
Optionally, as shown in Figure 13, the capture module 1001 includes:
a detection unit 10011, configured to perform detection by a distance sensor if the screen of the mobile terminal is in an off state;
a capture unit 10012, configured to capture an image through the camera if the presence of an object is detected within a preset distance range centered on the mobile terminal.
Optionally, the screen of the mobile terminal is provided with a fingerprint recognition control, and the preset content includes a predetermined pattern for indicating the position of the fingerprint recognition control;
the display module 1004 is configured to display the predetermined pattern in a preset region of the screen, the preset region being the screen region where the fingerprint recognition control is located.
The mobile terminal provided in this embodiment of the present invention can implement each process implemented by the mobile terminal in the above method embodiments and can achieve the same beneficial effects; to avoid repetition, details are not described here again.
Referring to Figure 14, Figure 14 is a schematic diagram of the hardware structure of a mobile terminal for implementing the embodiments of the present invention.
The mobile terminal 1400 includes, but is not limited to: a radio frequency unit 1401, a network module 1402, an audio output unit 1403, an input unit 1404, a sensor 1405, a display unit 1406, a user input unit 1407, an interface unit 1408, a memory 1409, a processor 1410, a power supply 1411, and other components. A person skilled in the art can understand that the mobile terminal structure shown in Figure 14 does not constitute a limitation on the mobile terminal, and the mobile terminal may include more or fewer components than shown, combine certain components, or have a different component arrangement. In the embodiments of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle terminal, a wearable device, a pedometer, and the like.
The processor 1410 is configured to capture an image through a camera if the screen of the mobile terminal is in an off state;
if a face image is present in the image, obtain an eye image feature in the face image;
judge, according to the eye image feature, whether the user's line of sight is directed toward the screen;
and if the user's line of sight is directed toward the screen, display preset content on the screen.
Optionally, the judging, performed by the processor 1410 according to the eye image feature, whether the user's line of sight is directed toward the screen includes:
obtaining an eyeball distance parameter of the eye image feature, where the eyeball distance parameter includes: a horizontal distance value from the eyeball center of the eye image feature to a canthus, and/or a vertical distance value from the eyeball center to an eye socket vertex of the eye image feature, the eye socket vertex being the vertex of the upper eye socket and/or the vertex of the lower eye socket;
and judging, according to the eyeball distance parameter, whether the user's line of sight is directed toward the screen.
Optionally, the processor 1410 is further configured to: if a face image is present in the image, identify a face pose from the face image;
and the judging, performed by the processor 1410 according to the eyeball distance parameter, whether the user's line of sight is directed toward the screen includes:
judging, according to the eyeball distance parameter and the face pose, whether the user's line of sight is directed toward the screen.
Optionally, the horizontal distance value includes a first horizontal distance value from the eyeball center to a first-side canthus of the eye image feature, and/or a second horizontal distance value from the eyeball center to a second-side canthus of the eye image feature;
the judging, performed by the processor 1410 according to the eyeball distance parameter and the face pose, whether the user's line of sight is directed toward the screen includes:
if the face pose is a frontal pose, the first horizontal distance value is greater than a first distance threshold, and the second horizontal distance value is greater than a second distance threshold, determining that the user's line of sight is directed toward the screen, where, with the face image in the frontal pose, the difference between the first-side face image area and the second-side face image area of the face image is less than a first preset area threshold;
if the face pose is a first side-face pose and the first horizontal distance value is less than the first distance threshold, determining that the user's line of sight is directed toward the screen, where, in the first side-face pose, the first-side face image area of the face image is larger than the second-side face image area, and the difference between the first-side face image area and the second-side face image area is greater than a second preset area threshold;
if the face pose is a second side-face pose and the second horizontal distance value is less than the first distance threshold, determining that the user's line of sight is directed toward the screen, where, in the second side-face pose, the second-side face image area of the face image is larger than the first-side face image area, and the difference between the second-side face image area and the first-side face image area is greater than the second preset area threshold.
Optionally, the vertical distance value includes a first vertical distance value from the eyeball center to the vertex of the upper eye socket, and/or a second vertical distance value from the eyeball center to the vertex of the lower eye socket;
the judging, performed by the processor 1410 according to the eyeball distance parameter and the face pose, whether the user's line of sight is directed toward the screen includes:
if the face pose is a looking-up pose and the first vertical distance value is less than a third distance threshold, determining that the user's line of sight is directed toward the screen;
if the face pose is a looking-down pose and the second vertical distance value is less than the third distance threshold, determining that the user's line of sight is directed toward the screen.
Optionally, the capturing, performed by the processor 1410, of an image through a camera if the screen of the mobile terminal is in an off state includes:
if the screen of the mobile terminal is in the off state, performing detection by a distance sensor;
and if the presence of an object is detected within a preset distance range centered on the mobile terminal, capturing an image through the camera.
Optionally, the screen of the mobile terminal is provided with a fingerprint recognition control, and the preset content includes a predetermined pattern for indicating the position of the fingerprint recognition control;
the displaying, performed by the processor 1410, of the preset content on the screen includes:
displaying the predetermined pattern in a preset region of the screen, the preset region being the screen region where the fingerprint recognition control is located.
The mobile terminal 1400 saves the power consumption of the mobile terminal and has the same beneficial effects as the method embodiments.
It should be understood that, in the embodiments of the present invention, the radio frequency unit 1401 may be used to receive and send signals during an information receiving/sending process or a call process; specifically, after receiving downlink data from a base station, it forwards the data to the processor 1410 for processing, and it also sends uplink data to the base station. Generally, the radio frequency unit 1401 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 1401 may also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband Internet access through the network module 1402, for example, helping the user to send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 1403 may convert audio data received by the radio frequency unit 1401 or the network module 1402, or stored in the memory 1409, into an audio signal and output it as sound. Moreover, the audio output unit 1403 may also provide audio output related to a specific function performed by the mobile terminal 1400 (for example, a call signal reception sound or a message reception sound). The audio output unit 1403 includes a loudspeaker, a buzzer, a receiver, and the like.
The input unit 1404 is configured to receive audio or video signals. The input unit 1404 may include a graphics processing unit (GPU) 14041 and a microphone 14042. The graphics processing unit 14041 processes image data of still pictures or video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 1406. The image frames processed by the graphics processing unit 14041 may be stored in the memory 1409 (or another storage medium) or sent via the radio frequency unit 1401 or the network module 1402. The microphone 14042 may receive sound and process it into audio data; in a phone call mode, the processed audio data may be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 1401 and output.
The mobile terminal 1400 further includes at least one sensor 1405, such as an optical sensor, a motion sensor, and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor, where the ambient light sensor can adjust the brightness of the display panel 14061 according to the ambient light, and the proximity sensor can turn off the display panel 14061 and/or the backlight when the mobile terminal 1400 is moved close to the ear. As a motion sensor, an accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes) and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the mobile terminal (such as landscape/portrait switching, related games, and magnetometer pose calibration) and for vibration-recognition related functions (such as a pedometer or tapping). The sensor 1405 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described here.
The display unit 1406 is configured to display information input by the user or information provided to the user. The display unit 1406 may include a display panel 14061, and the display panel 14061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 1407 may be configured to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 1407 includes a touch panel 14071 and other input devices 14072. The touch panel 14071, also referred to as a touch screen, collects touch operations by the user on or near it (for example, operations performed by the user on or near the touch panel 14071 with a finger, a stylus, or any other suitable object or accessory). The touch panel 14071 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the touch position of the user and the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection apparatus, converts it into contact coordinates, sends the coordinates to the processor 1410, and receives and executes commands sent by the processor 1410. In addition, the touch panel 14071 may be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 14071, the user input unit 1407 may further include other input devices 14072. Specifically, the other input devices 14072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and a switch key), a trackball, a mouse, and a joystick, which are not described here.
Further, the touch panel 14071 may cover the display panel 14061. After the touch panel 14071 detects a touch operation on or near it, the operation is transmitted to the processor 1410 to determine the type of the touch event, and the processor 1410 then provides a corresponding visual output on the display panel 14061 according to the type of the touch event. Although in Figure 14 the touch panel 14071 and the display panel 14061 are shown as two independent components to implement the input and output functions of the mobile terminal, in some embodiments the touch panel 14071 and the display panel 14061 may be integrated to implement the input and output functions of the mobile terminal, which is not specifically limited here.
Interface unit 1408 is the interface that external device (ED) is connect with mobile terminal 1400.For example, external device (ED) may include Wired or wireless headphone port, external power supply (or battery charger) port, wired or wireless data port, storage card Port, the port for connecting the device with identification module, the port audio input/output (I/O), video i/o port, earphone Port etc..Interface unit 1408 can be used for receiving the input (for example, data information, electric power etc.) from external device (ED) simultaneously And by one or more elements that the input received is transferred in mobile terminal 1400 or it can be used in mobile terminal Transmission data between 1400 and external device (ED).
Memory 1409 can be used for storing software program and various data.Memory 1409 can include mainly storage program Area and storage data field, wherein storing program area can storage program area, needed at least one function application program (such as Sound-playing function, image player function etc.) etc.;Storage data field can be stored uses created data (ratio according to mobile phone Such as audio data, phone directory) etc..In addition, memory 1409 may include high-speed random access memory, can also include non- Volatile memory, for example, at least a disk memory, flush memory device or other volatile solid-state parts.
The processor 1410 is the control center of the mobile terminal. It connects all parts of the entire mobile terminal through various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 1409 and invoking the data stored in the memory 1409, thereby monitoring the mobile terminal as a whole. The processor 1410 may include one or more processing units; preferably, the processor 1410 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 1410.
The mobile terminal 1400 may further include a power supply 1411 (such as a battery) that supplies power to the components. Preferably, the power supply 1411 may be logically connected to the processor 1410 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
In addition, the mobile terminal 1400 includes some function modules that are not shown, and details are not described here.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 1410, a memory 1409, and a computer program stored in the memory 1409 and executable on the processor 1410. When the computer program is executed by the processor 1410, each process of the foregoing information display method embodiments is implemented, and the same technical effect can be achieved. To avoid repetition, details are not described here again.
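As an informal illustration of how such a stored program could string the claimed steps together, the following Kotlin sketch is offered. Every interface and name in it is a hypothetical placeholder rather than the patent's implementation; it simply follows the claimed order of operations: screen off, optional proximity check, camera capture, face detection, eye feature extraction, sight judgment, and display of preset content.

```kotlin
// Illustrative sketch of the claimed flow; every interface and name here is hypothetical.
interface Camera { fun captureImage(): ByteArray? }
interface ProximitySensor { fun objectWithinPresetDistance(): Boolean }
interface FaceAnalyzer {
    fun findFace(image: ByteArray): ByteArray?           // returns a face image, or null if none
    fun eyeFeatures(faceImage: ByteArray): EyeFeatures?   // eye image features, if they can be extracted
}
interface Screen {
    val isOff: Boolean
    fun showPresetContent()                               // e.g. time or notification information
}

data class EyeFeatures(val horizontalDistancePx: Float, val verticalDistancePx: Float)

class InformationDisplayController(
    private val camera: Camera,
    private val proximitySensor: ProximitySensor,
    private val faceAnalyzer: FaceAnalyzer,
    private val screen: Screen,
    private val sightTowardsScreen: (EyeFeatures) -> Boolean
) {
    fun onPollWhileScreenOff() {
        if (!screen.isOff) return
        // Claim 5: only power up the camera when something is near the terminal.
        if (!proximitySensor.objectWithinPresetDistance()) return
        val image = camera.captureImage() ?: return
        val face = faceAnalyzer.findFace(image) ?: return        // claim 1: is a face present?
        val eyes = faceAnalyzer.eyeFeatures(face) ?: return      // claim 1: obtain eye features
        if (sightTowardsScreen(eyes)) {                          // claim 1: sight judgment
            screen.showPresetContent()                           // claim 1: display preset content
        }
    }
}
```

The sight judgment itself is deliberately left as an injected function here; one possible reading of it is sketched after the claims below.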
It should be noted that, in this document, the terms "include", "comprise", or any other variants thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements includes not only those elements but also other elements that are not explicitly listed, or further includes elements inherent to such a process, method, article, or apparatus. In the absence of further limitations, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus that includes the element.
Through the foregoing description of the embodiments, a person skilled in the art can clearly understand that the methods of the foregoing embodiments may be implemented by software plus a necessary general-purpose hardware platform, and certainly may also be implemented by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present invention essentially, or the part contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for enabling a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present invention.
The embodiments of the present invention are described above with reference to the accompanying drawings, but the present invention is not limited to the foregoing specific embodiments. The foregoing specific embodiments are merely illustrative rather than restrictive. Inspired by the present invention, a person skilled in the art may make many other forms without departing from the purpose of the present invention and the scope of protection of the claims, all of which fall within the protection of the present invention.

Claims (11)

1. An information display method, applied to a mobile terminal, characterized by comprising:
if a screen of the mobile terminal is in an off state, acquiring an image through a camera;
if a face image exists in the image, obtaining an eye image feature in the face image;
judging, according to the eye image feature, whether the sight of a user is towards the screen; and
if the sight of the user is towards the screen, displaying preset content on the screen.
2. The method according to claim 1, characterized in that the step of judging, according to the eye image feature, whether the sight of the user is towards the screen comprises:
obtaining an eyeball distance parameter of the eye image feature, wherein the eyeball distance parameter comprises: a horizontal distance value from an eyeball center of the eye image feature to an eye corner, and/or a vertical distance value from the eyeball center to an eye socket vertex of the eye image feature, the eye socket vertex being a vertex of the upper eye socket and/or a vertex of the lower eye socket; and
judging, according to the eyeball distance parameter, whether the sight of the user is towards the screen.
3. The method according to claim 2, characterized in that, after the step of acquiring an image through a camera if the screen of the mobile terminal is in the off state, the method further comprises:
if a face image exists in the image, recognizing a face pose from the face image;
and the step of judging, according to the eyeball distance parameter, whether the sight of the user is towards the screen comprises:
judging, according to the eyeball distance parameter and the face pose, whether the sight of the user is towards the screen.
4. The method according to claim 3, characterized in that
the horizontal distance value comprises a first horizontal distance value from the eyeball center to a first-side eye corner of the eye image feature, and/or a second horizontal distance value from the eyeball center to a second-side eye corner of the eye image feature, and the step of judging, according to the eyeball distance parameter and the face pose, whether the sight of the user is towards the screen comprises:
if the face pose is a frontal pose, the first horizontal distance value is greater than a first distance threshold, and the second horizontal distance value is greater than a second distance threshold, determining that the sight of the user is towards the screen, wherein when the face image is in the frontal pose, the difference between a first-side face image area and a second-side face image area of the face image is less than a first preset area threshold; or,
if the face pose is a first-side face pose and the first horizontal distance value is less than the first distance threshold, determining that the sight of the user is towards the screen, wherein in the first-side face pose, the first-side face image area of the face image is larger than the second-side face image area, and the difference between the first-side face image area and the second-side face image area is greater than a second preset area threshold; or,
if the face pose is a second-side face pose and the second horizontal distance value is less than the first distance threshold, determining that the sight of the user is towards the screen, wherein in the second-side face pose, the second-side face image area of the face image is larger than the first-side face image area, and the difference between the second-side face image area and the first-side face image area is greater than the second preset area threshold;
or,
the vertical distance value comprises a first vertical distance value from the eyeball center to the vertex of the upper eye socket, and/or a second vertical distance value from the eyeball center to the vertex of the lower eye socket, and the step of judging, according to the eyeball distance parameter and the face pose, whether the sight of the user is towards the screen comprises:
if the face pose is an upward-looking pose and the first vertical distance value is less than a third distance threshold, determining that the sight of the user is towards the screen; or,
if the face pose is a downward-looking pose and the second vertical distance value is less than the third distance threshold, determining that the sight of the user is towards the screen.
5. The method according to claim 1, characterized in that the step of acquiring an image through a camera if the screen of the mobile terminal is in the off state comprises:
if the screen of the mobile terminal is in the off state, performing detection through a distance sensor; and
if an object is detected within a preset distance range centered on the mobile terminal, acquiring the image through the camera.
6. A mobile terminal, characterized by comprising:
an acquisition module, configured to acquire an image through a camera if a screen of the mobile terminal is in an off state;
an obtaining module, configured to obtain an eye image feature in a face image if the face image exists in the image;
a judgment module, configured to judge, according to the eye image feature, whether the sight of a user is towards the screen; and
a display module, configured to display preset content on the screen if the sight of the user is towards the screen.
7. The mobile terminal according to claim 6, characterized in that the judgment module comprises:
an obtaining unit, configured to obtain an eyeball distance parameter of the eye image feature, wherein the eyeball distance parameter comprises: a horizontal distance value from an eyeball center of the eye image feature to an eye corner, and/or a vertical distance value from the eyeball center to an eye socket vertex of the eye image feature, the eye socket vertex being a vertex of the upper eye socket and/or a vertex of the lower eye socket; and
a judging unit, configured to judge, according to the eyeball distance parameter, whether the sight of the user is towards the screen.
8. The mobile terminal according to claim 7, characterized in that the mobile terminal further comprises:
a recognition module, configured to recognize a face pose from the face image if the face image exists in the image; and
the judging unit is configured to judge, according to the eyeball distance parameter and the face pose, whether the sight of the user is towards the screen.
9. The mobile terminal according to claim 8, characterized in that
the horizontal distance value comprises a first horizontal distance value from the eyeball center to a first-side eye corner of the eye image feature, and/or a second horizontal distance value from the eyeball center to a second-side eye corner of the eye image feature; the judging unit is configured to: if the face pose is a frontal pose, the first horizontal distance value is greater than a first distance threshold, and the second horizontal distance value is greater than a second distance threshold, determine that the sight of the user is towards the screen, wherein when the face image is in the frontal pose, the difference between a first-side face image area and a second-side face image area of the face image is less than a first preset area threshold;
or,
the judging unit is configured to: if the face pose is a first-side face pose and the first horizontal distance value is less than the first distance threshold, determine that the sight of the user is towards the screen, wherein in the first-side face pose, the first-side face image area of the face image is larger than the second-side face image area, and the difference between the first-side face image area and the second-side face image area is greater than a second preset area threshold; or, the judging unit is configured to: if the face pose is a second-side face pose and the second horizontal distance value is less than the first distance threshold, determine that the sight of the user is towards the screen, wherein in the second-side face pose, the second-side face image area of the face image is larger than the first-side face image area, and the difference between the second-side face image area and the first-side face image area is greater than the second preset area threshold;
or,
the vertical distance value comprises a first vertical distance value from the eyeball center to the vertex of the upper eye socket, and/or a second vertical distance value from the eyeball center to the vertex of the lower eye socket; the judging unit is configured to: if the face pose is an upward-looking pose and the first vertical distance value is less than a third distance threshold, determine that the sight of the user is towards the screen; or, the judging unit is configured to: if the face pose is a downward-looking pose and the second vertical distance value is less than the third distance threshold, determine that the sight of the user is towards the screen.
10. The mobile terminal according to claim 6, characterized in that the acquisition module comprises:
a detection unit, configured to perform detection through a distance sensor if the screen of the mobile terminal is in the off state; and
a collection unit, configured to acquire the image through the camera if an object is detected within a preset distance range centered on the mobile terminal.
11. A mobile terminal, characterized by comprising: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein when the processor executes the computer program, the steps of the information display method according to any one of claims 1 to 5 are implemented.
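One possible reading of the pose-dependent tests in claims 2 to 4 is the small decision procedure sketched below in Kotlin. The type names, units, and threshold values are illustrative assumptions only; the claims fix the comparisons but not concrete values or an implementation.

```kotlin
// Illustrative decision procedure for claims 2-4; names, units and thresholds are hypothetical.
enum class FacePose { FRONTAL, FIRST_SIDE, SECOND_SIDE, LOOKING_UP, LOOKING_DOWN }

// Eyeball distance parameters of claim 2: eyeball-center-to-eye-corner horizontal
// distances and eyeball-center-to-orbit-vertex vertical distances (e.g. in pixels).
data class EyeballDistances(
    val toFirstSideCorner: Float,   // first horizontal distance value
    val toSecondSideCorner: Float,  // second horizontal distance value
    val toUpperOrbitVertex: Float,  // first vertical distance value
    val toLowerOrbitVertex: Float   // second vertical distance value
)

data class Thresholds(
    val firstDistance: Float,       // first distance threshold
    val secondDistance: Float,      // second distance threshold
    val thirdDistance: Float        // third distance threshold
)

// Claim 4: whether the user's sight is towards the screen depends on the face pose.
fun sightTowardsScreen(pose: FacePose, d: EyeballDistances, t: Thresholds): Boolean =
    when (pose) {
        // Frontal pose: both horizontal distances exceed their thresholds,
        // i.e. the eyeball sits roughly centred between the two eye corners.
        FacePose.FRONTAL ->
            d.toFirstSideCorner > t.firstDistance && d.toSecondSideCorner > t.secondDistance
        // First-side face pose: the eyeball has shifted towards the first-side corner.
        FacePose.FIRST_SIDE ->
            d.toFirstSideCorner < t.firstDistance
        // Second-side face pose: the eyeball has shifted towards the second-side corner.
        FacePose.SECOND_SIDE ->
            d.toSecondSideCorner < t.firstDistance
        // Upward-looking pose: the eyeball is close to the upper eye socket vertex.
        FacePose.LOOKING_UP ->
            d.toUpperOrbitVertex < t.thirdDistance
        // Downward-looking pose: the eyeball is close to the lower eye socket vertex.
        FacePose.LOOKING_DOWN ->
            d.toLowerOrbitVertex < t.thirdDistance
    }
```

Under this reading, a frontal face is judged to be looking at the screen when the pupil stays clear of both eye corners, while a turned or tilted face is judged to be looking at the screen only when the pupil has shifted towards the corresponding eye corner or eye socket vertex.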
CN201810252200.4A 2018-03-26 2018-03-26 Information display method and mobile terminal Active CN108509037B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810252200.4A CN108509037B (en) 2018-03-26 2018-03-26 Information display method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810252200.4A CN108509037B (en) 2018-03-26 2018-03-26 Information display method and mobile terminal

Publications (2)

Publication Number Publication Date
CN108509037A true CN108509037A (en) 2018-09-07
CN108509037B CN108509037B (en) 2021-04-02

Family

ID=63378473

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810252200.4A Active CN108509037B (en) 2018-03-26 2018-03-26 Information display method and mobile terminal

Country Status (1)

Country Link
CN (1) CN108509037B (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109407845A (en) * 2018-10-30 2019-03-01 盯盯拍(深圳)云技术有限公司 Screen exchange method and screen interactive device
CN109725715A (en) * 2018-11-16 2019-05-07 深圳惠智光电子科技有限公司 A kind of eye exercise bootstrap technique, electronic equipment and storage medium
CN109753776A (en) * 2018-12-29 2019-05-14 维沃移动通信有限公司 A kind of method, apparatus and mobile terminal of information processing
CN109933186A (en) * 2019-01-22 2019-06-25 西北大学 A kind of mobile web browser energy consumption optimization method adjusted based on screen intensity
CN109977836A (en) * 2019-03-19 2019-07-05 维沃移动通信有限公司 A kind of information collecting method and terminal
CN110544317A (en) * 2019-08-29 2019-12-06 联想(北京)有限公司 Image processing method, image processing device, electronic equipment and readable storage medium
CN110658906A (en) * 2019-08-30 2020-01-07 华为技术有限公司 Display method and electronic equipment
CN110929286A (en) * 2019-11-20 2020-03-27 四川虹美智能科技有限公司 Method for dynamically detecting operation authorization and intelligent equipment
CN111381788A (en) * 2018-12-28 2020-07-07 百度(美国)有限责任公司 Method and system for disabling a display of a smart display device according to a vision-based mechanism
CN112351230A (en) * 2020-10-22 2021-02-09 深圳Tcl新技术有限公司 Intelligent screen, method for automatically switching display frame rate of intelligent screen and storage medium
CN113190119A (en) * 2021-05-06 2021-07-30 Tcl通讯(宁波)有限公司 Mobile terminal screen lighting control method and device, mobile terminal and storage medium
CN113628579A (en) * 2021-08-09 2021-11-09 深圳市优聚显示技术有限公司 LED energy-saving display method, LED display screen system and LCD display equipment
CN113821106A (en) * 2021-10-08 2021-12-21 江苏铁锚玻璃股份有限公司 Intelligent function navigation method and structure based on intelligent transparent OLED vehicle window
US20210406522A1 (en) * 2020-06-28 2021-12-30 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for waking up device, electronic device, and storage medium
CN113885708A (en) * 2021-10-22 2022-01-04 Oppo广东移动通信有限公司 Screen control method and device of electronic equipment, electronic equipment and storage medium
CN114779916A (en) * 2022-03-29 2022-07-22 杭州海康威视数字技术股份有限公司 Electronic equipment screen awakening method, access control management method and device
CN116112597A (en) * 2020-09-03 2023-05-12 荣耀终端有限公司 Off-screen display method, electronic device and storage medium
WO2023130927A1 (en) * 2022-01-10 2023-07-13 荣耀终端有限公司 Always on display control method, electronic device, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101620359A (en) * 2008-07-04 2010-01-06 华晶科技股份有限公司 Judging method of eye sight line
CN105303170A (en) * 2015-10-16 2016-02-03 浙江工业大学 Human eye feature based sight line estimation method
CN106547358A (en) * 2016-11-25 2017-03-29 维沃移动通信有限公司 A kind of display packing and terminal of terminal time information
US20170131778A1 (en) * 2015-11-10 2017-05-11 Motorola Mobility Llc Method and System for Audible Delivery of Notifications Partially Presented on an Always-On Display
CN106897713A (en) * 2017-03-13 2017-06-27 宇龙计算机通信科技(深圳)有限公司 A kind of method and mobile terminal for waking up mobile terminal screen

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101620359A (en) * 2008-07-04 2010-01-06 华晶科技股份有限公司 Judging method of eye sight line
CN105303170A (en) * 2015-10-16 2016-02-03 浙江工业大学 Human eye feature based sight line estimation method
US20170131778A1 (en) * 2015-11-10 2017-05-11 Motorola Mobility Llc Method and System for Audible Delivery of Notifications Partially Presented on an Always-On Display
CN106547358A (en) * 2016-11-25 2017-03-29 维沃移动通信有限公司 A kind of display packing and terminal of terminal time information
CN106897713A (en) * 2017-03-13 2017-06-27 宇龙计算机通信科技(深圳)有限公司 A kind of method and mobile terminal for waking up mobile terminal screen

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109407845A (en) * 2018-10-30 2019-03-01 盯盯拍(深圳)云技术有限公司 Screen exchange method and screen interactive device
CN109725715B (en) * 2018-11-16 2024-04-05 深圳惠智光电子科技有限公司 Eye training guiding method, electronic equipment and storage medium
CN109725715A (en) * 2018-11-16 2019-05-07 深圳惠智光电子科技有限公司 A kind of eye exercise bootstrap technique, electronic equipment and storage medium
CN111381788B (en) * 2018-12-28 2023-10-13 百度(美国)有限责任公司 Method and system for disabling a display of a smart display device according to a vision-based mechanism
CN111381788A (en) * 2018-12-28 2020-07-07 百度(美国)有限责任公司 Method and system for disabling a display of a smart display device according to a vision-based mechanism
CN109753776A (en) * 2018-12-29 2019-05-14 维沃移动通信有限公司 A kind of method, apparatus and mobile terminal of information processing
CN109753776B (en) * 2018-12-29 2021-01-08 维沃移动通信有限公司 Information processing method and device and mobile terminal
CN109933186A (en) * 2019-01-22 2019-06-25 西北大学 A kind of mobile web browser energy consumption optimization method adjusted based on screen intensity
CN109933186B (en) * 2019-01-22 2023-04-07 西北大学 Mobile web browser energy consumption optimization method based on screen brightness adjustment
CN109977836A (en) * 2019-03-19 2019-07-05 维沃移动通信有限公司 A kind of information collecting method and terminal
CN109977836B (en) * 2019-03-19 2022-04-15 维沃移动通信有限公司 Information acquisition method and terminal
CN110544317A (en) * 2019-08-29 2019-12-06 联想(北京)有限公司 Image processing method, image processing device, electronic equipment and readable storage medium
CN110658906A (en) * 2019-08-30 2020-01-07 华为技术有限公司 Display method and electronic equipment
WO2021036555A1 (en) * 2019-08-30 2021-03-04 华为技术有限公司 Display method and electronic device
CN110929286A (en) * 2019-11-20 2020-03-27 四川虹美智能科技有限公司 Method for dynamically detecting operation authorization and intelligent equipment
US20210406522A1 (en) * 2020-06-28 2021-12-30 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for waking up device, electronic device, and storage medium
US11580781B2 (en) * 2020-06-28 2023-02-14 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for waking up device, electronic device, and storage medium
CN116112597A (en) * 2020-09-03 2023-05-12 荣耀终端有限公司 Off-screen display method, electronic device and storage medium
CN116112597B (en) * 2020-09-03 2023-10-20 荣耀终端有限公司 Electronic equipment with off-screen display function, method for displaying off-screen interface of electronic equipment and storage medium
US11823603B2 (en) 2020-09-03 2023-11-21 Honor Device Co., Ltd. Always-on-display method and electronic device
CN112351230A (en) * 2020-10-22 2021-02-09 深圳Tcl新技术有限公司 Intelligent screen, method for automatically switching display frame rate of intelligent screen and storage medium
CN113190119A (en) * 2021-05-06 2021-07-30 Tcl通讯(宁波)有限公司 Mobile terminal screen lighting control method and device, mobile terminal and storage medium
CN113628579A (en) * 2021-08-09 2021-11-09 深圳市优聚显示技术有限公司 LED energy-saving display method, LED display screen system and LCD display equipment
CN113821106A (en) * 2021-10-08 2021-12-21 江苏铁锚玻璃股份有限公司 Intelligent function navigation method and structure based on intelligent transparent OLED vehicle window
CN113885708A (en) * 2021-10-22 2022-01-04 Oppo广东移动通信有限公司 Screen control method and device of electronic equipment, electronic equipment and storage medium
WO2023130927A1 (en) * 2022-01-10 2023-07-13 荣耀终端有限公司 Always on display control method, electronic device, and storage medium
CN114779916A (en) * 2022-03-29 2022-07-22 杭州海康威视数字技术股份有限公司 Electronic equipment screen awakening method, access control management method and device
CN114779916B (en) * 2022-03-29 2024-06-11 杭州海康威视数字技术股份有限公司 Electronic equipment screen awakening method, access control management method and device

Also Published As

Publication number Publication date
CN108509037B (en) 2021-04-02

Similar Documents

Publication Publication Date Title
CN108509037A (en) A kind of method for information display and mobile terminal
CN108427876A (en) A kind of fingerprint identification method and mobile terminal
CN109151180A (en) A kind of object identifying method and mobile terminal
CN108182896B (en) A kind of brightness detection method, device and mobile terminal
CN108920059A (en) Message treatment method and mobile terminal
CN108366220A (en) A kind of video calling processing method and mobile terminal
CN108427873A (en) A kind of biological feather recognition method and mobile terminal
CN109409244A (en) A kind of object puts the output method and mobile terminal of scheme
CN107831891A (en) A kind of brightness adjusting method and mobile terminal
CN108650408A (en) A kind of unlocking screen method and mobile terminal
CN108307110A (en) A kind of image weakening method and mobile terminal
CN108958593A (en) A kind of method and mobile terminal of determining communication object
CN110018805A (en) A kind of display control method and mobile terminal
CN109669611A (en) Fitting method and terminal
CN108804170A (en) The wearing of intelligent wearable device determines method, intelligent wearable device and storage medium
CN107704812A (en) A kind of face identification method and mobile terminal
CN109756626A (en) A kind of based reminding method and mobile terminal
CN109671034A (en) A kind of image processing method and terminal device
CN109525837A (en) The generation method and mobile terminal of image
CN109544445A (en) A kind of image processing method, device and mobile terminal
CN109448069A (en) A kind of template generation method and mobile terminal
CN109976688A (en) A kind of screen display method and device
CN109918006A (en) A kind of screen control method and mobile terminal
CN109544172A (en) A kind of display methods and terminal device
CN109033912A (en) A kind of recognition methods of identification code and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant