CN105824422A - Information processing method and electronic equipment - Google Patents

Information processing method and electronic equipment

Info

Publication number
CN105824422A
CN105824422A
Authority
CN
China
Prior art keywords
display object
area
display
gaze
user
Prior art date
Legal status
Granted
Application number
CN201610195384.6A
Other languages
Chinese (zh)
Other versions
CN105824422B (en)
Inventor
杨春龙 (Yang Chunlong)
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201610195384.6A priority Critical patent/CN105824422B/en
Publication of CN105824422A publication Critical patent/CN105824422A/en
Application granted granted Critical
Publication of CN105824422B publication Critical patent/CN105824422B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/16 Sound input; Sound output
    • G06F 3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an information processing method and an electronic device. The information processing method comprises the following steps: acquiring a first gaze direction of a user's eyes with respect to a display interface; determining a first display object from the displayed image on the basis of the first gaze direction; acquiring position information of the first display object; judging, on the basis of the position information, whether the first display object is within a first preset area, and generating a first judgment result; and, when the first judgment result is negative, displaying the first display object in the first preset area in a magnified form. The method and the electronic device solve the prior-art problem that operation is inconvenient because the position at which the object of the user's interest is magnified is fixed.

Description

Information processing method and electronic device
Technical field
The present invention relates to the field of electronic technology, and in particular to an information processing method and an electronic device.
Background art
Eye tracking, also known as gaze tracking, tracks eye movement by detecting the user's eye-movement information (including the position of the point of fixation, the fixation time, the number of fixations, and so on). Eye-tracking technology is becoming more and more mature and its range of applications ever wider. In the field of human-computer interaction, when a user uses an electronic device with an eye-tracking function, control instructions can be generated from the user's eye-movement information to control the device.
In the prior art, when the user gazes at a certain area of the display interface for longer than a preset duration, that area is magnified and displayed in place.
However, in the course of implementing the technical solutions of the embodiments of the present application, the inventor found that the above technology has at least the following technical problem: because the position at which the object of the user's interest is magnified is fixed in the prior art, it does not match the user's usage habits, and operation is inconvenient.
Summary of the invention
The present invention provides an information processing method and an electronic device, to solve the prior-art technical problem that operation is inconvenient because the position at which the object of the user's interest is magnified is fixed.
In one aspect, an embodiment of the present invention provides an information processing method, including:
acquiring a first gaze direction of a user's eyes with respect to a display interface;
determining a first display object from the displayed image on the basis of the first gaze direction;
acquiring position information of the first display object;
judging, on the basis of the position information, whether the first display object is within a first preset area, and generating a first judgment result;
when the first judgment result is negative, displaying the first display object in the first preset area in a magnified form.
Optionally, determining the first display object from the displayed image on the basis of the first gaze direction specifically includes:
determining a target area from the display interface on the basis of the first gaze direction;
determining the first display object on the basis of the target area.
Optionally, determining the first display object on the basis of the target area specifically includes:
determining, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold;
taking the image displayed in the first area as the first display object.
Optionally, determining the first display object on the basis of the target area specifically includes:
determining, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold;
extracting an operable identifier from the first area, wherein, when the user operates on the operable identifier, an operation instruction corresponding to the operable identifier can be executed;
taking the operable identifier as the first display object.
Optionally, before the first display object is displayed in the first preset area in a magnified form, the method further includes:
judging whether an operation enabling the magnification function is detected, to obtain a second judgment result;
when the second judgment result is affirmative, displaying the first display object in the first preset area in a magnified form.
Optionally, before the first display object is displayed in the first preset area in a magnified form, the method further includes:
acquiring the size of the first display object;
judging whether the size is less than a preset size, to obtain a third judgment result;
when the third judgment result is affirmative, displaying the first display object in the first preset area in a magnified form.
Optionally, after the first display object is displayed in the first preset area in a magnified form, the method further includes:
detecting a second gaze direction of the eyes with respect to the first display object and a fixation time;
determining a target identifier from the first display object on the basis of the second gaze direction and the fixation time, wherein, when the user operates on the target identifier, an operation instruction corresponding to the target identifier can be executed;
when the fixation time exceeds a preset duration, executing the operation instruction corresponding to the target identifier.
In another aspect, an embodiment of the present invention provides an electronic device, including:
a housing;
a storage unit, arranged in the housing and configured to store at least one program module;
at least one processor, arranged in the housing and connected to the storage unit, where the at least one processor, by acquiring and running the at least one program module from the storage unit, is configured to: acquire a first gaze direction of a user's eyes with respect to a display interface; determine a first display object from the displayed image on the basis of the first gaze direction; acquire position information of the first display object; judge, on the basis of the position information, whether the first display object is within a first preset area and generate a first judgment result; and, when the first judgment result is negative, display the first display object in the first preset area in a magnified form.
Optionally, the at least one processor is further configured to:
determine a target area from the display interface on the basis of the first gaze direction;
determine the first display object on the basis of the target area.
Optionally, the at least one processor is further configured to:
determine, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold;
take the image displayed in the first area as the first display object.
Optionally, the at least one processor is further configured to:
determine, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold;
extract an operable identifier from the first area, wherein, when the user operates on the operable identifier, an operation instruction corresponding to the operable identifier can be executed;
take the operable identifier as the first display object.
Optionally, the at least one processor is further configured to:
before the first display object is displayed in the first preset area in a magnified form, judge whether an operation enabling the magnification function is detected, to obtain a second judgment result;
when the second judgment result is affirmative, display the first display object in the first preset area in a magnified form.
Optionally, the at least one processor is further configured to:
before the first display object is displayed in the first preset area in a magnified form, acquire the size of the first display object;
judge whether the size is less than a preset size, to obtain a third judgment result;
when the third judgment result is affirmative, display the first display object in the first preset area in a magnified form.
Optionally, the at least one processor is further configured to:
after the first display object is displayed in the first preset area in a magnified form, detect a second gaze direction of the eyes with respect to the first display object and a fixation time;
determine a target identifier from the first display object on the basis of the second gaze direction and the fixation time, wherein, when the user operates on the target identifier, an operation instruction corresponding to the target identifier can be executed;
when the fixation time exceeds a preset duration, execute the operation instruction corresponding to the target identifier.
One or more of the above technical solutions in the embodiments of the present application have at least one or more of the following technical effects:
First, the technical solutions of the embodiments of the present application adopt the technical means of acquiring a first gaze direction of the user's eyes with respect to the display interface; determining a first display object from the displayed image on the basis of the first gaze direction; acquiring position information of the first display object; judging, on the basis of the position information, whether the first display object is within a first preset area and generating a first judgment result; and, when the first judgment result is negative, displaying the first display object in the first preset area in a magnified form. In this way, the preset area can be set according to the user's usage habits and the object of the user's interest is magnified and displayed in that preset area, achieving the technical effect of convenient operation.
Second, the technical solutions of the embodiments of the present application adopt the technical means of determining, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold, and taking the image displayed in the first area as the first display object. Because an electronic device with an eye-tracking function in the prior art is not always accurate when acquiring eye-movement information, the object it determines may not be the object the user actually intends to operate on. After the target area of the user's interest is determined, the area near the target area is magnified, so the operation object can then be determined accurately from the user's gaze information within the magnified area, achieving the technical effect of improving the accuracy of the electronic device.
Third, the technical solutions of the embodiments of the present application adopt the technical means of determining, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold; extracting an operable identifier from the first area, wherein, when the user operates on the operable identifier, an operation instruction corresponding to the operable identifier can be executed; and taking the operable identifier as the first display object. The first area near the target area is thus determined from the target area of the user's interest, the background is removed from the first area, and the operable identifiers are extracted and magnified. This reduces the number of objects the user has to gaze at among the first display objects, so the user can easily determine the object of interest, which further improves the accuracy of the electronic device and the user's experience.
Fourth, the technical solutions of the embodiments of the present application adopt the technical means of judging whether an operation enabling the magnification function is detected, to obtain a second judgment result, and, when the second judgment result is affirmative, displaying the first display object in the first preset area in a magnified form. In this way, the first display object is magnified only after the user confirms it, according to the user's needs, which avoids magnifying the first display object when the user does not need it and effectively reduces erroneous operation.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application.
Fig. 1 is a flowchart of an information processing method provided in Embodiment 1 of the present application;
Fig. 2 is a structural diagram of an electronic device provided in Embodiment 2 of the present application;
Fig. 3 is a structural diagram of an electronic device provided in Embodiment 3 of the present application.
Detailed description of the invention
The embodiments of the present application provide an information processing method and an electronic device, to solve the prior-art technical problem that operation is inconvenient because the position at which the object of the user's interest is magnified is fixed.
The technical solutions in the embodiments of the present application address the above technical problem; the general idea is as follows:
acquiring a first gaze direction of a user's eyes with respect to a display interface;
determining a first display object from the displayed image on the basis of the first gaze direction;
acquiring position information of the first display object;
judging, on the basis of the position information, whether the first display object is within a first preset area, and generating a first judgment result;
when the first judgment result is negative, displaying the first display object in the first preset area in a magnified form.
The technical solutions in the embodiments of the present application adopt the technical means of judging, after the first display object of the user's interest is determined, whether the first display object is at a preset position and, if it is not, magnifying the first display object and displaying it at the preset position. In this way, the position of magnified display can be set according to the user's usage habits, achieving the technical effect of convenient operation.
The information processing method provided in the embodiments of the present application can be applied to an electronic device that controls itself according to the user's eye-movement information. The electronic device may be a mobile phone, a digital camera, a tablet computer, a notebook computer or another electronic device; no exhaustive enumeration is given here, and the above is only a schematic illustration.
The main implementation principle and specific embodiments of the technical solutions of the embodiments of the present application, and the beneficial effects they can achieve, are explained in detail below with reference to the accompanying drawings.
Embodiment one
Referring to Fig. 1, an embodiment of the present application provides an information processing method, including:
S101: acquiring a first gaze direction of a user's eyes with respect to a display interface;
S102: determining a first display object from the displayed image on the basis of the first gaze direction;
In the embodiment of the present application, after the first gaze direction is acquired, the object of the user's interest (i.e. the first display object) can be determined according to the gaze direction from among the objects the user gazes at.
S103: acquiring position information of the first display object;
S104: judging, on the basis of the position information, whether the first display object is within a first preset area, and generating a first judgment result;
S105: when the first judgment result is negative, displaying the first display object in the first preset area in a magnified form.
In the embodiment of the present application, after the first display object is determined, its position information is acquired, and it is judged whether the first display object is at the position where the user needs it to be displayed (i.e. the first preset area); if not, the first display object is magnified and displayed in the first preset area.
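The patent describes this flow only in prose; as a reading aid, the following minimal Python sketch walks through steps S101 to S105. The Rect type and the get_gaze_direction, find_display_object and magnify_into helpers are assumptions standing in for the device's own facilities, not part of the disclosure.

    from dataclasses import dataclass

    @dataclass
    class Rect:
        x: float
        y: float
        w: float
        h: float

        def contains(self, other: "Rect") -> bool:
            # True if `other` lies entirely inside this rectangle.
            return (other.x >= self.x and other.y >= self.y and
                    other.x + other.w <= self.x + self.w and
                    other.y + other.h <= self.y + self.h)

    def process_gaze(display, eye_tracker, first_preset_area: Rect):
        gaze_direction = eye_tracker.get_gaze_direction()              # S101
        display_object = display.find_display_object(gaze_direction)   # S102
        position = display_object.bounds                               # S103: a Rect
        in_preset_area = first_preset_area.contains(position)          # S104
        if not in_preset_area:                                         # S105
            display.magnify_into(display_object, first_preset_area)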
Optionally, determining the first display object from the displayed image on the basis of the first gaze direction specifically includes:
determining a target area from the display interface on the basis of the first gaze direction;
determining the first display object on the basis of the target area.
In the embodiment of the present application, after the first gaze direction of the eyes with respect to the display interface is acquired, the position of the display interface at which the user is gazing (i.e. the target area) can be determined according to the first gaze direction.
Two preferred ways of determining the target area are presented below:
First: at least one area gazed at by the eyes is acquired on the basis of the first gaze direction;
from the at least one area, the area with the longest fixation time is determined to be the target area.
When the user gazes at the display interface, the longer the user's eyes fixate on an area, the higher the user's attention to that area. Therefore, in the embodiment of the present application, the target area can be determined according to the gaze direction and the fixation time: at least one area gazed at by the eyes is determined according to the gaze direction, the time the user gazes at each area is acquired, the area with the longest fixation time is selected from the at least one area, and that area is taken as the target area, i.e. the target area of the user's interest.
Specifically, when the user gazes at the display interface, the first gaze direction in which the user gazes at the display interface and the time spent gazing at each area are acquired. Suppose the areas the user gazes at are areas A, B and C of the display interface, with fixation times of 0.5 s for area A, 0.1 s for area B and 1 s for area C. It is then determined that area C has been gazed at for the longest time, and area C is taken as the target area.
Second: at least one area gazed at by the eyes is acquired on the basis of the first gaze direction;
from the at least one area, the area fixated the most times per unit time is determined to be the target area.
When the eyes gaze at the display area, interference factors may cause the user's gaze to jump under the disturbance. Therefore, in the embodiment of the present application, the target area may also be determined according to the number of fixations on an area per unit time: at least one area gazed at by the eyes is determined according to the gaze direction, the number of times each area is gazed at within a unit time is acquired, the area with the most fixations is selected from the at least one area, and that area is taken as the target area.
Specifically, suppose the areas the user gazes at within 2 s are areas A, B and C, gazed at 4 times, 1 time and 1 time respectively. Area A is then determined to have been gazed at the most times within the unit time, so area A is taken as the target area.
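As a reading aid, the two selection rules above can be summarised in a short Python sketch; the (area, fixation time) sample format is an assumption about how the eye tracker might report fixations and is not taken from the patent.

    from collections import Counter, defaultdict

    def target_by_dwell(fixations):
        # Rule 1: the area with the longest total fixation time wins.
        dwell = defaultdict(float)
        for area_id, duration in fixations:
            dwell[area_id] += duration
        return max(dwell, key=dwell.get)

    def target_by_count(fixations):
        # Rule 2: the area fixated the most times within the unit time wins.
        counts = Counter(area_id for area_id, _ in fixations)
        return counts.most_common(1)[0][0]

    # With the example values from the text:
    print(target_by_dwell([("A", 0.5), ("B", 0.1), ("C", 1.0)]))           # -> "C"
    print(target_by_count([("A", 0.2)] * 4 + [("B", 0.3), ("C", 0.4)]))    # -> "A"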
After the target area is obtained, the first display object can be determined from it. There are specifically the following three ways of determining the first display object:
First: the target area itself is taken as the first display object.
Second: determining the first display object on the basis of the target area specifically includes:
determining, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold;
taking the image displayed in the first area as the first display object.
In the embodiment of the present application, taking a predetermined threshold of 2 cm as an example, after the target area is determined, the region within 2 cm of the target area is determined, and the image displayed in that region is taken as the first display object.
Because the accuracy of the eye-movement information recorded by an existing electronic device while processing visual information is not very high, the object of interest determined by the device may not be the object the user actually cares about. With the method in the embodiment of the present application, the region near the object of the user's interest is determined to be the first display object, and after magnification the user selects from the first display object again. In this way the technical effect of improving the accuracy of the electronic device is achieved.
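A minimal sketch of how the first area might be formed by padding the target area with the predetermined threshold, assuming bounding boxes given as (x, y, width, height) tuples in pixels and an assumed screen density for the centimetre-to-pixel conversion:

    def expand_by_threshold(target, threshold_cm=2.0, dpi=96.0):
        # target is an (x, y, width, height) tuple in pixels; the padded tuple
        # covers everything within `threshold_cm` of the target area.
        pad = threshold_cm / 2.54 * dpi   # convert the 2 cm threshold to pixels
        x, y, w, h = target
        return (x - pad, y - pad, w + 2 * pad, h + 2 * pad)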
Third: determining the first display object on the basis of the target area specifically includes:
determining, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold;
extracting an operable identifier from the first area, wherein, when the user operates on the operable identifier, an operation instruction corresponding to the operable identifier can be executed;
taking the operable identifier as the first display object.
In the embodiment of the present application, taking a predetermined threshold of 2 cm as an example, after the target area is determined, the region within 2 cm of the target area is determined to be the first area, and the operable identifiers in the first area are extracted. If the operable identifiers include icon A, icon B, icon C and icon D, these icons are extracted as the first display object.
With the method of the embodiment of the present application, the operable identifiers in the determined region are extracted and only then magnified, so the background image of the region can be excluded. This makes it easy for the user to determine the object of interest from the displayed objects, improves the user experience, and further improves the precision of the electronic device.
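A minimal sketch of this third variant, assuming the interface exposes a list of widgets with a bounding box and an attached command; only operable identifiers inside the first area are kept, so the background is simply not copied:

    def extract_operable_identifiers(widgets, first_area):
        # widgets: iterable of (name, bounds, command) with bounds = (x, y, w, h);
        # first_area is also (x, y, w, h).
        fx, fy, fw, fh = first_area
        def inside(bounds):
            x, y, w, h = bounds
            return x >= fx and y >= fy and x + w <= fx + fw and y + h <= fy + fh
        return [(name, bounds, command) for name, bounds, command in widgets
                if command is not None      # it can actually be operated on
                and inside(bounds)]         # and it lies inside the first area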
Optionally, before the first display object is displayed in the first preset area in a magnified form, the method further includes:
judging whether an operation enabling the magnification function is detected, to obtain a second judgment result;
when the second judgment result is affirmative, displaying the first display object in the first preset area in a magnified form.
The operation enabling the magnification function may be a press of a programmable key on a keyboard. In the embodiment of the present application, taking a press of the "Ctrl" key on the keyboard as the operation that starts the magnification function, after the first display object is determined the electronic device detects whether the user has pressed the "Ctrl" key; if it detects that the "Ctrl" key has been pressed, the first display object is magnified and displayed in the first preset area.
The operation enabling the magnification function may also be the user's input of preset voice information to the electronic device. In the embodiment of the present application, taking the preset voice information "magnify" as an example, after the first display object is determined the electronic device detects whether voice information has been received; when voice information is detected, the voice input by the user is extracted and recognized, and if it is recognized as "magnify", the first display object is magnified and displayed in the first preset area.
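A minimal sketch of this second judgment, assuming hypothetical keyboard and speech helpers and an assumed magnify_into call; the "Ctrl" key and the word "magnify" follow the examples in the text:

    def magnification_enabled(keyboard, speech) -> bool:
        # Either input channel counts as the enabling operation.
        if keyboard.is_pressed("ctrl"):
            return True
        utterance = speech.last_recognized_text()
        return utterance is not None and utterance.strip() == "magnify"

    def maybe_magnify(display, display_object, preset_area, keyboard, speech):
        if magnification_enabled(keyboard, speech):   # second judgment result is "yes"
            display.magnify_into(display_object, preset_area)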
Optionally, before the first display object is displayed in the first preset area in a magnified form, the method further includes:
acquiring the size of the first display object;
judging whether the size is less than a preset size, to obtain a third judgment result;
when the third judgment result is affirmative, displaying the first display object in the first preset area in a magnified form.
In the embodiment of the present application, before the first display object is magnified, it is also judged whether the size of the first display object is less than the preset size. If the size of the first display object is less than the preset size, the first display object is too small for the user to view comfortably, so it needs to be magnified. If the size of the first display object is greater than the preset size, the first display object is already large enough to view comfortably; magnifying it further might make its magnified size exceed the display interface, so that the magnified first display object could not be displayed completely, giving the user a poor experience. The method of the embodiment of the present application can therefore further improve the user experience.
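A minimal sketch of this third judgment; the preset size value and the "either dimension too small" rule are illustrative assumptions, since the patent does not fix them:

    PRESET_WIDTH, PRESET_HEIGHT = 120, 120   # assumed preset size in pixels

    def needs_magnification(width, height) -> bool:
        # Magnify only if the first display object is smaller than the preset size.
        return width < PRESET_WIDTH or height < PRESET_HEIGHT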
Optionally, after the first display object is displayed in the first preset area in a magnified form, the method further includes:
detecting a second gaze direction of the eyes with respect to the first display object and a fixation time;
determining a target identifier from the first display object on the basis of the second gaze direction and the fixation time, wherein, when the user operates on the target identifier, an operation instruction corresponding to the target identifier can be executed;
when the fixation time exceeds a preset duration, executing the operation instruction corresponding to the target identifier.
After the first display object is magnified and displayed in the first preset area, the user can conveniently carry out further operations on the magnified first display object.
In the embodiment of the present application, suppose the first display object includes icon A, icon B and icon C, and the preset duration is 2 s. After the first display object is magnified, the objects being gazed at are determined from the user's gaze direction with respect to the magnified first display object, together with the fixation time for each object. If the user gazes at icon A for 3 s, at icon B for 0 s and at icon C for 1 s, it can be determined that the time for which the user gazes at icon A exceeds the preset duration, so the operation instruction corresponding to icon A can be executed.
With the method of the embodiment of the present application, the object of the user's actual interest can be determined accurately from the user's eye-movement information with respect to the magnified first display object, improving the accuracy of the electronic device's eye tracking.
Embodiment two
Referring to Fig. 2, an embodiment of the present application also provides an electronic device, including:
a housing 201;
a storage unit 202, arranged in the housing 201 and configured to store at least one program module;
at least one processor 203, arranged in the housing 201 and connected to the storage unit 202, where the at least one processor 203, by acquiring and running the at least one program module from the storage unit 202, is configured to: acquire a first gaze direction of a user's eyes with respect to a display interface; determine a first display object from the displayed image on the basis of the first gaze direction; acquire position information of the first display object; judge, on the basis of the position information, whether the first display object is within a first preset area and generate a first judgment result; and, when the first judgment result is negative, display the first display object in the first preset area in a magnified form.
Optionally, the at least one processor is further configured to:
determine a target area from the display interface on the basis of the first gaze direction;
determine the first display object on the basis of the target area.
Optionally, the at least one processor is further configured to:
determine, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold;
take the image displayed in the first area as the first display object.
Optionally, the at least one processor is further configured to:
determine, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold;
extract an operable identifier from the first area, wherein, when the user operates on the operable identifier, an operation instruction corresponding to the operable identifier can be executed;
take the operable identifier as the first display object.
Optionally, the at least one processor is further configured to:
before the first display object is displayed in the first preset area in a magnified form, judge whether an operation enabling the magnification function is detected, to obtain a second judgment result;
when the second judgment result is affirmative, display the first display object in the first preset area in a magnified form.
Optionally, the at least one processor is further configured to:
before the first display object is displayed in the first preset area in a magnified form, acquire the size of the first display object;
judge whether the size is less than a preset size, to obtain a third judgment result;
when the third judgment result is affirmative, display the first display object in the first preset area in a magnified form.
Optionally, the at least one processor is further configured to:
after the first display object is displayed in the first preset area in a magnified form, detect a second gaze direction of the eyes with respect to the first display object and a fixation time;
determine a target identifier from the first display object on the basis of the second gaze direction and the fixation time, wherein, when the user operates on the target identifier, an operation instruction corresponding to the target identifier can be executed;
when the fixation time exceeds a preset duration, execute the operation instruction corresponding to the target identifier.
Embodiment three
Referring to Fig. 3, an embodiment of the present application also provides an electronic device, including:
an acquiring unit 301, configured to acquire a first gaze direction of a user's eyes with respect to a display interface;
a determining unit 302, configured to determine a first display object from the displayed image on the basis of the first gaze direction;
the acquiring unit 301 being further configured to acquire position information of the first display object;
a judging unit 303, configured to judge, on the basis of the position information, whether the first display object is within a first preset area and generate a first judgment result;
a magnifying unit 304, configured to, when the first judgment result is negative, display the first display object in the first preset area in a magnified form.
Optionally, the determining unit 302 is further configured to:
determine a target area from the display interface on the basis of the first gaze direction;
determine the first display object on the basis of the target area.
Optionally, the determining unit 302 is further configured to:
determine, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold;
take the image displayed in the first area as the first display object.
Optionally, the determining unit 302 is further configured to:
determine, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold;
extract an operable identifier from the first area, wherein, when the user operates on the operable identifier, an operation instruction corresponding to the operable identifier can be executed;
take the operable identifier as the first display object.
Optionally, the judging unit 303 is further configured to judge, before the first display object is displayed in the first preset area in a magnified form, whether an operation enabling the magnification function is detected, to obtain a second judgment result;
the magnifying unit 304 is further configured to display the first display object in the first preset area in a magnified form when the second judgment result is affirmative.
Optionally, the acquiring unit 301 is further configured to acquire the size of the first display object before the first display object is displayed in the first preset area in a magnified form;
the judging unit 303 is further configured to judge whether the size is less than a preset size, to obtain a third judgment result;
the magnifying unit 304 is further configured to display the first display object in the first preset area in a magnified form when the third judgment result is affirmative.
Optionally, the acquiring unit 301 is further configured to detect, after the first display object is displayed in the first preset area in a magnified form, a second gaze direction of the eyes with respect to the first display object and a fixation time;
the determining unit 302 is further configured to determine a target identifier from the first display object on the basis of the second gaze direction and the fixation time, wherein, when the user operates on the target identifier, an operation instruction corresponding to the target identifier can be executed;
the electronic device further includes an executing unit 305, configured to execute, when the fixation time exceeds the preset duration, the operation instruction corresponding to the target identifier.
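As a reading aid, the following Python sketch shows one way units 301 to 305 could be wired together; the class and method names are invented for illustration and only mirror the functional split described above:

    class InformationProcessor:
        def __init__(self, acquiring, determining, judging, magnifying, executing):
            self.acquiring = acquiring      # unit 301
            self.determining = determining  # unit 302
            self.judging = judging          # unit 303
            self.magnifying = magnifying    # unit 304
            self.executing = executing      # unit 305

        def run(self, preset_area):
            direction = self.acquiring.first_gaze_direction()
            obj = self.determining.first_display_object(direction)
            position = self.acquiring.position_of(obj)
            if not self.judging.in_preset_area(position, preset_area):
                self.magnifying.magnify(obj, preset_area)
            self.executing.run_dwell_command(obj)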
One or more of the above technical solutions in the embodiments of the present application have at least one or more of the following technical effects:
First, the technical solutions of the embodiments of the present application adopt the technical means of acquiring a first gaze direction of the user's eyes with respect to the display interface; determining a first display object from the displayed image on the basis of the first gaze direction; acquiring position information of the first display object; judging, on the basis of the position information, whether the first display object is within a first preset area and generating a first judgment result; and, when the first judgment result is negative, displaying the first display object in the first preset area in a magnified form. In this way, the preset area can be set according to the user's usage habits and the object of the user's interest is magnified and displayed in that preset area, achieving the technical effect of convenient operation.
Second, the technical solutions of the embodiments of the present application adopt the technical means of determining, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold, and taking the image displayed in the first area as the first display object. Because an electronic device with an eye-tracking function in the prior art is not always accurate when acquiring eye-movement information, the object it determines may not be the object of the user's actual interest. After the target area of the user's interest is determined, the area near the target area is magnified, so the object of interest can then be determined accurately from the user's gaze information within the magnified area, achieving the technical effect of improving the accuracy of the electronic device.
Third, the technical solutions of the embodiments of the present application adopt the technical means of determining, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold; extracting an operable identifier from the first area, wherein, when the user operates on the operable identifier, an operation instruction corresponding to the operable identifier can be executed; and taking the operable identifier as the first display object. The first area near the target area is thus determined from the target area of the user's interest, the background is removed from the first area, and the operable identifiers are extracted and magnified. This reduces the number of objects the user has to gaze at, so the user can easily determine the object of interest, which further improves the accuracy of the electronic device and the user's experience.
Fourth, the technical solutions of the embodiments of the present application adopt the technical means of judging whether an operation enabling the magnification function is detected, to obtain a second judgment result, and, when the second judgment result is affirmative, displaying the first display object in the first preset area in a magnified form. In this way, the first display object is magnified only after the user confirms it, according to the user's needs, which avoids magnifying the first display object when the user does not need it and effectively reduces erroneous operation.
Those skilled in the art should appreciate that embodiments of the present invention may be provided as a method, a system or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the method, the device (system) and the computer program product according to the embodiments of the present application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions can be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or the other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or another programmable data processing device to work in a specific way, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus which implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operational steps is performed on the computer or the other programmable device to produce computer-implemented processing, and thus the instructions executed on the computer or the other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Specifically, the computer program instructions corresponding to the information processing method in the embodiments of the present application may be stored on a storage medium such as an optical disc, a hard disk or a USB flash drive. When the computer program instructions on the storage medium corresponding to the information processing method are read or executed by an electronic device, the following steps are included:
collecting a current environment parameter of the environment in which the electronic device is located; determining, from at least one object displayed in a scene, a first object matching the current environment parameter; obtaining, on the basis of the current environment parameter, target display parameters of the first object; and displaying the first object in the scene with the target display parameters.
Optionally, the computer program instructions corresponding to the step "determining, from at least one object displayed in a scene, a first object matching the current environment parameter", when executed, specifically include the following steps:
determining, from the at least one object, a first object matching the current environment parameter; or
generating, on the basis of the at least one object, a first object matching the current environment parameter.
Optionally, the computer program instructions corresponding to the step "determining, from the at least one object, a first object matching the current environment parameter", when executed, specifically include the following steps:
determining, from the at least one object, a first object corresponding to the current environment parameter on the basis of a first correspondence between environment parameters and objects.
Optionally, the computer program instructions corresponding to the step "generating, on the basis of the at least one object, a first object matching the current environment parameter", when executed, specifically include the following steps:
determining a first object corresponding to the current environment parameter on the basis of a second correspondence between environment parameters and objects;
judging whether the first object exists in the at least one object, to obtain a first judgment result;
when the first judgment result is negative, generating the first object.
Optionally, the computer program instructions corresponding to the step "obtaining, on the basis of the current environment parameter, target display parameters of the first object", when executed, specifically include the following steps:
when the current environment parameter is a current wind parameter of the environment in which the electronic device is located, obtaining a current movement direction and a current movement speed of the first object on the basis of the current wind parameter, wherein the wind parameter characterizes the wind direction and the wind force of the environment in which the electronic device is located; or
when the current environment parameter is the current brightness of the environment in which the electronic device is located, obtaining a target display brightness of the first object on the basis of the current brightness; or
when the current environment parameter is the current depth of field of the environment in which the electronic device is located, obtaining a target display size of the first object on the basis of the current depth of field.
Optionally, the computer program instructions corresponding to the step "displaying the first object in the scene with the target display parameters", when executed, specifically include the following steps:
dynamically displaying the first object in the scene in the current movement direction and at the current movement speed; or
displaying the first object in the scene with the target display brightness; or
displaying the first object in the scene with the target display size.
Although preferred embodiments of the present invention have been described, those skilled in the art, once they learn the basic inventive concept, can make other changes and modifications to these embodiments. Therefore, the appended claims are intended to be construed as including the preferred embodiments and all changes and modifications falling within the scope of the present invention.
Obviously, those skilled in the art can make various changes and variations to the present invention without departing from the spirit and scope of the present invention. Thus, if these modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalent technologies, the present invention is also intended to include these changes and variations.

Claims (15)

1. An information processing method, comprising:
acquiring a first gaze direction of a user's eyes with respect to a display interface;
determining a first display object from the displayed image on the basis of the first gaze direction;
acquiring position information of the first display object;
judging, on the basis of the position information, whether the first display object is within a first preset area, and generating a first judgment result;
when the first judgment result is negative, displaying the first display object in the first preset area in a magnified form.
2. The method according to claim 1, characterised in that determining the first display object from the displayed image on the basis of the first gaze direction specifically comprises:
determining a target area from the display interface on the basis of the first gaze direction;
determining the first display object on the basis of the target area.
3. The method according to claim 2, characterised in that determining the first display object on the basis of the target area specifically comprises:
determining, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold;
taking the image displayed in the first area as the first display object.
4. The method according to claim 2, characterised in that determining the first display object on the basis of the target area specifically comprises:
determining, on the basis of the target area, a first area whose distance from the target area is less than a predetermined threshold;
extracting an operable identifier from the first area, wherein, when the user operates on the operable identifier, an operation instruction corresponding to the operable identifier can be executed;
taking the operable identifier as the first display object.
5. The method according to claim 1, characterised in that before the first display object is displayed in the first preset area in a magnified form, the method further comprises:
judging whether an operation enabling the magnification function is detected, to obtain a second judgment result;
when the second judgment result is affirmative, displaying the first display object in the first preset area in a magnified form.
6. The method according to claim 1, characterised in that before the first display object is displayed in the first preset area in a magnified form, the method further comprises:
acquiring the size of the first display object;
judging whether the size is less than a preset size, to obtain a third judgment result;
when the third judgment result is affirmative, displaying the first display object in the first preset area in a magnified form.
7. The method according to claim 1, characterised in that after the first display object is displayed in the first preset area in a magnified form, the method further comprises:
detecting a second gaze direction of the eyes with respect to the first display object and a fixation time;
determining a target identifier from the first display object on the basis of the second gaze direction and the fixation time, wherein, when the user operates on the target identifier, an operation instruction corresponding to the target identifier can be executed;
when the fixation time exceeds a preset duration, executing the operation instruction corresponding to the target identifier.
8. an electronic equipment, including:
Housing;
Memory element, is arranged in described housing, is used for storing at least one program module;
At least one processor, it is arranged in described housing, being connected with described memory element, at least one processor described is by obtaining and run at least one program module described from described memory element, for obtaining the eyeball first direction of gaze for display interface of user;From described display image, the first display object is determined based on described first direction of gaze;Obtain the positional information of described first display object;Based on described positional information, it is judged that whether described first display object is in the first predeterminable area, and generates the first judged result;When described first judged result is no, described first display object is shown enlarged in described first predeterminable area.
9. The electronic device according to claim 8, wherein the at least one processor is further configured to:
determine a target area from the display interface based on the first gaze direction;
determine the first display object based on the target area.
10. The electronic device according to claim 9, wherein the at least one processor is further configured to:
determine, based on the target area, a first area whose distance from the target area is less than a preset threshold;
take the image displayed in the first area as the first display object.
11. The electronic device according to claim 9, wherein the at least one processor is further configured to:
determine, based on the target area, a first area whose distance from the target area is less than a preset threshold;
extract an operable identifier from the first area, wherein an operation instruction corresponding to the operable identifier can be executed when the user operates on the operable identifier;
take the operable identifier as the first display object.
12. The electronic device according to claim 8, wherein the at least one processor is further configured to:
before the first display object is enlarged and displayed in the first preset area, judge whether an operation for enabling an enlarging function is detected, to obtain a second judgment result;
when the second judgment result is yes, enlarge and display the first display object in the first preset area.
13. The electronic device according to claim 8, wherein the at least one processor is further configured to:
before the first display object is enlarged and displayed in the first preset area, obtain a size of the first display object;
judge whether the size is less than a preset size, to obtain a third judgment result;
when the third judgment result is yes, enlarge and display the first display object in the first preset area.
14. The electronic device according to claim 8, wherein the at least one processor is further configured to:
after the first display object is enlarged and displayed in the first preset area, detect a second gaze direction and a gaze duration of the eyeball with respect to the first display object;
determine a target identifier from the first display object based on the second gaze direction and the gaze duration, wherein an operation instruction corresponding to the target identifier can be executed when the user operates on the target identifier;
when the gaze duration exceeds a preset duration, execute the operation instruction corresponding to the target identifier.
15. An electronic device, including:
an acquiring unit, configured to obtain a first gaze direction of an eyeball of a user with respect to a display interface;
a determining unit, configured to determine a first display object from the display interface based on the first gaze direction;
wherein the acquiring unit is further configured to obtain position information of the first display object;
a judging unit, configured to judge, based on the position information, whether the first display object is located in a first preset area, and to generate a first judgment result;
an enlarging unit, configured to, when the first judgment result is no, enlarge and display the first display object in the first preset area.
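Claim 15 expresses the same behaviour as a set of functional units rather than a configured processor. Purely as an illustration of how such a decomposition might look in software, the sketch below wires an acquiring, determining, judging and enlarging unit together; the class and method names are assumptions, not the claimed structure.

```python
# Illustrative sketch only of a unit-style decomposition: acquiring,
# determining, judging and enlarging units cooperating on one gaze event.

class AcquiringUnit:
    def gaze_direction(self, eye_tracker):
        return eye_tracker()                      # first gaze direction (a screen point here)
    def position_info(self, display_object):
        return display_object["bounds"]           # position information of the object

class DeterminingUnit:
    def first_display_object(self, display_interface, gaze_point):
        # pick the interface element whose bounds contain the gaze point
        x, y = gaze_point
        for obj in display_interface:
            l, t, r, b = obj["bounds"]
            if l <= x < r and t <= y < b:
                return obj
        return None

class JudgingUnit:
    def in_preset_area(self, bounds, preset_area):
        l, t, r, b = bounds
        return preset_area[0] <= l and preset_area[1] <= t and r <= preset_area[2] and b <= preset_area[3]

class EnlargingUnit:
    def enlarge(self, display_object, preset_area):
        print(f"enlarging {display_object['name']} into {preset_area}")

if __name__ == "__main__":
    interface = [{"name": "share_icon", "bounds": (10, 10, 42, 42)}]
    acq, det, judge, zoom = AcquiringUnit(), DeterminingUnit(), JudgingUnit(), EnlargingUnit()
    gaze = acq.gaze_direction(lambda: (20, 20))
    obj = det.first_display_object(interface, gaze)
    if obj and not judge.in_preset_area(acq.position_info(obj), (200, 200, 800, 600)):
        zoom.enlarge(obj, (200, 200, 800, 600))
```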
CN201610195384.6A 2016-03-30 2016-03-30 A kind of information processing method and electronic equipment Active CN105824422B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610195384.6A CN105824422B (en) 2016-03-30 2016-03-30 A kind of information processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN105824422A true CN105824422A (en) 2016-08-03
CN105824422B CN105824422B (en) 2019-07-26

Family

ID=56525430

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610195384.6A Active CN105824422B (en) 2016-03-30 2016-03-30 A kind of information processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN105824422B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102106145A (en) * 2008-07-30 2011-06-22 三星电子株式会社 Apparatus and method for displaying an enlarged target region of a reproduced image
CN103869946A (en) * 2012-12-14 2014-06-18 联想(北京)有限公司 Display control method and electronic device
CN104216624A (en) * 2013-05-30 2014-12-17 联想(北京)有限公司 Display method and electronic device
CN105335061A (en) * 2015-09-23 2016-02-17 小米科技有限责任公司 Information display method and apparatus and terminal

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106791135A (en) * 2016-12-29 2017-05-31 努比亚技术有限公司 A kind of automatic local Zoom display method and mobile terminal
CN112799573A (en) * 2017-09-22 2021-05-14 创新先进技术有限公司 Message display method and device
CN108563330A (en) * 2018-03-30 2018-09-21 百度在线网络技术(北京)有限公司 Using open method, device, equipment and computer-readable medium
CN109062409A (en) * 2018-07-27 2018-12-21 华勤通讯技术有限公司 The control method and system of client, mobile terminal
CN109271027A (en) * 2018-09-17 2019-01-25 北京旷视科技有限公司 Page control method, device and electronic equipment
CN109271027B (en) * 2018-09-17 2022-10-14 北京旷视科技有限公司 Page control method and device and electronic equipment
CN110248254A (en) * 2019-06-11 2019-09-17 Oppo广东移动通信有限公司 Display control method and Related product
CN113269044A (en) * 2021-04-27 2021-08-17 青岛小鸟看看科技有限公司 Display control method and device of head-mounted display equipment and head-mounted display equipment

Also Published As

Publication number Publication date
CN105824422B (en) 2019-07-26

Similar Documents

Publication Publication Date Title
CN105824422A (en) Information processing method and electronic equipment
KR102537210B1 (en) Providing Method For Video Contents and Electronic device supporting the same
CN107885533B (en) Method and device for managing component codes
US10003785B2 (en) Method and apparatus for generating images
KR102042461B1 (en) Mobile terminal and method for controlling of the same
CN104869304B (en) Method for displaying focusing and electronic equipment applying same
KR20180008221A (en) Method and device for acquiring image and recordimg medium thereof
CN103729120A (en) Method for generating thumbnail image and electronic device thereof
CN104123093A (en) Information processing method and device
CN114240882A (en) Defect detection method and device, electronic equipment and storage medium
CN112887618B (en) Video shooting method and device
US9560272B2 (en) Electronic device and method for image data processing
US10409478B2 (en) Method, apparatus, and recording medium for scrapping content
CN113936699B (en) Audio processing method, device, equipment and storage medium
KR20180010493A (en) Electronic device and method for editing video thereof
KR20160035865A (en) Apparatus and method for identifying an object
JP7387002B2 (en) Positioning methods, devices, electronic devices, storage media, programs and products
KR20150020865A (en) Method and apparatus for processing a input of electronic device
CN103500122A (en) Multimedia file playing method and electronic equipment
EP2888716B1 (en) Target object angle determination using multiple cameras
CN105430250A (en) Mobile terminal and method of controlling the same
CN103500234A (en) Method for downloading multi-media files and electronic equipment
CN115702443A (en) Applying stored digital makeup enhancements to recognized faces in digital images
CN103870146A (en) Information processing method and electronic equipment
CN110688046B (en) Song playing method and device and storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant