CN104714728A - Display method and device - Google Patents

Display method and device

Info

Publication number
CN104714728A
CN104714728A · CN104714728B · Application CN201310062955.5A
Authority
CN
China
Prior art keywords
operand
distance
display
user
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310062955.5A
Other languages
Chinese (zh)
Other versions
CN104714728B (en)
Inventor
过晓冰
向梅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201310062955.5A priority Critical patent/CN104714728B/en
Publication of CN104714728A publication Critical patent/CN104714728A/en
Application granted granted Critical
Publication of CN104714728B publication Critical patent/CN104714728B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

An embodiment of the invention provides a display method and device, relating to the field of electronic device applications, and solves the problem that a user cannot directly trigger a control by touch because the control on the operation interface is far from the user's operating position. The method comprises: displaying at least one operation object, where each operation object, once triggered, displays the running result of the application corresponding to it; detecting a first operation of a user; determining a first operation object from the operation objects according to the first operation, the first operation object being in a first display state; generating a display adjustment instruction; and changing display parameters of the first operation object according to the display adjustment instruction so that the first operation object switches from the first display state to a second display state. The display method and device are used to change the display state of an operation object in the display plane of an electronic device.

Description

Display method and device
Technical field
The present invention relates to the field of electronic device applications, and in particular to a display method and device.
Background technology
When a user operates a three-dimensional (3D) interface, a control on the interface usually has to be touched before it can be triggered. Because a 3D interface spans three spatial dimensions (X, Y, Z), when the control the user needs to touch is far away, the user has to actively reach out to touch it. Controls that are far from the user therefore make operation inconvenient.
The inventors found at least the following problem in the prior art: when a user operates a 3D interface, some controls are too far from the user to be reached directly, so the user cannot trigger such a control by touch.
Summary of the invention
Embodiments of the invention provide a display method and device, which solve the problem that a control on the operation interface is so far from the user's operating position that the user cannot trigger it directly by touch.
To achieve the above object, embodiments of the invention adopt the following technical solutions:
In a first aspect, a display method applied to an electronic device is provided, the method comprising:
displaying at least one operation object, where the operation object, once triggered, displays the running result of the application corresponding to the operation object;
detecting a first operation of a user;
determining a first operation object from the at least one operation object according to the first operation, where the first operation object is in a first display state;
generating a display adjustment instruction;
changing display parameters of the first operation object according to the display adjustment instruction, so that the first operation object switches from the first display state to a second display state;
wherein, when the first operation object is in the first display state, the user perceives the first operation object as located at a first spatial position, and when the first operation object is in the second display state, the user perceives the first operation object as located at a second spatial position, the first spatial position being different from the second spatial position.
In a first possible implementation, with reference to the first aspect, determining the first operation object from the at least one operation object according to the first operation comprises:
capturing an image of the user's first operation, analysing the image to determine the pointing direction of the first operation, and determining the operation object on the extension line of that pointing direction as the selected first operation object;
or,
capturing an image of the user's first operation and an image of the user's head, analysing the head image and the image of the first operation, determining the pointing direction of the extension line from the head to the direction of the first operation, and determining the operation object on that extension line as the first operation object selected by the first operation;
or,
capturing an image of the user's first operation, and determining as the first operation object the operation object whose distance to the projection point of the first operation's image on the electronic device is less than a preset threshold.
In a second possible implementation, with reference to the first aspect, generating the display adjustment instruction comprises:
capturing the spatial position of the operating body with which the user makes the first operation, and determining that spatial position of the operating body as a second spatial position;
generating the display adjustment instruction from the determined second spatial position, where the display state parameters include a display position parameter corresponding to the second spatial position.
In a third possible implementation, with reference to the second possible implementation, determining the spatial position of the operating body with which the user makes the first operation as the second spatial position comprises:
determining a first operating position of the first operation;
determining the second spatial position according to the first operating position, where the distance between the second spatial position and the first operating position is less than a threshold.
In a fourth possible implementation, with reference to the first aspect or any possible implementation of the first aspect, changing the display parameters of the first operation object according to the display adjustment instruction comprises:
according to the second spatial position corresponding to the second display state, providing for the first operation object a spatial position parameter that changes the first spatial position corresponding to the first display state;
or,
moving the first operation object, according to a preset position-change parameter, from the first spatial position corresponding to its current first display state to the second spatial position corresponding to the second display state.
In a fifth possible implementation, with reference to the first aspect, the method further comprises:
detecting a second operation of the user, and judging whether the second operation matches the operation action corresponding to a preset trigger instruction; if so, determining that the second operation is the operation action corresponding to that trigger instruction.
In a sixth possible implementation, with reference to the first aspect or any possible implementation of the first aspect, the first spatial position being different from the second spatial position comprises:
the first operation corresponds to a first operating position; the distance between the first operating position and the first spatial position is a first distance, and the distance between the first operating position and the second spatial position is a second distance, where the first distance is greater than the second distance.
In a seventh possible implementation, with reference to the first aspect, the user perceiving the first operation object as located at the second spatial position, and the first spatial position being different from the second spatial position, comprises:
the second spatial position lies in a second space plane; the second space plane is a space plane parallel to the display plane in which the first operation object lies; the display plane is the plane of the display unit of the electronic device; there is a third distance between the second space plane and the user and a fourth distance between the display plane and the user, where the third distance is less than the fourth distance.
In an eighth possible implementation, with reference to the seventh possible implementation, the display plane being the plane of the display unit of the electronic device, there being a third distance between the second space plane and the user and a fourth distance between the display plane and the user, and the third distance being less than the fourth distance, comprises:
the first spatial position lies in a first space plane; the first space plane is a space plane parallel to the display plane in which the first operation object lies; the first space plane is at a fifth distance from the user, the fifth distance being less than the fourth distance and greater than the third distance.
In a second aspect, an electronic device applying the display method is provided, comprising:
a display unit, configured to display at least one operation object, where the operation object, once triggered, displays the running result of the application corresponding to the operation object;
a detecting unit, configured to detect a first operation of a user;
a selection unit, configured to determine a first operation object from the at least one operation object according to the first operation provided by the detecting unit, where the first operation object is in a first display state;
a setting unit, configured to generate a display adjustment instruction;
an adjustment unit, configured to change display parameters of the first operation object according to the display adjustment instruction generated by the setting unit, so that the first operation object switches from the first display state to a second display state;
wherein, when the first operation object is in the first display state, the user perceives the first operation object as located at a first spatial position, and when the first operation object is in the second display state, the user perceives the first operation object as located at a second spatial position, the first spatial position being different from the second spatial position.
In a first possible implementation, with reference to the second aspect, the selection unit is specifically configured to:
capture an image of the user's first operation, analyse the image to determine the pointing direction of the first operation, and determine the operation object on the extension line of that pointing direction as the selected first operation object;
or,
capture an image of the user's first operation and an image of the user's head, analyse the head image and the image of the first operation, determine the pointing direction of the extension line from the head to the direction of the first operation, and determine the operation object on that extension line as the first operation object selected by the first operation;
or,
capture an image of the user's first operation, and determine as the first operation object the operation object whose distance to the projection point of the first operation's image on the electronic device is less than a preset threshold.
In a second possible implementation, with reference to the second aspect, the setting unit comprises:
a collection subunit, configured to capture the spatial position of the operating body with which the user makes the first operation and determine that spatial position of the operating body as a second spatial position;
a setting subunit, configured to generate the display adjustment instruction from the determined second spatial position, where the display state parameters include a display position parameter corresponding to the second spatial position.
In a third possible implementation, with reference to the second possible implementation, the collection subunit is specifically configured to:
determine a first operating position of the first operation;
determine the second spatial position according to the first operating position, where the distance between the second spatial position and the first operating position is less than a threshold.
In a fourth possible implementation, with reference to the second aspect or any possible implementation of the second aspect, the adjustment unit is specifically configured to:
according to the second spatial position corresponding to the second display state, provide for the first operation object a spatial position parameter that changes the first spatial position corresponding to the first display state;
or,
move the first operation object, according to a preset position-change parameter, from the first spatial position corresponding to its current first display state to the second spatial position corresponding to the second display state.
In a fifth possible implementation, with reference to the second aspect, the device further comprises:
a trigger unit, configured to detect a second operation of the user and judge whether the second operation matches the operation action corresponding to a preset trigger instruction; if so, the second operation is determined to be the operation action corresponding to that trigger instruction.
In a sixth possible implementation, with reference to the second aspect or any possible implementation of the second aspect, the first spatial position being different from the second spatial position comprises:
the first operation corresponds to a first operating position; the distance between the first operating position and the first spatial position is a first distance, and the distance between the first operating position and the second spatial position is a second distance, where the first distance is greater than the second distance.
In a seventh possible implementation, with reference to the second aspect, the user perceiving the first operation object as located at the second spatial position, and the first spatial position being different from the second spatial position, comprises:
the second spatial position lies in a second space plane; the second space plane is a space plane parallel to the display plane in which the first operation object lies; the display plane is the plane of the display unit of the electronic device; there is a third distance between the second space plane and the user and a fourth distance between the display plane and the user, where the third distance is less than the fourth distance.
In an eighth possible implementation, with reference to the seventh possible implementation, the display plane being the plane of the display unit of the electronic device, there being a third distance between the second space plane and the user and a fourth distance between the display plane and the user, and the third distance being less than the fourth distance, comprises:
the first spatial position lies in a first space plane; the first space plane is a space plane parallel to the display plane in which the first operation object lies; the first space plane is at a fifth distance from the user, the fifth distance being less than the fourth distance and greater than the third distance.
With the display method and device provided by the embodiments of the invention, the electronic device changes the display state of a displayed operation object: it selects an operation object by detecting the user's operation action, determines a second spatial position, switches the selected first operation object from the first display state to the second display state, and thereby changes the spatial position of the first operation object. This solves the problem that a control on the operation interface is so far from the user's operating position that the user cannot trigger it directly by touch.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the present invention or of the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; a person of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a display method according to an embodiment of the present invention;
Fig. 2 is a schematic flowchart of another display method according to an embodiment of the present invention;
Fig. 3 is a flowchart of a display method according to another embodiment of the present invention;
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of another electronic device according to an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a further electronic device according to an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
An embodiment of the present invention provides a display method. Referring to Fig. 1, the specific flow comprises:
101. The electronic device displays at least one operation object.
Here the operation object, once triggered, displays the running result of the application corresponding to it.
The electronic device displays at least one operation object so that the user can conveniently select one according to actual needs.
102. The electronic device detects a first operation of the user.
Here the electronic device can capture the operation action made by the user through a camera and, from the pointing direction of that action, calculate which operation object the user wants.
103. The electronic device determines a first operation object from the at least one operation object according to the first operation.
Here the first operation object is in a first display state.
The electronic device selects, from the displayed operation objects and according to the detected first operation, the operation object the user wants. The object currently selected by the electronic device is the first operation object; the display state it is currently in is the selected state, and a display state comprises at least one of the size, colour and spatial position of the operation object (see the illustrative data structure after step 105).
104. The electronic device generates a display adjustment instruction.
105. The electronic device changes the display parameters of the first operation object according to the display adjustment instruction, so that the first operation object switches from the first display state to the second display state.
Here, when the first operation object is in the first display state, the user perceives it as located at a first spatial position; when it is in the second display state, the user perceives it as located at a second spatial position; the first spatial position is different from the second spatial position.
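Steps 103 and 105 treat the display state as a bundle of size, colour and perceived spatial position. The following is a minimal illustration only; the names DisplayState and OperationObject are assumptions for the sketch, not taken from the patent.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class DisplayState:
    """Display state of an operation object: size, colour and perceived 3D position."""
    size: float        # rendered scale factor
    color: tuple       # (r, g, b)
    position: tuple    # perceived spatial position (x, y, z)

@dataclass
class OperationObject:
    """An operation object shown on the interface; triggering it runs an application."""
    app_id: str
    state: DisplayState   # current display state (first or second)

    def apply_adjustment(self, new_position):
        """Switch to a second display state at new_position, keeping size and colour."""
        self.state = replace(self.state, position=new_position)
```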
With the display method provided by this embodiment, the electronic device selects an operation object by detecting the user's operation action, determines a second spatial position from that action, and switches the selected first operation object from the first display state to the second display state. In subsequent operation the electronic device thus shortens the distance between the user and the operation object by changing the spatial position of the first operation object. This solves the problem that a control far from the user's operating position cannot be triggered directly by touch, and improves the user experience.
The method is described below with reference to specific embodiments.
On the basis of the embodiment of Fig. 1, and with reference to Fig. 2, an embodiment of the present invention provides a display method in which the electronic device, according to a captured selection action of the user, selects one of the operation objects it displays and, by changing the display state of that object, moves it from the first spatial position to the second spatial position, i.e. to the user's trigger position. In short, the electronic device moves the first operation object, by changing its display state, to a spatial position whose distance to the user is less than a preset threshold. Fig. 2 shows the process of changing the display state of the first operation object; the specific steps are as follows:
201. The electronic device displays at least one operation object.
Here the operation object, once triggered, displays the running result of the application corresponding to it.
The electronic device displays at least one operation object so that the user can conveniently select one according to actual needs.
202. The electronic device detects a first operation of the user.
Here the electronic device captures the operation action made by the user through a camera, so as to calculate from the pointing direction of that action which operation object the user wants.
203. The electronic device determines a first operation object from the at least one operation object according to the first operation.
Here the first operation object is in a first display state.
The electronic device selects, from the displayed operation objects and according to the detected first operation, the operation object the user wants. The object currently selected by the electronic device is the first operation object; the display state it is currently in is the selected state, and a display state comprises at least one of the size, colour and spatial position of the operation object.
Specifically, the electronic device selects the first operation object by detecting the first operation in one of the following ways; a rough code sketch of ways a and c is given after the alternatives:
a. The electronic device captures an image of the user's first operation, analyses the image to determine the pointing direction of the first operation, and determines the operation object on the extension line of that pointing direction as the selected first operation object.
Here the electronic device can capture the user's behaviour through a camera. When the user's finger points towards the display plane of the electronic device, the camera captures this pointing action, and the electronic device calculates where the extension line of the finger's pointing direction meets the display plane. If an operation object lies at that position, it is determined to be the first operation object.
Or,
b. The electronic device captures an image of the user's first operation and an image of the user's head, analyses the head image and the image of the first operation, determines the pointing direction of the extension line from the head to the direction of the first operation, and determines the operation object on that extension line as the first operation object selected by the first operation.
Here the electronic device can capture the user's behaviour through a camera. It can calculate the focus point of the user's gaze from the distance between the user's eyes and the display plane, and also capture the user's limb action (for example, the direction of a pointing finger). Combining the gaze focus point with the finger direction, the electronic device determines the extension line from the gaze focus point through the finger, and the operation object at the place where this line meets the display screen is determined to be the first operation object.
Or,
c. The electronic device captures an image of the user's first operation, and determines as the first operation object the operation object whose distance to the projection point of the first operation's image on the electronic device is less than a preset threshold.
As in ways a and b, the electronic device can capture the user's behaviour through a camera and determine the first operation object from the projection point of the user's finger on the display plane of the electronic device. Specifically, from a clicking action of the user's finger the electronic device obtains the projection point of the finger on the display plane, and the operation object whose distance to that projection point is less than the preset threshold is the first operation object the user wants to select.
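As a rough geometric sketch of ways a and c, assuming the camera already yields a fingertip position and pointing direction in the same coordinate frame as the display plane; all function and field names below are illustrative, not from the patent.

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Way a: intersect the finger's pointing ray with the display plane.
    Returns the hit point, or None if the ray is parallel to or points away from the plane."""
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        return None
    t = np.dot(plane_normal, plane_point - origin) / denom
    if t < 0:   # the plane lies behind the fingertip
        return None
    return origin + t * direction

def select_first_object(objects, hit_point, threshold):
    """Ways a/c: pick the object whose on-screen position is closest to the hit (or
    projection) point, provided that distance is below the preset threshold."""
    best, best_d = None, threshold
    for obj in objects:
        d = np.linalg.norm(np.asarray(obj["screen_pos"]) - hit_point)
        if d < best_d:
            best, best_d = obj, d
    return best

# Illustrative call: fingertip 0.4 m in front of the screen, pointing straight at it.
hit = ray_plane_intersection(np.array([0.1, 0.2, 0.4]), np.array([0.0, 0.0, -1.0]),
                             np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]))
```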
204. The electronic device generates a display adjustment instruction.
Generating the display adjustment instruction specifically comprises:
a. The electronic device captures the spatial position of the operating body with which the user makes the first operation, and determines that spatial position of the operating body as the second spatial position.
Here the electronic device can capture the spatial position of the user's operating body through the camera. The operating body is the part of the user's body that makes the selection action and the trigger action, for example a finger or the eyes. Taking a finger as the operating body: the electronic device captures the finger's action (for example a tap or a pinch) through the camera, determines from the captured finger position that the spatial position currently occupied by the finger is the second spatial position, generates the display adjustment instruction from that second spatial position, and moves the first operation object from the first spatial position to the second spatial position where the finger is.
b. The electronic device generates the display adjustment instruction from the determined second spatial position.
Here the display state parameters include a display position parameter corresponding to the second spatial position.
From the first spatial position of the first operation object and the second spatial position of the user's operating body captured by the camera, the electronic device calculates the position difference between the first spatial position and the second spatial position, or obtains the absolute position of the second spatial position, and then generates the display adjustment instruction from the second spatial position.
Further, following step a, the electronic device determines the second spatial position as follows:
a1. The electronic device determines the first operating position of the first operation.
Here the electronic device can capture, through the camera, the spatial position at which the user's operating body makes the first operation. Taking a finger as the operating body: the electronic device captures the action of the user's finger and the spatial position of the finger, and determines that spatial position as the first operating position, i.e. the spatial position at which the user makes the action of selecting the first operation object displayed by the electronic device.
a2. The electronic device determines the second spatial position according to the first operating position.
Here the distance between the second spatial position and the first operating position is less than a threshold.
The second spatial position is the spatial position to which the first operation object is moved, by the display adjustment instruction, from the first spatial position of the first display state when it switches to the second display state. The second spatial position is determined from the spatial position of the user's operating body and lies in a preset direction from the first operating position, the preset direction being towards the display unit; the second spatial position may also coincide with the first operating position (a rough code sketch of steps a1 and a2 follows below).
The embodiments of the present invention use a camera to capture the operation action and spatial position of the user's operating body as an example, but the capture is not limited to a camera.
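A minimal sketch of steps a1 and a2, assuming the operating body's position is available as a 3D point and the second spatial position is nudged from it towards the display plane by at most the threshold; the function names, dictionary fields and the default values are assumptions.

```python
import numpy as np

def second_spatial_position(first_operating_position, display_plane_z=0.0, max_offset=0.05):
    """Step a2: place the second spatial position at, or just in front of, the operating
    body, keeping its distance to the first operating position below max_offset."""
    pos = np.asarray(first_operating_position, dtype=float)
    towards_display = np.array([0.0, 0.0, display_plane_z - pos[2]])
    dist = np.linalg.norm(towards_display)
    if dist > 0:
        pos = pos + towards_display / dist * min(max_offset, dist)
    return pos

def make_adjustment_instruction(target_object, first_operating_position):
    """Step b: bundle the computed second spatial position into a display adjustment
    instruction carrying the display position parameter for the second display state."""
    return {
        "object": target_object,
        "target_position": tuple(second_spatial_position(first_operating_position)),
    }
```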
205. The electronic device changes the display parameters of the first operation object according to the display adjustment instruction, so that the first operation object switches from the first display state to the second display state.
Here, when the first operation object is in the first display state, the user perceives it as located at the first spatial position; when it is in the second display state, the user perceives it as located at the second spatial position; the first spatial position is different from the second spatial position.
Optionally, changing the display parameters of the first operation object according to the display adjustment instruction specifically comprises one of the following (a rough sketch of both options follows):
a. According to the second spatial position corresponding to the second display state, the electronic device provides for the first operation object a spatial position parameter that changes the first spatial position corresponding to the first display state.
Here the electronic device calculates the second spatial position parameter from the captured image of the user's operating body and so determines the second spatial position. After the display adjustment instruction has been generated, the first operation object switches from the first display state to the second display state: its first spatial position parameter, corresponding to the first display state, is replaced according to the calculated second spatial position parameter, so that the first operation object moves to the second spatial position.
Or,
b. According to a preset position-change parameter, the electronic device moves the first operation object from the first spatial position corresponding to its current first display state to the second spatial position corresponding to the second display state.
Here the electronic device calculates the second spatial position parameter from the captured image of the user's operating body and so determines the second spatial position. After the display adjustment instruction has been generated, the spatial change applied to the first operation object is derived from the calculated second spatial position parameter combined with the preset position-change parameter, so that the first operation object moves to the second spatial position according to that change parameter.
Changing the first display state may also include changing at least one of the size and colour of the first operation object.
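A minimal sketch of the two options for applying the adjustment, assuming positions are 3D vectors; the function names and the preset step value are assumptions, not from the patent.

```python
import numpy as np

def apply_absolute(obj_position, second_position):
    """Option a: replace the first spatial position with the second spatial position."""
    return np.asarray(second_position, dtype=float)

def apply_preset_delta(obj_position, second_position, preset_step=0.02):
    """Option b: move from the first spatial position towards the second in preset
    increments, e.g. to animate the transition between the two display states."""
    current = np.asarray(obj_position, dtype=float)
    target = np.asarray(second_position, dtype=float)
    delta = target - current
    dist = np.linalg.norm(delta)
    if dist <= preset_step:
        return target
    return current + delta / dist * preset_step
```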
Optionally, the first operation corresponds to a first operating position; the distance between the first operating position and the first spatial position is a first distance, and the distance between the first operating position and the second spatial position is a second distance, where the first distance is greater than the second distance.
A first preset threshold exists between the first operating position of the user's operating body and the second spatial position; it is the preset spatial distance range between the first operating position and the second spatial position. Since in practice the first operating position of the operating body may even coincide with the second spatial position, the second distance between the first operating position and the second spatial position is less than this first preset threshold.
Optionally, the second spatial position lies in a second space plane, which is a space plane parallel to the display plane in which the first operation object lies; the display plane is the plane of the display unit of the electronic device; there is a third distance between the second space plane and the user and a fourth distance between the display plane and the user, the third distance being less than the fourth distance.
Further, the first spatial position lies in a first space plane, which is a space plane parallel to the display plane in which the first operation object lies; the first space plane is at a fifth distance from the user, the fifth distance being less than the fourth distance and greater than the third distance.
Referring to Fig. 3, in a 3D display environment the first space plane containing the first spatial position differs from the second space plane containing the second spatial position. The electronic device lies in the display plane, and the operation objects it displays lie in the first space plane, which is parallel to the display plane. The distance between the display plane and the space plane containing the first operating position of the user's operating body is greater than the distance between the first space plane and that plane; in other words, the fourth distance between the display plane and the user is greater than the fifth distance between the first space plane and the user. Because the second space plane is determined from the first operating position and in practice may coincide with it, the fifth distance is also greater than the third distance between the second space plane and the user (summarised numerically below).
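With all distances measured from the user along the common normal of the parallel planes, the relations among the third, fourth and fifth distances can be checked numerically; the concrete values below are purely illustrative assumptions.

```python
# Illustrative distances from the user (metres), assuming the three planes are parallel:
d4 = 1.00   # user -> display plane (where the display unit sits)
d5 = 0.70   # user -> first space plane (first spatial position of the object)
d3 = 0.30   # user -> second space plane (second spatial position, near the operating body)
assert d3 < d5 < d4   # the object is perceived to move from d5 towards the user, to d3
```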
The embodiments of the present invention take a 3D display environment as an example, but also apply to a 2D display environment. In a 2D display environment the electronic device lies in the display plane, the at least one operation object displayed by the electronic device lies on the display plane, the display plane is then the first plane, and the operating position of the user's operating body may likewise serve as the second spatial position.
206. The electronic device detects a second operation of the user and judges whether the second operation matches the operation action corresponding to a preset trigger instruction; if so, the second operation is determined to be the operation action corresponding to that trigger instruction.
Here the electronic device can capture the action of the user's operating body through the camera and match the captured action against the operation action corresponding to the preset trigger instruction. For example, with a finger as the operating body, the electronic device captures the user's finger clicking the first operation object and matches it against the operation action corresponding to the preset trigger instruction; if the click action matches, the trigger instruction is activated and the first operation object is triggered at the second spatial position.
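A minimal sketch of step 206, assuming a captured gesture is reduced to a label plus the position at which it was performed; the dictionary fields, tolerance and callback name are assumptions, not from the patent.

```python
def _distance(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def matches_trigger(observed_action, preset_trigger):
    """Compare the captured second operation with the preset trigger gesture."""
    return (observed_action["gesture"] == preset_trigger["gesture"]
            and _distance(observed_action["position"],
                          preset_trigger["position"]) <= preset_trigger["tolerance"])

def handle_second_operation(observed_action, preset_trigger, first_object):
    """If the second operation matches the preset trigger, trigger the first operation
    object at its second spatial position (i.e. run the associated application)."""
    if matches_trigger(observed_action, preset_trigger):
        first_object["on_trigger"]()   # e.g. show the application's running result
        return True
    return False
```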
With the display method provided by this embodiment, the electronic device selects the first operation object from the displayed operation objects by capturing the action of the user's operating body, and determines, from the captured spatial position of the operating body, the spatial position at which the first operation object switches from the first display state to the second display state. In subsequent operation the electronic device thus shortens the distance between the user and the operation object by changing the spatial position of the first operation object. This solves the problem that a control far from the user's operating position cannot be triggered directly by touch, and improves the user experience.
The present invention further provides an electronic device 3. The electronic device may be any intelligent terminal in the field of electronic smart displays, such as a mobile phone or a tablet computer, i.e. any device in that field capable of two-dimensional or three-dimensional display; the embodiments of the present invention do not limit the specific form of the electronic device, as long as it can implement any of the display methods described above. Referring to Fig. 4, the electronic device comprises a display unit 31, a detecting unit 32, a selection unit 33, a setting unit 34 and an adjustment unit 35 (a skeleton sketch of how these units fit together is given after the list), wherein:
the display unit 31 is configured to display at least one operation object, where the operation object, once triggered, displays the running result of the application corresponding to it;
the detecting unit 32 is configured to detect a first operation of a user;
the selection unit 33 is configured to determine a first operation object from the at least one operation object according to the first operation provided by the detecting unit, where the first operation object is in a first display state;
the setting unit 34 is configured to generate a display adjustment instruction;
the adjustment unit 35 is configured to change display parameters of the first operation object according to the display adjustment instruction generated by the setting unit, so that the first operation object switches from the first display state to the second display state;
wherein, when the first operation object is in the first display state, the user perceives it as located at a first spatial position, and when it is in the second display state, the user perceives it as located at a second spatial position, the first spatial position being different from the second spatial position.
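The cooperation of the units in Fig. 4 can be pictured as a single pipeline. The skeleton below is only an illustration; the class name DisplayDevice and the unit method names (show, detect, choose, generate, apply) are assumptions, not taken from the patent.

```python
class DisplayDevice:
    """Skeleton of the device in Fig. 4: display, detecting, selection, setting and
    adjustment units wired into one selection-and-adjustment flow."""

    def __init__(self, display_unit, detecting_unit, selection_unit, setting_unit, adjustment_unit):
        self.display_unit = display_unit        # shows the operation objects
        self.detecting_unit = detecting_unit    # detects the user's first operation
        self.selection_unit = selection_unit    # picks the first operation object
        self.setting_unit = setting_unit        # generates the display adjustment instruction
        self.adjustment_unit = adjustment_unit  # changes the object's display parameters

    def run_once(self, operation_objects):
        self.display_unit.show(operation_objects)
        first_operation = self.detecting_unit.detect()
        first_object = self.selection_unit.choose(operation_objects, first_operation)
        instruction = self.setting_unit.generate(first_operation)
        self.adjustment_unit.apply(first_object, instruction)   # first -> second display state
        return first_object
```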
With the electronic device provided by this embodiment, the device selects an operation object by detecting the user's operation action, determines a second spatial position from that action, and switches the selected first operation object from the first display state to the second display state. In subsequent operation the device thus shortens the distance between the user and the operation object by changing the spatial position of the first operation object. This solves the problem that a control far from the user's operating position cannot be triggered directly by touch, and improves the user experience.
Optionally, the selection unit 33 is specifically configured to:
capture an image of the user's first operation, analyse the image to determine the pointing direction of the first operation, and determine the operation object on the extension line of that pointing direction as the selected first operation object;
or,
capture an image of the user's first operation and an image of the user's head, analyse the head image and the image of the first operation, determine the pointing direction of the extension line from the head to the direction of the first operation, and determine the operation object on that extension line as the first operation object selected by the first operation;
or,
capture an image of the user's first operation, and determine as the first operation object the operation object whose distance to the projection point of the first operation's image on the electronic device is less than a preset threshold.
Optionally, referring to Fig. 5, the setting unit 34 comprises:
a collection subunit 341, configured to capture the spatial position of the operating body with which the user makes the first operation and determine that spatial position of the operating body as a second spatial position;
a setting subunit 342, configured to generate the display adjustment instruction from the determined second spatial position, where the display state parameters include a display position parameter corresponding to the second spatial position.
Further, the collection subunit 341 is specifically configured to:
determine a first operating position of the first operation;
determine the second spatial position according to the first operating position, where the distance between the second spatial position and the first operating position is less than a threshold.
Optionally, the adjustment unit 35 is specifically configured to:
according to the second spatial position corresponding to the second display state, provide for the first operation object a spatial position parameter that changes the first spatial position corresponding to the first display state;
or,
move the first operation object, according to a preset position-change parameter, from the first spatial position corresponding to its current first display state to the second spatial position corresponding to the second display state.
Optionally, referring to Fig. 6, the electronic device 3 further comprises:
a trigger unit 36, configured to detect a second operation of the user and judge whether the second operation matches the operation action corresponding to a preset trigger instruction; if so, the second operation is determined to be the operation action corresponding to that trigger instruction.
Optionally, the first operation corresponds to a first operating position; the distance between the first operating position and the first spatial position is a first distance, and the distance between the first operating position and the second spatial position is a second distance, where the first distance is greater than the second distance.
Optionally, the second spatial position lies in a second space plane, which is a space plane parallel to the display plane in which the first operation object lies; the display plane is the plane of the display unit of the electronic device; there is a third distance between the second space plane and the user and a fourth distance between the display plane and the user, the third distance being less than the fourth distance.
Further, optionally, the first spatial position lies in a first space plane, which is a space plane parallel to the display plane in which the first operation object lies; the first space plane is at a fifth distance from the user, the fifth distance being less than the fourth distance and greater than the third distance.
With the electronic device provided by this embodiment, the device selects the first operation object from the displayed operation objects by capturing the action of the user's operating body, and determines, from the captured spatial position of the operating body, the spatial position at which the first operation object switches from the first display state to the second display state. In subsequent operation the device thus shortens the distance between the user and the operation object by changing the spatial position of the first operation object. This solves the problem that a control far from the user's operating position cannot be triggered directly by touch, and improves the user experience.
A person of ordinary skill in the art will understand that all or part of the steps of the above method embodiments can be carried out by hardware under the control of program instructions. The program may be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments; the storage medium includes any medium that can store program code, such as a ROM, a RAM, a magnetic disk or an optical disc.
The above are only specific embodiments of the present invention, and the protection scope of the present invention is not limited thereto. Any change or replacement readily conceivable by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be defined by the claims.

Claims (18)

1. A display method applied to an electronic device, the method comprising:
displaying at least one operation object, where the operation object, once triggered, displays the running result of the application corresponding to the operation object;
detecting a first operation of a user;
determining a first operation object from the at least one operation object according to the first operation, where the first operation object is in a first display state;
generating a display adjustment instruction;
changing display parameters of the first operation object according to the display adjustment instruction, so that the first operation object switches from the first display state to a second display state;
wherein, when the first operation object is in the first display state, the user perceives the first operation object as located at a first spatial position, and when the first operation object is in the second display state, the user perceives the first operation object as located at a second spatial position, the first spatial position being different from the second spatial position.
2. The method according to claim 1, wherein determining the first operation object from the at least one operation object according to the first operation comprises:
capturing an image of the user's first operation, analysing the image to determine the pointing direction of the first operation, and determining the operation object on the extension line of that pointing direction as the selected first operation object;
or,
capturing an image of the user's first operation and an image of the user's head, analysing the head image and the image of the first operation, determining the pointing direction of the extension line from the head to the direction of the first operation, and determining the operation object on that extension line as the first operation object selected by the first operation;
or,
capturing an image of the user's first operation, and determining as the first operation object the operation object whose distance to the projection point of the first operation's image on the electronic device is less than a preset threshold.
3. The method according to claim 1, wherein generating the display adjustment instruction comprises:
capturing the spatial position of the operating body with which the user makes the first operation, and determining that spatial position of the operating body as a second spatial position;
generating the display adjustment instruction from the determined second spatial position, where the display state parameters include a display position parameter corresponding to the second spatial position.
4. The method according to claim 3, wherein determining the spatial position of the operating body with which the user makes the first operation as the second spatial position comprises:
determining a first operating position of the first operation;
determining the second spatial position according to the first operating position, where the distance between the second spatial position and the first operating position is less than a threshold.
5. The method according to any one of claims 1 to 4, wherein changing the display parameters of the first operation object according to the display adjustment instruction comprises:
according to the second spatial position corresponding to the second display state, providing for the first operation object a spatial position parameter that changes the first spatial position corresponding to the first display state;
or,
moving the first operation object, according to a preset position-change parameter, from the first spatial position corresponding to its current first display state to the second spatial position corresponding to the second display state.
6. The method according to claim 1, further comprising:
detecting a second operation of the user, and judging whether the second operation matches the operation action corresponding to a preset trigger instruction; if so, determining that the second operation is the operation action corresponding to that trigger instruction.
7. The method according to any one of claims 1 to 6, wherein the first spatial position being different from the second spatial position comprises:
the first operation corresponds to a first operating position; the distance between the first operating position and the first spatial position is a first distance, and the distance between the first operating position and the second spatial position is a second distance, where the first distance is greater than the second distance.
8. The method according to claim 1, wherein the user perceiving the first operation object as located at the second spatial position, and the first spatial position being different from the second spatial position, comprises:
the second spatial position lies in a second space plane; the second space plane is a space plane parallel to the display plane in which the first operation object lies; the display plane is the plane of the display unit of the electronic device; there is a third distance between the second space plane and the user and a fourth distance between the display plane and the user, where the third distance is less than the fourth distance.
9. The method according to claim 8, wherein the display plane being the plane of the display unit of the electronic device, there being a third distance between the second space plane and the user and a fourth distance between the display plane and the user, and the third distance being less than the fourth distance, comprises:
the first spatial position lies in a first space plane; the first space plane is a space plane parallel to the display plane in which the first operation object lies; the first space plane is at a fifth distance from the user, the fifth distance being less than the fourth distance and greater than the third distance.
10. an electronic equipment, is applied to a kind of display packing, it is characterized in that, comprising:
Display unit, for showing at least one operand, wherein, the operation result of the application that display is corresponding with described operand after described operand can be triggered;
Detecting unit, for detecting first operation of user;
Choose unit, from least one operand described, determine the first operand for described first operation provided according to described detecting unit, wherein, described first operand is in the first display state;
Setting unit, for generating vision-control instruction;
Regulon, the described vision-control instruction for generating according to described setting unit changes the display parameter of described first operand, makes described first operand switch to the second display state from the first display state;
Wherein, when described first operand is in described first display state, described user can perceive described first operand and be positioned at the first locus, when described first operand is in described second display state, described user can perceive described first operand and be positioned at second space position, and described first locus is different with described second space position.
11. equipment according to claim 10, is characterized in that, described in choose unit specifically for:
Gather the image of described first operation of described user, analyze the sensing that described image determines described first operation, and the operand corresponding according to described sensing extended line is defined as described first operand chosen;
Or,
Gather image and the head image of described first operation of described user, analyze described head image and the described first image operated, and determine the sensing from described head to the extended line direction of described first direction of operating, the operand at the extended line place of described sensing is defined as described first operand by described first operation;
Or,
Gather the described first application drawing picture of described user, analyze and the operand being less than predetermined threshold value to the subpoint distance on described electronic equipment with the described first image projection operated is defined as described first operand.
12. equipment according to claim 10, is characterized in that, described setting unit comprises:
Gathering subelement, determining that the locus of described operating body is second space position for the locus made corresponding to the described first operating body operated by gathering described user;
Arrange subelement, for by determining that described second space position generates described vision-control instruction, wherein, described display state parameter comprises the display position parameter corresponding with described second space position.
13. equipment according to claim 12, is characterized in that, described collection subelement specifically for:
Determine the first operating position of described first operation;
Determine described second space position according to described first operating position, wherein, the distance of described second space position and described first operating position is less than threshold value.
14. equipment according to any one of claim 10 ~ 13, is characterized in that, described regulon specifically for:
The space position parameter of described first locus that the described second space position corresponding according to described second display state provides the described first display state of change corresponding for described first operand;
Or,
According to predeterminated position variable parameter, described first operand is moved to described second space position corresponding to described second display state by described first locus that residing described first display state is corresponding.
15. equipment according to claim 10, is characterized in that, described equipment also comprises:
Trigger element, for detecting second operation of described user, judges that whether corresponding with the triggering command preset described second operation operational motion match, and if so, then determines the described second described operational motion being operating as corresponding described triggering command.
16. The device according to any one of claims 10 to 15, wherein the first spatial position being different from the second spatial position comprises:
the first operation corresponds to a first operating position, the distance between the first operating position and the first spatial position is a first distance, the distance between the first operating position and the second spatial position is a second distance, and the first distance is greater than the second distance.
17. The device according to claim 10, wherein the user perceiving the first operation object as located at the second spatial position, with the first spatial position being different from the second spatial position, comprises:
the second spatial position lies in a second space plane, the second space plane is parallel to the display plane in which the first operation object is located, the display plane is the plane of the display unit of the electronic device, there is a third distance between the second space plane and the user and a fourth distance between the display plane and the user, and the third distance is less than the fourth distance.
18. The device according to claim 17, wherein the display plane being the plane of the display unit of the electronic device, with a third distance between the second space plane and the user and a fourth distance between the display plane and the user, the third distance being less than the fourth distance, comprises:
the first spatial position lies in a first space plane, the first space plane is parallel to the display plane in which the first operation object is located, there is a fifth distance between the first space plane and the user, the fifth distance is less than the fourth distance, and the fifth distance is greater than the third distance.
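The distance relations of claims 17 and 18 can be summarized numerically: the second space plane (third distance) is closer to the user than the first space plane (fifth distance), which in turn is closer than the display plane (fourth distance). The values below are made up solely to illustrate this ordering.

d3 = 0.25   # user -> second space plane (second display state), metres
d5 = 0.45   # user -> first space plane (first display state)
d4 = 0.70   # user -> display plane of the electronic device

assert d3 < d5 < d4, "object should appear to move off the display plane, toward the user"
print(f"perceived approach toward the user: {d5 - d3:.2f} m")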
CN201310062955.5A 2013-02-28 2013-02-28 Display method and device Active CN104714728B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310062955.5A CN104714728B (en) Display method and device

Publications (2)

Publication Number Publication Date
CN104714728A true CN104714728A (en) 2015-06-17
CN104714728B CN104714728B (en) 2018-10-12

Family

ID=53414122

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310062955.5A Active CN104714728B (en) Display method and device

Country Status (1)

Country Link
CN (1) CN104714728B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108182019A (en) * 2018-01-16 2018-06-19 维沃移动通信有限公司 A kind of suspension control display processing method and mobile terminal
CN110418059A (en) * 2019-07-30 2019-11-05 联想(北京)有限公司 Applied to the image processing method of electronic equipment, device, electronic equipment, medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1380996A (en) * 2000-05-17 2002-11-20 皇家菲利浦电子有限公司 Apparatus and method for indicating target by image processing without three-dimensional modeling
US20080005703A1 (en) * 2006-06-28 2008-01-03 Nokia Corporation Apparatus, Methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications
CN102270037A (en) * 2010-06-04 2011-12-07 宏碁股份有限公司 Manual human machine interface operation system and method thereof
CN102385438A (en) * 2010-08-31 2012-03-21 索尼公司 Information processing device, information processing method, and program
CN102402379A (en) * 2010-09-14 2012-04-04 Lg电子株式会社 Mobile terminal and controlling method thereof
CN102592569A (en) * 2011-01-10 2012-07-18 联想(北京)有限公司 Electronic equipment and display method
CN102693063A (en) * 2011-03-23 2012-09-26 联想(北京)有限公司 Operation control method and device and electronic equipment

Also Published As

Publication number Publication date
CN104714728B (en) 2018-10-12

Similar Documents

Publication Publication Date Title
US11269481B2 (en) Dynamic user interactions for display control and measuring degree of completeness of user gestures
US8866781B2 (en) Contactless gesture-based control method and apparatus
EP2638461B1 (en) Apparatus and method for user input for controlling displayed information
KR20150014083A (en) Method For Sensing Inputs of Electrical Device And Electrical Device Thereof
CN103929603A (en) Image Projection Device, Image Projection System, And Control Method
CN105138076A (en) Electronic apparatus control method and electronic apparatus
JP2019087284A (en) Interaction method for user interfaces
CN103135929A (en) Method and device for controlling application interface to move and terminal device
CN105468286A (en) Mobile terminal-based status bar operating method and mobile terminal thereof
CN105912101B (en) Projection control method and electronic equipment
CN103809856A (en) Information processing method and first electronic device
CN105183538A (en) Information processing method and electronic device
CN104714728A (en) Display method and device
CN103376884B (en) Man-machine interaction method and its device
CN104484095A (en) Information processing method and electronic device
CN104407698A (en) Projecting method and electronic equipment
CN103885696A (en) Information processing method and electronic device
CN109426424A (en) A kind of operating method of terminal device, device and electronic equipment
CN104423548A (en) Control method and control device
CN109739422B (en) Window control method, device and equipment
CN104951211A (en) Information processing method and electronic equipment
CN106325655B (en) 3D application icon interaction method applied to touch terminal and touch terminal
CN104423560A (en) Information processing method and electronic equipment
CN104866071A (en) Display method and electronic equipment
CN104317486A (en) Information processing method and electronic equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant