CN104714728B - Display method and device - Google Patents

Display method and device

Info

Publication number
CN104714728B
CN104714728B (application CN201310062955.5A)
Authority
CN
China
Prior art keywords
operation object
display
user
distance
spatial position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310062955.5A
Other languages
Chinese (zh)
Other versions
CN104714728A (en)
Inventor
过晓冰
向梅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201310062955.5A priority Critical patent/CN104714728B/en
Publication of CN104714728A publication Critical patent/CN104714728A/en
Application granted granted Critical
Publication of CN104714728B publication Critical patent/CN104714728B/en

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

Embodiments of the present invention provide a display method and device, relating to the field of electronic device applications, and solve the problem that a user cannot trigger a control by touching it directly because the control in the operation interface is far from the user's operating position. The method includes: displaying at least one operation object, where an operation object can be triggered to display the running result of the application corresponding to it; detecting a first operation of the user; determining a first operation object from the at least one operation object according to the first operation, where the first operation object is in a first display state; generating a display adjustment instruction; and changing display parameters of the first operation object according to the display adjustment instruction so that the first operation object switches from the first display state to a second display state. Embodiments of the present invention are used to change the display state of an operation object in the display plane of an electronic device.

Description

Display method and device
Technical Field
The present invention relates to the field of electronic device applications, and in particular, to a display method and device.
Background
When a user operates a three-dimensional (3D) interface, the user usually has to touch a control on the interface to trigger it. Because a 3D interface has three spatial dimensions (X, Y, Z), a control that the user needs to touch may be far away, forcing the user to reach out in order to trigger it. Controls that are far from the user therefore cause inconvenience.
The inventor has found at least the following problem in the prior art: when a user operates a 3D interface, some controls are so far from the user that they cannot be triggered by a direct touch.
Disclosure of Invention
Embodiments of the invention provide a display method and a display device, which solve the problem that a user cannot trigger a control by touching it directly because the control is far from the user's operating position on the operation interface.
In order to achieve the above purpose, the embodiment of the invention adopts the following technical scheme:
in a first aspect, a display method is provided, which is applied to an electronic device, and the method includes:
displaying at least one operation object, wherein the operation object can be triggered to display an operation result of an application corresponding to the operation object;
detecting a first operation of a user;
determining a first operation object from the at least one operation object according to the first operation, wherein the first operation object is in a first display state;
generating a display adjustment instruction;
changing the display parameters of the first operation object according to the display adjustment instruction, so that the first operation object switches from the first display state to a second display state;
when the first operation object is in the first display state, the user can perceive that the first operation object is located at a first spatial position, and when the first operation object is in the second display state, the user can perceive that the first operation object is located at a second spatial position, and the first spatial position and the second spatial position are different.
In a first possible implementation manner, with reference to the first aspect, specifically, the determining, according to the first operation, a first operation object from the at least one operation object includes:
acquiring an image of the first operation of the user, analyzing the image to determine the direction of the first operation, and determining the operation object that lies on the extension line of that direction as the selected first operation object;
or,
acquiring an image of the first operation of the user and an image of the user's head, analyzing the head image and the image of the first operation to determine the direction of the extension line that runs from the head through the first operation, and determining the operation object on that extension line as the first operation object;
or,
acquiring an image of the first operation of the user, and determining, through analysis, the operation object whose distance from the projection point of the first operation on the electronic device is smaller than a preset threshold as the first operation object.
In a second possible implementation manner, with reference to the first aspect, the generating a display adjustment instruction includes:
determining the spatial position of the operation body as the second spatial position by collecting the spatial position corresponding to the operation body with which the user performs the first operation;
generating the display adjustment instruction according to the determined second spatial position, wherein the display parameters include a display position parameter corresponding to the second spatial position.
In a third possible implementation manner, with reference to the second possible implementation manner, the determining, by collecting the spatial position corresponding to the operation body with which the user performs the first operation, that the spatial position of the operation body is the second spatial position specifically includes:
determining a first operation position of the first operation;
determining the second spatial position in dependence on the first operational position, wherein a distance between the second spatial position and the first operational position is less than a threshold.
In a fourth possible implementation manner, with reference to the first aspect or one possible implementation manner of the first aspect, the changing, according to the display adjustment instruction, the display parameter of the first operation object includes:
providing a spatial position parameter for changing the first spatial position corresponding to the first display state for the first operation object according to the second spatial position corresponding to the second display state;
or,
and moving the first operation object from the first space position corresponding to the first display state to the second space position corresponding to the second display state according to a preset position variable parameter.
In a fifth possible implementation manner, with reference to the first aspect, the method further includes:
detecting a second operation of the user, judging whether the second operation is matched with an operation action corresponding to a preset trigger instruction, and if so, determining that the second operation is the operation action corresponding to the trigger instruction.
In a sixth possible implementation manner, with reference to the first aspect or any one of the possible implementation manners of the first aspect, the first spatial position being different from the second spatial position specifically includes:
the first operation corresponds to a first operation position, the distance between the first operation position and the first space position is a first distance, the distance between the first operation position and the second space position is a second distance, and the first distance is larger than the second distance.
In a seventh possible implementation manner, with reference to the first aspect, the user being able to perceive that the first operation object is located at the second spatial position, where the first spatial position and the second spatial position are different, specifically includes:
the second spatial position is located on a second spatial plane, the second spatial plane is a spatial plane parallel to the display plane where the first operation object is located, the display plane is a plane where the display unit of the electronic device is located, a third distance is provided between the second spatial plane and the user, and a fourth distance is provided between the display plane and the user, wherein the third distance is smaller than the fourth distance.
In an eighth possible implementation manner, with reference to the seventh possible implementation manner, the method specifically further includes:
the first spatial position is located on a first spatial plane, the first spatial plane is a spatial plane parallel to the display plane where the first operation object is located, a fifth distance exists between the first spatial plane and the user, the fifth distance is smaller than the fourth distance, and the fifth distance is larger than the third distance.
In a second aspect, an electronic device applied to a display method is provided, including:
the display unit is used for displaying at least one operation object, wherein the operation object can be triggered to display the running result of the application corresponding to the operation object;
a detection unit for detecting a first operation by a user;
the selecting unit is used for determining a first operation object from the at least one operation object according to the first operation provided by the detecting unit, wherein the first operation object is in a first display state;
a setting unit for generating a display adjustment instruction;
the adjusting unit is used for changing the display parameters of the first operation object according to the display adjusting instruction generated by the setting unit so that the first operation object is switched from a first display state to a second display state;
when the first operation object is in the first display state, the user can perceive that the first operation object is located at a first spatial position, and when the first operation object is in the second display state, the user can perceive that the first operation object is located at a second spatial position, and the first spatial position and the second spatial position are different.
In a first possible implementation manner, with reference to the second aspect, the selecting unit is specifically configured to:
acquiring an image of the first operation of the user, analyzing the image to determine the direction of the first operation, and determining the operation object that lies on the extension line of that direction as the selected first operation object;
or,
acquiring an image of the first operation of the user and an image of the user's head, analyzing the head image and the image of the first operation to determine the direction of the extension line that runs from the head through the first operation, and determining the operation object on that extension line as the first operation object;
or,
acquiring an image of the first operation of the user, and determining, through analysis, the operation object whose distance from the projection point of the first operation on the electronic device is smaller than a preset threshold as the first operation object.
In a second possible implementation manner, with reference to the second aspect, the setting unit includes:
the acquisition subunit is configured to determine that the spatial position of the operation body is the second spatial position by collecting the spatial position corresponding to the operation body with which the user performs the first operation;
and the setting subunit is configured to generate the display adjustment instruction according to the determined second spatial position, where the display parameters include a display position parameter corresponding to the second spatial position.
In a third possible implementation manner, with reference to the second possible implementation manner, specifically, the acquisition subunit is specifically configured to:
determining a first operation position of the first operation;
determining the second spatial position in dependence on the first operational position, wherein a distance between the second spatial position and the first operational position is less than a threshold.
In a fourth possible implementation manner, in combination with the second aspect or any one of the possible implementation manners of the second aspect, the adjusting unit is specifically configured to:
providing a spatial position parameter for changing the first spatial position corresponding to the first display state for the first operation object according to the second spatial position corresponding to the second display state;
or,
and moving the first operation object from the first space position corresponding to the first display state to the second space position corresponding to the second display state according to a preset position variable parameter.
In a fifth possible implementation manner, with reference to the second aspect, the apparatus further includes:
and the triggering unit is used for detecting a second operation of the user, judging whether the second operation is matched with an operation action corresponding to a preset triggering instruction, and if so, determining that the second operation is the operation action corresponding to the triggering instruction.
In a sixth possible implementation manner, with reference to the second aspect or any one of the possible implementation manners of the second aspect, the first spatial position being different from the second spatial position specifically includes:
the first operation corresponds to a first operation position, the distance between the first operation position and the first space position is a first distance, the distance between the first operation position and the second space position is a second distance, and the first distance is larger than the second distance.
In a seventh possible implementation manner, with reference to the second aspect, the user being able to perceive that the first operation object is located at the second spatial position, where the first spatial position and the second spatial position are different, specifically includes:
the second spatial position is located on a second spatial plane, the second spatial plane is a spatial plane parallel to the display plane where the first operation object is located, the display plane is a plane where the display unit of the electronic device is located, a third distance is provided between the second spatial plane and the user, and a fourth distance is provided between the display plane and the user, wherein the third distance is smaller than the fourth distance.
In an eighth possible implementation manner, with reference to the seventh possible implementation manner, the electronic device specifically further satisfies:
the first spatial position is located on a first spatial plane, the first spatial plane is a spatial plane parallel to the display plane where the first operation object is located, a fifth distance exists between the first spatial plane and the user, the fifth distance is smaller than the fourth distance, and the fifth distance is larger than the third distance.
According to the display method and display device provided by the embodiments of the invention, the electronic device selects an operation object by detecting the user's operation action and, by determining a second spatial position, switches the selected first operation object from the first display state to the second display state, thereby changing the spatial position of the first operation object. This solves the problem that the user cannot trigger a control by touching it directly because the control is far from the user's operating position on the operation interface.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of a display method according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of another display method according to an embodiment of the present invention;
FIG. 3 is a flowchart of a display method according to another embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of another electronic device according to an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of another electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
An embodiment of the present invention provides a display method, which is shown in fig. 1, and includes:
101. the electronic equipment displays at least one operation object.
And the operation object can be triggered to display the running result of the application corresponding to the operation object.
The electronic equipment displays at least one operation object for the user, so that the user can conveniently select and operate according to actual needs.
102. The electronic device detects a first operation of a user.
The electronic device can capture the operation action made by the user through a camera, and determine the operation object the user requires by calculating the direction of that action.
103. The electronic equipment determines a first operation object from at least one operation object according to the first operation.
Wherein the first operation object is in a first display state.
According to the detected first operation, the electronic device selects the operation object required by the user from the at least one displayed operation object. The currently selected operation object is the first operation object, and its current display state is the selected state, where a display state includes at least one of the size, the color, and the spatial position of the operation object.
104. The electronic device generates a display adjustment instruction.
105. The electronic equipment changes the display parameters of the first operation object according to the display adjusting instruction, so that the first operation object is switched from the first display state to the second display state.
When the first operation object is in the first display state, the user can perceive that the first operation object is located at a first spatial position, and when the first operation object is in the second display state, the user can perceive that the first operation object is located at a second spatial position, and the first spatial position and the second spatial position are different.
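To make the flow of steps 101-105 concrete, the following sketch (not part of the patent; the data types, field names, and the detect/select/render callbacks are illustrative assumptions) shows one way the five steps could be wired together:

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class OperationObject:
    name: str
    spatial_position: Vec3      # position the user perceives (first or second spatial position)

@dataclass
class FirstOperation:
    position: Vec3              # spatial position of the operation body (e.g. a fingertip)
    direction: Vec3             # pointing direction, used by the selection strategy

@dataclass
class DisplayAdjustmentInstruction:
    display_position: Vec3      # display position parameter for the second spatial position

def display_method(objects: List[OperationObject],
                   detect: Callable[[], FirstOperation],
                   select: Callable[[List[OperationObject], FirstOperation], OperationObject],
                   render: Callable[[List[OperationObject]], None]) -> OperationObject:
    render(objects)                                      # 101: display at least one operation object
    first_op = detect()                                  # 102: detect the user's first operation
    first_obj = select(objects, first_op)                # 103: determine the first operation object
    instr = DisplayAdjustmentInstruction(                # 104: generate a display adjustment instruction
        display_position=first_op.position)              #      second spatial position near the operation body
    first_obj.spatial_position = instr.display_position  # 105: switch to the second display state
    render(objects)
    return first_obj
```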
According to the display method provided by the embodiment of the invention, the electronic equipment selects the operation object by detecting the operation action of the user, determines the second spatial position according to the operation action of the user, switches the selected first operation object from the first display state to the second display state, and further shortens the distance between the user and the operation object in the subsequent operation process by changing the spatial position of the first operation object. Therefore, the problem that the user cannot trigger the control directly through touch due to the fact that the control is far away from the operation position of the user on the operation interface is solved, and the user experience is improved.
Specifically, the following description will be given with reference to specific examples.
On the basis of the embodiment of fig. 1, and referring to fig. 2, an embodiment of the present invention provides a display method in which the electronic device selects one of the operation objects it displays according to a selection action performed by the user, and moves that operation object from a first spatial position to a second spatial position by changing its display state, so that the operation object is moved to the user's trigger position. In short, the electronic device moves the selected first operation object, by changing its display state, to a spatial position whose distance from the user is less than a preset threshold. Referring to fig. 2, the specific steps for changing the display state of the first operation object are as follows:
201. the electronic equipment displays at least one operation object.
And the operation object can be triggered to display the running result of the application corresponding to the operation object.
The electronic equipment displays at least one operation object for the user, so that the user can conveniently select and operate according to actual needs.
202. The electronic device detects a first operation of a user.
The electronic device captures the operation action made by the user through the camera so as to calculate the operation object required by the user according to the pointing direction of the operation action.
203. The electronic equipment determines a first operation object from at least one operation object according to the first operation.
Wherein the first operation object is in a first display state.
According to the detected first operation, the electronic device selects the operation object required by the user from the at least one displayed operation object. The currently selected operation object is the first operation object, and its current display state is the selected state, where a display state includes at least one of the size, the color, and the spatial position of the operation object.
Specifically, the electronic device selects the first operation object by detecting the first operation, including:
a. The electronic device collects an image of the user's first operation, analyzes the image to determine the direction of the first operation, and determines the operation object that lies on the extension line of that direction as the selected first operation object.
The electronic device can capture the user's behavior through a camera. When the user's finger points toward the display plane of the electronic device, the camera captures the pointing action, the position where the extension line of the finger's pointing direction meets the display plane is calculated, and if an operation object exists at that position, it is determined to be the first operation object.
Or,
b. The electronic device collects an image of the user's first operation and an image of the user's head, analyzes the two images to determine the direction of the extension line that runs from the head through the first operation, and determines the operation object on that extension line as the first operation object.
The electronic device can capture the user's behavior through a camera. It can measure the distance between the user's eyes and the display plane to calculate the focus point of the user's gaze, and then capture the user's body movement (for example, a pointing finger). By combining the gaze focus point with the finger's pointing direction, the electronic device determines the extension line from the gaze focus point through the finger, and the operation object on the display screen at which that line points is determined to be the first operation object.
Or,
c. The electronic device acquires an image of the user's first operation, and determines, through analysis, the operation object whose distance from the projection point of the first operation on the electronic device is smaller than a preset threshold as the first operation object.
As in methods a and b, the electronic device can capture the user's behavior through a camera and determine the first operation object from the point at which the user's finger projects onto the display plane: it obtains the projection point of the finger on the display plane from the user's clicking action, and then determines the operation object whose distance from that projection point is smaller than the preset threshold as the first operation object the user wants to select.
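None of the image-analysis details are specified above; assuming the camera pipeline already yields 3D coordinates for the fingertip (and, for method b, the head), the geometric part of methods a-c can be sketched as follows. The coordinate convention (display plane at z = 0) and the threshold value are assumptions, not taken from the patent:

```python
import math
from typing import Optional, Sequence, Tuple

Vec3 = Tuple[float, float, float]

def ray_plane_hit(origin: Vec3, direction: Vec3, plane_z: float = 0.0) -> Optional[Vec3]:
    """Intersect a pointing ray with the display plane, modelled here as z = plane_z."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if abs(dz) < 1e-9:
        return None                       # pointing parallel to the display plane
    t = (plane_z - oz) / dz
    if t < 0:
        return None                       # pointing away from the display
    return (ox + t * dx, oy + t * dy, plane_z)

def head_through_finger(head: Vec3, finger: Vec3) -> Tuple[Vec3, Vec3]:
    """Method b: the ray runs from the head through the fingertip."""
    return head, tuple(f - h for f, h in zip(finger, head))

def select_nearest(objects: Sequence, point: Optional[Vec3], threshold: float = 0.05):
    """Methods a/c: pick the operation object closest to the hit/projection point,
    provided the distance is below the preset threshold (spatial_position is an
    assumed attribute of each displayed object)."""
    if point is None:
        return None
    best, best_d = None, threshold
    for obj in objects:
        d = math.dist(obj.spatial_position, point)
        if d < best_d:
            best, best_d = obj, d
    return best
```

Method a would call `ray_plane_hit` with the fingertip as the ray origin and the pointing direction; method c passes the finger's projection point on the display plane directly to `select_nearest`.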
204. The electronic device generates a display adjustment instruction.
The electronic device generating the display adjustment instruction specifically comprises:
a. The electronic device determines the spatial position of the operation body as the second spatial position by collecting the spatial position corresponding to the operation body with which the user performs the first operation.
Here, the electronic device may collect the spatial position of the user's operation body through a camera. The operation body is the part of the user's body that performs the selecting and triggering actions, for example a finger or the eyes. Taking a finger as the example, the electronic device may capture the finger's motion (e.g. tapping, zooming) through the camera and take the finger's current spatial position as the second spatial position, so that the display adjustment instruction generated from this second spatial position moves the first operation object from the first spatial position to the second spatial position where the finger is.
b. The electronic device generates a display adjustment instruction by determining the second spatial location.
Wherein the display state parameter comprises a display position parameter corresponding to the second spatial position.
Using the first spatial position of the first operation object and the collected second spatial position of the user's operation body, the electronic device calculates either the position difference between the two positions or the absolute coordinates of the second spatial position, and then generates the display adjustment instruction from the second spatial position.
Further, according to the method in step a: the determining, by the electronic device, the second spatial location specifically includes:
a1, the electronic device determines a first operation position for the first operation.
The electronic device can collect, through the camera, the spatial position at which the user's operation body performs the first operation. Taking a finger as the example, the electronic device captures the finger's action and its spatial position, and takes that spatial position as the first operation position, i.e. the position at which the user acts on the first operation object displayed by the electronic device.
a2, the electronic device determines a second spatial position according to the first operating position.
Wherein the distance between the second spatial position and the first operational position is less than a threshold value.
The second spatial position is the position to which the first operation object is moved when the display adjustment instruction switches it from the first display state at the first spatial position to the second display state. The second spatial position is determined from the spatial position of the user's operation body and lies in a predetermined direction from the operation position, the predetermined direction being toward the display unit; the second spatial position may also coincide with the first operation position.
The embodiment of the invention uses a camera as an example of how the operation action and spatial position of the user's operation body are collected, but collection is not limited to a camera.
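As a rough illustration of step 204 (an assumption-laden sketch rather than the patent's implementation), the second spatial position can be taken from the captured operation position, optionally nudged toward the display unit, and packed into the adjustment instruction; the offset and threshold values are invented for the example:

```python
import math
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class DisplayAdjustmentInstruction:
    display_position: Vec3   # display position parameter corresponding to the second spatial position

def second_spatial_position(first_operation_position: Vec3,
                            toward_display: Vec3 = (0.0, 0.0, 1.0),
                            offset: float = 0.0,
                            threshold: float = 0.05) -> Vec3:
    """Steps a1/a2: start from the operation body's position (it may simply coincide with it,
    offset = 0) and keep the result within the threshold distance of the first operation position."""
    candidate = tuple(p + offset * d for p, d in zip(first_operation_position, toward_display))
    if math.dist(candidate, first_operation_position) >= threshold:
        raise ValueError("second spatial position must stay within the threshold of the operation position")
    return candidate

def generate_adjustment_instruction(first_operation_position: Vec3) -> DisplayAdjustmentInstruction:
    """Step b: the instruction carries the display position parameter for the second spatial position."""
    return DisplayAdjustmentInstruction(second_spatial_position(first_operation_position))
```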
205. The electronic equipment changes the display parameters of the first operation object according to the display adjusting instruction, so that the first operation object is switched from the first display state to the second display state.
When the first operation object is in the first display state, the user can perceive that the first operation object is located at a first spatial position, and when the first operation object is in the second display state, the user can perceive that the first operation object is located at a second spatial position, and the first spatial position and the second spatial position are different.
Optionally, the step of changing, by the electronic device, the display parameter of the first operation object according to the display adjustment instruction specifically includes:
a. the electronic equipment provides a spatial position parameter for changing the first spatial position corresponding to the first display state for the first operation object according to the second spatial position corresponding to the second display state.
The electronic device calculates the second spatial position parameter and determines the second spatial position by acquiring an image of the user's operation body. After the display adjustment instruction is generated, the first operation object switches from the first display state to the second display state: the spatial position parameter of the first spatial position corresponding to the first display state is replaced by the calculated second spatial position parameter, so that the first operation object moves to the second spatial position.
Or,
b. the electronic equipment moves the first operation object from a first space position corresponding to the first display state to a second space position corresponding to the second display state according to the preset position variable parameter.
The electronic device calculates the second spatial position parameter and determines the second spatial position by acquiring an image of the user's operation body. After the display adjustment instruction is generated, a spatial change parameter for the first operation object is derived from the preset position variable parameter and the calculated second spatial position parameter, so that the first operation object moves to the second spatial position according to that change parameter.
Wherein changing the first display state further comprises changing at least one of a size and a color of the first operation object.
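The two ways of applying the instruction described above, plus the optional size/colour change, might look like the following sketch; the object fields, the step size, and the colour value are assumptions carried over from the earlier sketches:

```python
def apply_absolute(obj, instruction):
    """Method a: replace the spatial position parameter of the first display state with the
    second spatial position carried by the instruction."""
    obj.spatial_position = instruction.display_position

def apply_incremental(obj, instruction, step=0.02):
    """Method b: move toward the second spatial position by a preset position variable
    parameter (step) per update, giving a gradual transition; returns True when done."""
    cur, target = obj.spatial_position, instruction.display_position
    delta = [t - c for c, t in zip(cur, target)]
    dist = sum(d * d for d in delta) ** 0.5
    if dist <= step:
        obj.spatial_position = target
        return True
    obj.spatial_position = tuple(c + step * d / dist for c, d in zip(cur, delta))
    return False

def emphasize(obj, scale=1.2, color=(255, 255, 0)):
    """Optional extra change of display state: size and/or colour (assumed attributes)."""
    obj.size = getattr(obj, "size", 1.0) * scale
    obj.color = color
```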
Optionally, the first operation corresponds to a first operation position, a distance between the first operation position and the first spatial position is a first distance, and a distance between the first operation position and the second spatial position is a second distance, where the first distance is greater than the second distance.
In practice the first operation position of the user's operation body may coincide with the second spatial position, in which case the second distance between the first operation position and the second spatial position is smaller than the preset threshold.
Optionally, the second spatial position is located on a second spatial plane, the second spatial plane is a spatial plane parallel to the display plane where the first operation object is located, the display plane is a plane where the display unit of the electronic device is located, a third distance is provided between the second spatial plane and the user, and a fourth distance is provided between the display plane and the user, where the third distance is smaller than the fourth distance.
Further, the first spatial position is located on a first spatial plane, the first spatial plane is a spatial plane parallel to the display plane where the first operation object is located, and the first plane and the user have a fifth distance, the fifth distance is smaller than the fourth distance, and the fifth distance is greater than the third distance.
Referring to fig. 3, the first spatial plane containing the first spatial position differs from the second spatial plane containing the second spatial position as follows. In the three-dimensional display environment, the electronic device lies in the display plane, and the operation objects it displays are perceived on the first spatial plane, which is parallel to the display plane. The distance between the display plane and the plane containing the first operation position of the user's operation body is greater than the distance between the first spatial plane and that plane; that is, the fourth distance between the display plane and the user is greater than the fifth distance between the first spatial plane and the user. Since the second spatial plane is determined from the first operation position, and in practice may coincide with the plane containing the first operation position, the fifth distance is greater than the third distance between the second spatial plane and the user.
The embodiment of the present invention takes a three-dimensional display environment as an example, but it is equally applicable to a two-dimensional display environment. In a two-dimensional display environment the electronic device lies in the display plane, the at least one operation object it displays lies in that display plane, the display plane is then the first spatial plane, and the operation position of the user's operation body may serve as the second spatial position.
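The depth relations among the three planes discussed above (third distance < fifth distance < fourth distance) can be checked with a few lines, assuming all planes are parallel to the display and distances are measured along the viewing axis; this simplified geometry is an assumption for illustration, not stated in the patent:

```python
def plane_distances(user_z, display_z, first_plane_z, second_plane_z):
    """Return (third, fourth, fifth): user-to-second-plane, user-to-display-plane,
    and user-to-first-plane distances."""
    return (abs(second_plane_z - user_z),
            abs(display_z - user_z),
            abs(first_plane_z - user_z))

def depth_ordering_ok(user_z, display_z, first_plane_z, second_plane_z):
    """Second plane closest to the user, display plane farthest: third < fifth < fourth."""
    third, fourth, fifth = plane_distances(user_z, display_z, first_plane_z, second_plane_z)
    return third < fifth < fourth

# Example: user at z = 0, display at 1.0 m, first spatial plane at 0.7 m, second at 0.3 m.
assert depth_ordering_ok(0.0, 1.0, 0.7, 0.3)
```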
206. The electronic equipment detects a second operation of the user, judges whether the second operation is matched with an operation action corresponding to a preset trigger instruction, and if so, determines that the second operation is the operation action corresponding to the trigger instruction.
The electronic device may collect the motion of the user's operation body through the camera and match it against the operation action corresponding to the preset trigger instruction. Taking a finger as the example of the operation body, the electronic device collects the finger's click on the first operation object and compares that click with the preset trigger action; if they match, the trigger instruction is activated and the first operation object at the second spatial position is triggered.
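Step 206 amounts to gesture matching plus a proximity test against the relocated object. A minimal sketch follows; the gesture labels, the `trigger()` hook, and the threshold are all assumptions, not defined by the patent:

```python
import math

PRESET_TRIGGER_ACTIONS = {"tap", "pinch"}   # hypothetical preset trigger gestures

def handle_second_operation(gesture_label, gesture_position, first_obj, threshold=0.05):
    """If the detected second operation matches a preset trigger action and lands close
    enough to the first operation object at its second spatial position, trigger it."""
    if gesture_label not in PRESET_TRIGGER_ACTIONS:
        return False
    if math.dist(gesture_position, first_obj.spatial_position) > threshold:
        return False
    first_obj.trigger()    # hypothetical hook: run the application bound to the object
    return True
```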
According to the display method provided by the embodiment of the invention, the electronic device selects the first operation object from the operation objects it displays by collecting the action of the user's operation body, determines from the operation body's spatial position the spatial position to which the first operation object is switched when it goes from the first display state to the second display state, and, by changing the spatial position of the first operation object, shortens the distance between the user and the operation object in subsequent operation. This solves the problem that the user cannot trigger a control by touching it directly because the control is far from the user's operating position on the operation interface, and improves the user experience.
The present invention provides an electronic device 3, which may be any intelligent communication terminal in the field of electronic intelligent display, such as a mobile phone, a tablet computer, or any device capable of two-dimensional or three-dimensional display; the embodiments of the present invention place no specific limitation on the form of the electronic device. The electronic device can perform any display method provided by the embodiments of the present invention and, as shown in fig. 4, includes: a display unit 31, a detection unit 32, a selecting unit 33, a setting unit 34 and an adjusting unit 35, wherein,
the display unit 31 is configured to display at least one operation object, where the operation object can be triggered to display an operation result of an application corresponding to the operation object;
a detection unit 32 for detecting a first operation by a user;
the selecting unit 33 is configured to determine a first operation object from the at least one operation object according to the first operation provided by the detecting unit, where the first operation object is in a first display state;
a setting unit 34 for generating a display adjustment instruction;
an adjusting unit 35, configured to change a display parameter of the first operation object according to the display adjustment instruction generated by the setting unit, so that the first operation object is switched from the first display state to the second display state;
when the first operation object is in the first display state, the user can perceive that the first operation object is located at a first spatial position, and when the first operation object is in the second display state, the user can perceive that the first operation object is located at a second spatial position, and the first spatial position and the second spatial position are different.
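A compact way to picture how the five units of fig. 4 cooperate is the composition below; the unit interfaces (`show`, `detect`, `select`, `make_instruction`, `apply`) are invented for illustration and are not defined in the patent:

```python
class ElectronicDevice:
    """Sketch of the structure in fig. 4: the detection result feeds the selecting unit,
    whose output, together with the setting unit's instruction, drives the adjusting unit."""

    def __init__(self, display_unit, detection_unit, selecting_unit, setting_unit, adjusting_unit):
        self.display_unit = display_unit        # display unit 31
        self.detection_unit = detection_unit    # detection unit 32
        self.selecting_unit = selecting_unit    # selecting unit 33
        self.setting_unit = setting_unit        # setting unit 34
        self.adjusting_unit = adjusting_unit    # adjusting unit 35

    def run_once(self, objects):
        self.display_unit.show(objects)
        first_op = self.detection_unit.detect()
        first_obj = self.selecting_unit.select(objects, first_op)
        instruction = self.setting_unit.make_instruction(first_op)
        self.adjusting_unit.apply(first_obj, instruction)
        return first_obj
```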
According to the electronic device provided by the embodiment of the invention, the electronic device selects the operation object by detecting the operation action of the user, determines the second spatial position according to the operation action of the user, switches the selected first operation object from the first display state to the second display state, and further shortens the distance between the user and the operation object in the subsequent operation process by changing the spatial position of the first operation object. Therefore, the problem that the user cannot trigger the control directly through touch due to the fact that the control is far away from the operation position of the user on the operation interface is solved, and the user experience is improved.
Optionally, the selecting unit 33 is specifically configured to:
acquiring an image of the user's first operation, analyzing the image to determine the direction of the first operation, and determining the operation object that lies on the extension line of that direction as the selected first operation object;
or,
acquiring an image of the user's first operation and an image of the user's head, analyzing the two images to determine the direction of the extension line that runs from the head through the first operation, and determining the operation object on that extension line as the first operation object;
or,
acquiring an image of the user's first operation, and determining, through analysis, the operation object whose distance from the projection point of the first operation on the electronic device is smaller than a preset threshold as the first operation object.
Alternatively, as shown in fig. 5, the setting unit 34 includes:
the acquisition subunit 341 is configured to determine, by acquiring a spatial position corresponding to an operation body that a user performs a first operation, that the spatial position of the operation body is a second spatial position;
the setting subunit 342 is configured to generate a display adjustment instruction by determining a second spatial position, where the display state parameter includes a display position parameter corresponding to the second spatial position.
Further, the collecting subunit 341 is specifically configured to:
determining a first operation position of a first operation;
and determining a second spatial position according to the first operation position, wherein the distance between the second spatial position and the first operation position is less than a threshold value.
Optionally, the adjusting unit 35 is specifically configured to:
providing a spatial position parameter for changing a first spatial position corresponding to the first display state for the first operation object according to a second spatial position corresponding to the second display state;
or,
and moving the first operation object from a first space position corresponding to the first display state to a second space position corresponding to the second display state according to the preset position variable parameter.
Optionally, as shown in fig. 6, the electronic device 3 further includes:
the triggering unit 36 is configured to detect a second operation of the user, determine whether the second operation matches an operation action corresponding to a preset triggering instruction, and if so, determine that the second operation is the operation action corresponding to the triggering instruction.
Optionally, the first operation corresponds to a first operation position, a distance between the first operation position and the first spatial position is a first distance, and a distance between the first operation position and the second spatial position is a second distance, where the first distance is greater than the second distance.
Optionally, the second spatial position is located on a second spatial plane, the second spatial plane is a spatial plane parallel to the display plane where the first operation object is located, the display plane is a plane where the display unit of the electronic device is located, a third distance is provided between the second spatial plane and the user, and a fourth distance is provided between the display plane and the user, where the third distance is smaller than the fourth distance.
Further, optionally, the first spatial position is located on a first spatial plane, the first spatial plane is a spatial plane parallel to the display plane where the first operation object is located, and a fifth distance exists between the first plane and the user, where the fifth distance is smaller than the fourth distance, and the fifth distance is greater than the third distance.
According to the electronic device provided by the embodiment of the invention, the electronic device selects the first operation object from the operation objects it displays by collecting the action of the user's operation body, determines from the operation body's spatial position the spatial position to which the first operation object is switched when it goes from the first display state to the second display state, and, by changing the spatial position of the first operation object, shortens the distance between the user and the operation object in subsequent operation. This solves the problem that the user cannot trigger a control by touching it directly because the control is far from the user's operating position on the operation interface, and improves the user experience.
Those of ordinary skill in the art will understand that all or part of the steps of the method embodiments may be implemented by program instructions executed on relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments; the storage medium includes any medium that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (12)

1. A display method is applied to electronic equipment, and the method comprises the following steps:
displaying at least one operation object, wherein the operation object can be triggered to display an operation result of an application corresponding to the operation object;
detecting a first operation of a user;
determining a first operation object from the at least one operation object according to the first operation, wherein the first operation object is in a first display state;
generating a display adjustment instruction, specifically comprising: determining the spatial position of the operation body as a second spatial position by acquiring the spatial position corresponding to the operation body which performs the first operation by the user; generating the display adjustment instruction by determining the second spatial position, wherein display parameters include a display position parameter corresponding to the second spatial position;
changing the display parameters of the first operation object according to the display adjusting instruction, so that the first operation object is switched from the first display state to a second display state;
wherein, when the first operation object is in the first display state, the user can perceive that the first operation object is located at a first spatial position, and when the first operation object is in the second display state, the user can perceive that the first operation object is located at a second spatial position, and the first spatial position and the second spatial position are different;
wherein the user being able to perceive that the first operation object is located at the second spatial position, the first spatial position and the second spatial position being different, includes:
the second spatial position is located on a second spatial plane, the second spatial plane is a spatial plane parallel to a display plane where the first operation object is located, the display plane is a plane where the display unit of the electronic device is located, a third distance is provided between the second spatial plane and a user, and a fourth distance is provided between the display plane and the user, wherein the third distance is smaller than the fourth distance; the first space position is located on a first space plane, the first space plane is a space plane parallel to a display plane where the first operation object is located, a fifth distance exists between the first space plane and the user, the fifth distance is smaller than the fourth distance, and the fifth distance is larger than the third distance.
2. The method of claim 1, wherein the determining a first operation object from the at least one operation object according to the first operation comprises:
acquiring an image of the first operation of the user, analyzing the image to determine the direction of the first operation, and determining the operation object that lies on the extension line of that direction as the selected first operation object;
or,
acquiring an image of the first operation of the user and an image of the user's head, analyzing the head image and the image of the first operation to determine the direction of the extension line that runs from the head through the first operation, and determining the operation object on that extension line as the first operation object;
or,
acquiring an image of the first operation of the user, and determining, through analysis, the operation object whose distance from the projection point of the first operation on the electronic device is smaller than a preset threshold as the first operation object.
3. The method according to claim 1, wherein the determining that the spatial position of the operation body is the second spatial position by collecting the spatial position corresponding to the operation body that the user performed the first operation comprises:
determining a first operation position of the first operation;
determining the second spatial position in dependence on the first operational position, wherein a distance between the second spatial position and the first operational position is less than a threshold.
4. The method according to any one of claims 1 to 3, wherein the changing the display parameter of the first operation object according to the display adjustment instruction comprises:
providing a spatial position parameter for changing the first spatial position corresponding to the first display state for the first operation object according to the second spatial position corresponding to the second display state;
or,
and moving the first operation object from the first space position corresponding to the first display state to the second space position corresponding to the second display state according to a preset position variable parameter.
5. The method of claim 1, further comprising:
detecting a second operation of the user, judging whether the second operation is matched with an operation action corresponding to a preset trigger instruction, and if so, determining that the second operation is the operation action corresponding to the trigger instruction.
6. The method of any one of claims 1 to 3, 5, wherein the first and second spatial locations are different, comprising:
the first operation corresponds to a first operation position, the distance between the first operation position and the first space position is a first distance, the distance between the first operation position and the second space position is a second distance, and the first distance is larger than the second distance.
7. An electronic device applied to a display method, comprising:
the display unit is used for displaying at least one operation object, wherein the operation object can be triggered to display the running result of the application corresponding to the operation object;
a detection unit for detecting a first operation by a user;
the selecting unit is used for determining a first operation object from the at least one operation object according to the first operation provided by the detecting unit, wherein the first operation object is in a first display state;
a setting unit for generating a display adjustment instruction;
the adjusting unit is used for changing the display parameters of the first operation object according to the display adjusting instruction generated by the setting unit so that the first operation object is switched from a first display state to a second display state;
wherein, when the first operation object is in the first display state, the user can perceive that the first operation object is located at a first spatial position, and when the first operation object is in the second display state, the user can perceive that the first operation object is located at a second spatial position, and the first spatial position and the second spatial position are different;
wherein the user being able to perceive that the first operation object is located at the second spatial position, the first spatial position and the second spatial position being different, includes:
the second spatial position is located on a second spatial plane, the second spatial plane is a spatial plane parallel to a display plane where the first operation object is located, the display plane is a plane where the display unit of the electronic device is located, a third distance is provided between the second spatial plane and a user, and a fourth distance is provided between the display plane and the user, wherein the third distance is smaller than the fourth distance; the first spatial position is located on a first spatial plane, the first spatial plane is a spatial plane parallel to a display plane where the first operation object is located, a fifth distance exists between the first spatial plane and the user, the fifth distance is smaller than the fourth distance, and the fifth distance is larger than the third distance;
wherein the setting unit comprises:
an acquisition subunit, configured to acquire the spatial position of the operation body with which the user performs the first operation, and to determine that spatial position as the second spatial position;
and a setting subunit, configured to generate the display adjustment instruction based on the determined second spatial position, wherein the display parameter comprises a display position parameter corresponding to the second spatial position.
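To make the unit structure of claim 7 easier to follow, here is a structural sketch with one class per claimed unit, wired together in the order the claim describes. All class and method names are assumptions; a real device would back them with camera, depth-sensing, and 3D-display components.

```python
import math
from dataclasses import dataclass

@dataclass
class DisplayAdjustmentInstruction:
    object_id: int
    target_position: tuple   # display position parameter for the second state

class DetectionUnit:
    def detect_first_operation(self):
        # Placeholder: return the 3D position of the operation body (e.g. a hand).
        return (0.2, 0.5, 0.3)

class SelectingUnit:
    def select(self, operation_position, objects):
        # Placeholder: pick the operation object the first operation points at
        # (here simply the nearest one).
        return min(objects, key=lambda o: math.dist(o["position"], operation_position))

class SettingUnit:
    def generate_instruction(self, operation_position, obj):
        # Acquisition subunit: take the operation body's position as the second
        # spatial position; setting subunit: wrap it into an instruction.
        return DisplayAdjustmentInstruction(obj["id"], operation_position)

class AdjustingUnit:
    def apply(self, instruction, objects):
        # Change the display parameter so the object switches display states.
        for obj in objects:
            if obj["id"] == instruction.object_id:
                obj["position"] = instruction.target_position

objects = [{"id": 1, "position": (0.8, 0.5, 0.9)},
           {"id": 2, "position": (0.1, 0.1, 0.9)}]
operation_position = DetectionUnit().detect_first_operation()
chosen = SelectingUnit().select(operation_position, objects)
instruction = SettingUnit().generate_instruction(operation_position, chosen)
AdjustingUnit().apply(instruction, objects)
```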
8. The device according to claim 7, wherein the selecting unit is specifically configured to:
acquire an image of the first operation of the user, analyze the image to determine a direction of the first operation, and determine the operation object located on an extension line of that direction as the selected first operation object;
or,
acquire an image of the first operation of the user and an image of the user's head, analyze the head image and the image of the first operation to determine a direction extending from the head through the position of the first operation, and determine the operation object located on the extension line of that direction as the first operation object;
or,
acquire an image of the first operation of the user, and determine, by analyzing the image, an operation object whose distance to the projection point of the first operation on the electronic device is smaller than a preset threshold as the first operation object.
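The third alternative of claim 8 (selection by projection-point distance) can be sketched geometrically: project the first operation onto the display plane and pick the operation object closest to that projection point, provided it lies within the preset threshold. The orthographic projection and the threshold value are assumptions for illustration.

```python
import math

def select_by_projection(operation_position, objects, threshold=0.15):
    """operation_position: (x, y, z) of the user's hand; objects maps an
    object id to its (x, y) position on the display plane."""
    projection = operation_position[:2]     # drop depth: orthographic projection
    best_id, best_distance = None, threshold
    for object_id, xy in objects.items():
        distance = math.dist(xy, projection)
        if distance < best_distance:
            best_id, best_distance = object_id, distance
    return best_id                          # None if nothing is within the threshold

objects = {"browser": (0.30, 0.55), "mail": (0.80, 0.20)}
print(select_by_projection((0.32, 0.50, 0.40), objects))   # browser
```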
9. The apparatus according to claim 7, wherein the acquisition subunit is specifically configured to:
determine a first operation position of the first operation;
and determine the second spatial position according to the first operation position, wherein a distance between the second spatial position and the first operation position is less than a threshold.
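A minimal sketch of claim 9, assuming the second spatial position is placed a small fixed offset in front of the first operation position so that it stays within the threshold; the offset and threshold values are assumptions.

```python
import math

def second_position_near(operation_position, threshold=0.10, offset=(0.0, 0.0, 0.05)):
    """Choose a second spatial position close to the first operation position."""
    candidate = tuple(p + o for p, o in zip(operation_position, offset))
    assert math.dist(candidate, operation_position) < threshold
    return candidate

print(second_position_near((0.2, 0.5, 0.3)))   # roughly (0.2, 0.5, 0.35)
```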
10. The apparatus according to any one of claims 7 to 9, wherein the adjusting unit is specifically configured to:
provide, for the first operation object, a spatial position parameter for changing the first spatial position corresponding to the first display state, according to the second spatial position corresponding to the second display state;
or,
and move the first operation object from the first spatial position corresponding to the first display state to the second spatial position corresponding to the second display state according to a preset position variation parameter.
11. The apparatus of claim 7, further comprising:
a triggering unit, configured to detect a second operation of the user, determine whether the second operation matches an operation action corresponding to a preset trigger instruction, and if so, determine that the second operation is the operation action corresponding to the trigger instruction.
12. The apparatus of any one of claims 7 to 9 and 11, wherein the first spatial position and the second spatial position being different comprises:
the first operation corresponds to a first operation position, a distance between the first operation position and the first spatial position is a first distance, a distance between the first operation position and the second spatial position is a second distance, and the first distance is greater than the second distance.
CN201310062955.5A 2013-02-28 2013-02-28 A kind of display methods and equipment Active CN104714728B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310062955.5A CN104714728B (en) 2013-02-28 2013-02-28 A kind of display methods and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310062955.5A CN104714728B (en) 2013-02-28 2013-02-28 A kind of display methods and equipment

Publications (2)

Publication Number Publication Date
CN104714728A CN104714728A (en) 2015-06-17
CN104714728B (en) 2018-10-12

Family

ID=53414122

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310062955.5A Active CN104714728B (en) 2013-02-28 2013-02-28 A kind of display methods and equipment

Country Status (1)

Country Link
CN (1) CN104714728B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108182019B (en) * 2018-01-16 2020-03-17 Vivo Mobile Communication Co., Ltd. Suspension control display processing method and mobile terminal
CN110418059B (en) * 2019-07-30 2021-12-24 Lenovo (Beijing) Ltd Image processing method and device applied to electronic equipment, electronic equipment and medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1380996A (en) * 2000-05-17 2002-11-20 Koninklijke Philips Electronics N.V. Apparatus and method for indicating target by image processing without three-dimensional modeling
CN102270037A (en) * 2010-06-04 2011-12-07 Acer Inc. Manual human machine interface operation system and method thereof
CN102385438A (en) * 2010-08-31 2012-03-21 Sony Corporation Information processing device, information processing method, and program
CN102402379A (en) * 2010-09-14 2012-04-04 LG Electronics Inc. Mobile terminal and controlling method thereof
CN102592569A (en) * 2011-01-10 2012-07-18 Lenovo (Beijing) Ltd Electronic equipment and display method
CN102693063A (en) * 2011-03-23 2012-09-26 Lenovo (Beijing) Ltd Operation control method and device and electronic equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8086971B2 (en) * 2006-06-28 2011-12-27 Nokia Corporation Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant