WO2011069435A1 - Method and terminal device for operation control of an operation object - Google Patents

Method and terminal device for operation control of an operation object

Info

Publication number
WO2011069435A1
WO2011069435A1 PCT/CN2010/079507
Authority
WO
WIPO (PCT)
Prior art keywords
unit
image
operation direction
terminal device
window
Prior art date
Application number
PCT/CN2010/079507
Other languages
English (en)
French (fr)
Inventor
张渊毅
刘俊峰
王茜莺
贺志强
Original Assignee
北京联想软件有限公司
联想(北京)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN200910241771.9A (patent CN102087571B)
Priority claimed from CN200910242423.3A (patent CN102096517B)
Application filed by 北京联想软件有限公司, 联想(北京)有限公司
Priority to US13/513,948 (patent US9836139B2)
Publication of WO2011069435A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • The present invention relates to a handheld terminal device, and more particularly to a method and a terminal device for operation control of an operation object.
  • More and more handheld devices pursue thin bodies and large screens, and touch technology is increasingly used on them. As the computing power of handheld devices has increased, displaying 3D graphics, such as street-view maps and stereoscopic menus, has become common.
  • Panning and rotation are two common operation modes that can be performed by touch.
  • Usually only one operation mode is supported at a time; that is, either a pan operation or a rotation operation can be performed. To perform the other type of operation, the user must switch modes, otherwise the operation would be ambiguous.
  • FIG. 1 shows the effect of a pan operation performed on the touch screen of an existing handheld device.
  • FIG. 2 shows the effect of a rotation operation performed on the touch screen of an existing handheld device.
  • Thus the panning operation and the rotation operation cannot be performed simultaneously. To achieve the effect of both translation and rotation, the user must first perform the translation operation, and only after the operation object has been translated to the predetermined position switch to the rotation mode and rotate the object at that position.
  • The technical problem to be solved by the present invention is to provide a method and a terminal device for operation control of an operation object that can realize the effect of two operations performed simultaneously.
  • the technical problem to be solved by the present invention is to provide a terminal device and an input method, which do not affect the display effect of the user screen when implementing touch control.
  • An embodiment of the present invention provides a method for operation control of an operation object, including: acquiring a first operation direction and a second operation direction of an operation object; determining the operation corresponding to the direction combination of the first operation direction and the second operation direction; and performing the operation on the operation object.
  • The step of determining the operation corresponding to the direction combination relationship between the first operation direction and the second operation direction is: determining, according to the operation type of the operation object, the operation corresponding to the direction combination relationship of the first operation direction and the second operation direction.
  • When the operation type of the operation object is an operation on a stereoscopic operation object, the step of determining, according to the operation type, the operation corresponding to the direction combination relationship of the first operation direction and the second operation direction is specifically: when the first operation direction and the second operation direction are the same, determining that the operation is a translation operation of the stereoscopic operation object in the first operation direction or the second operation direction; or, when the first operation direction and the second operation direction are opposite, determining that the operation is a rotation operation with the perpendicular of the trajectory line formed by the two directions as the axis.
  • When the operation type of the operation object is an operation on a planar operation object, the step is specifically: when the first operation direction and the second operation direction are the same, determining that the operation is a panning operation of the planar operation object in the first operation direction or the second operation direction; or, when the first operation direction and the second operation direction are opposite, determining that the operation is an overall zooming operation of the planar operation object.
  • When the operation type of the operation object is an operation on a window operation object, the step is specifically: when the first operation direction and the second operation direction are the same first direction, determining that the operation is an opening operation of the window operation object, and the operation in the direction opposite to the opening operation is a closing operation of the window operation object; or, when the first operation direction and the second operation direction are the same second direction, determining that the operation is a maximization operation of the window operation object, and the operation in the direction opposite to the maximization operation is a reduction operation of the window operation object; wherein the first direction and the second direction are different.
  • When the first operation direction and the second operation direction are the same, the operation is determined to be a first operation; when the first operation direction and the second operation direction are opposite, the operation is determined to be a second operation.
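As an illustration of the direction-combination check described above, the two acquired directions could be classified by the angle between their unit vectors. This is a hypothetical sketch, not the patented implementation; the function name and the 20° tolerance are assumptions:

```python
import math

def classify_combination(d1, d2, tol_deg=20.0):
    """Classify two operation-direction vectors as 'same', 'opposite',
    or 'other' based on the angle between them."""
    def unit(v):
        m = math.hypot(v[0], v[1])
        return (v[0] / m, v[1] / m)
    a, b = unit(d1), unit(d2)
    dot = a[0] * b[0] + a[1] * b[1]
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    if angle <= tol_deg:
        return "same"          # first operation, e.g. translation
    if angle >= 180.0 - tol_deg:
        return "opposite"      # second operation, e.g. rotation or zoom
    return "other"

# Two fingers swiping rightward together -> the first operation:
print(classify_combination((1, 0), (1, 0.1)))   # same
# Fingers moving apart -> the second operation:
print(classify_combination((1, 0), (-1, 0)))    # opposite
```

The tolerance lets slightly non-parallel trajectories still be treated as "the same" or "opposite", matching the later description of trajectories that the system considers parallel even when they form a small angle.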
  • The method further includes: selecting a positioning point on the operation object by a third operation; when the first operation moves away from the positioning point, performing enlargement with respect to the positioning point.
  • An embodiment of the present invention further provides a terminal device, including a housing and a first operation unit disposed on the housing, and further comprising: a second operation unit disposed on the housing; the first operation unit, configured to acquire a first operation direction of an operation object; the second operation unit, configured to acquire a second operation direction of the operation object; a processing unit, configured to determine the operation corresponding to the direction combination relationship of the first operation direction and the second operation direction; and an operation execution unit, configured to perform the operation on the operation object and output the execution result of the operation.
  • The processing unit includes: a first processing subunit, configured to determine the operation type of the operation object according to an attribute feature of the operation object; and a second processing subunit, configured to determine, according to the operation type of the operation object, the operation corresponding to the direction combination of the first operation direction and the second operation direction.
  • When the first processing sub-unit determines that the operation type of the operation object is an operation on a stereoscopic operation object, the second processing sub-unit is specifically configured to: when the first operation direction and the second operation direction are on the same line and the same, determine that the operation is a translation operation of the stereoscopic operation object in the first operation direction or the second operation direction; or, when the first operation direction and the second operation direction are opposite, determine that the operation is a rotation with the perpendicular of the trajectory line formed by the two directions as the axis.
  • When the first processing sub-unit determines that the operation type of the operation object is an operation on a planar operation object, the second processing sub-unit is specifically configured to: when the first operation direction and the second operation direction are the same, determine that the operation is a translation operation of the planar operation object in the first operation direction or the second operation direction; or, when the first operation direction and the second operation direction are opposite, determine that the operation is an overall enlargement operation of the planar operation object.
  • When the first processing sub-unit determines that the operation type of the operation object is an operation on a window operation object, the second processing sub-unit is specifically configured to: when the first operation direction and the second operation direction are the same first direction, determine that the operation is an opening operation of the window operation object, and the operation in the opposite direction is a closing operation of the window operation object; or, when the first operation direction and the second operation direction are the same second direction, determine that the operation is a maximization operation of the window operation object, and the operation in the opposite direction is a reduction operation of the window operation object.
  • the first operating unit is disposed at a first position of the housing; and the second operating unit is disposed at a second position of the housing opposite the first position.
  • When the first operation direction is the same as the second operation direction, the operation is a first operation; when the first operation direction is opposite to the second operation direction, the operation is a second operation.
  • The first operation unit includes: an image collection unit; and a transparent window disposed on the image collection channel of the image collection unit, the surface of the transparent window away from the image collection unit being spaced apart from the image collection unit to form a space; wherein the image collection unit is configured to collect an image when the pointing object contacts the first surface of the transparent window, and the processing unit is configured to calculate a trajectory of the pointing object according to the image and acquire the first operation direction of the operation object according to the trajectory.
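A minimal sketch of how a processing unit might derive a trajectory direction from the collected images: the contact region is located in each frame (here simplified to the centroid of nonzero pixels in a 0/1 grid), and the direction is the displacement of that centroid across frames. All names and the image representation are illustrative assumptions, not the patented implementation:

```python
def centroid(image):
    """Centroid of the contact (nonzero) pixels in one collected frame;
    `image` is a 2-D grid of 0/1 values (an illustrative simplification)."""
    xs = ys = n = 0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            if v:
                xs += x
                ys += y
                n += 1
    return (xs / n, ys / n)

def trajectory_direction(frames):
    """Overall direction of the pointing object across successive frames:
    centroid of the last frame minus centroid of the first."""
    (x0, y0), (x1, y1) = centroid(frames[0]), centroid(frames[-1])
    return (x1 - x0, y1 - y0)

# The contact region moves one column to the right between two frames:
first = [[0, 1, 0],
         [0, 1, 0]]
last  = [[0, 0, 1],
         [0, 0, 1]]
print(trajectory_direction([first, last]))  # (1.0, 0.0)
```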
  • The first operation unit further comprises at least one light-emitting device for emitting light into the space; the light-emitting device and the image collection unit are located on the same side of the first surface.
  • The image collection unit has a shooting mode and a positioning mode; when the image collection unit operates in the positioning mode, the light-emitting device is in an activated state.
  • The light-emitting device is fixedly disposed, with its light-emitting direction toward the transparent window.
  • The light-emitting device is fixedly disposed, and the first operation unit further includes an optical device disposed in the light-emitting direction of the light-emitting device for guiding the emitted light into the space.
  • The light-emitting device is fixedly disposed, and is an annular illuminator disposed around the image collection unit.
  • The light-emitting device is fixedly disposed at one end of the transparent window, with its light-emitting direction toward the other end of the transparent window.
  • The light-emitting device is adjustable and comprises: a light-emitting unit; and an angle adjustment module connected to the light-emitting unit, configured to adjust the light-emitting unit so that it emits light into the space when the image collection unit operates in the positioning mode.
  • The light-emitting device is adjustable and comprises: a light-emitting unit; and an optical device configured to adjust the light path of the light-emitting unit so that it emits light into the space when the image collection unit operates in the positioning mode, and to adjust the light path so that the light-emitting unit emits light to the space outside the transparent window when the image collection unit operates in the shooting mode.
  • When the image collection unit operates in the positioning mode, the optical device is located on the light path of the light-emitting unit, and the emitted light enters the space through the optical device.
  • When the image collection unit operates in the shooting mode, the optical device is located outside the light path of the light-emitting unit, and the emitted light exits through the transparent window.
  • The second operation unit includes: an image collection unit; and a transparent window disposed on the image collection channel of the image collection unit, the surface of the transparent window away from the image collection unit being spaced apart from the image collection unit to form a space; wherein the image collection unit is configured to collect an image when the pointing object contacts the first surface of the transparent window, and the processing unit is configured to calculate a trajectory of the pointing object according to the image and acquire the second operation direction of the operation object according to the trajectory.
  • An embodiment of the present invention further provides a method for operation control of an operation object, the operation object including at least one display object, comprising: determining a priority of the display object; receiving an instruction of a first operation direction and a second operation direction on the display object; and, when the first operation direction and the second operation direction are opposite, displaying the display information of the current priority of the display object and the display information lower than the current priority.
  • An embodiment of the present invention further provides a terminal device, including: a storage unit, configured to store an operation object including at least one display object; a processing unit, configured to determine a priority of the display object, receive an instruction of a first operation direction and a second operation direction, and, when the first operation direction and the second operation direction are opposite, generate an instruction indicating display of the display information of the current priority of the display object and the display information lower than the current priority; and a display unit, configured to display the display information of the display object according to the instruction.
  • An embodiment of the present invention further provides a terminal device, including: a housing; a main board disposed in the housing; an operation unit disposed on the housing and connected to the main board; and a processing unit disposed in the housing and connected to the main board; wherein the operation unit comprises: an image collection unit; and a transparent window disposed on the image collection channel of the image collection unit, the transparent window being spaced apart from the image collection unit to form a space. The image collection unit is configured to collect an image when the pointing object contacts the first surface of the transparent window; and the processing unit is configured to calculate a trajectory of the pointing object according to the image and generate a corresponding input instruction according to the trajectory.
  • An embodiment of the present invention further provides an input method applied to the above terminal device, the method including: collecting an image when the pointing object contacts the first surface of the transparent window; calculating the trajectory of the pointing object according to the image; and generating a corresponding input instruction according to the trajectory.
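One way the "corresponding input instruction" could be generated from a computed trajectory is to quantize its displacement into a discrete gesture. This is an illustrative sketch; the instruction names and the minimum-length threshold are assumptions, not from the patent:

```python
def to_instruction(dx, dy, min_len=5.0):
    """Map a trajectory displacement (dx, dy) to a discrete input
    instruction: a tap if the movement is tiny, otherwise a swipe
    along the dominant axis. Screen y grows downward."""
    if (dx * dx + dy * dy) ** 0.5 < min_len:
        return "tap"
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"

print(to_instruction(20, 3))    # swipe_right
print(to_instruction(2, -30))   # swipe_up
print(to_instruction(1, -1))    # tap
```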
  • With the above solutions, the first operation direction and the second operation direction of an operation object are acquired simultaneously, and the corresponding operation is determined according to the combination relationship of the two directions, so that two operations are performed at the same time and the effect of both operations is obtained.
  • In addition, the existing image collection unit is used to capture the image of the pointing object on the transparent window surface, which is then analyzed to determine the position of the pointing object. No additional equipment is needed, keeping cost low, which is very important for small portable devices.
  • FIG. 1 is an effect diagram of a panning operation using the touch screen of a conventional handheld device;
  • FIG. 2 is an effect diagram of a rotation operation using the touch screen of a conventional handheld device;
  • FIG. 3 is a flow chart of the method for operation control of an operation object of the present invention;
  • FIG. 4 is a structural diagram of a terminal device of the present invention.
  • FIG. 5 is a schematic diagram of an operation direction of the terminal device shown in FIG. 4 performing an operation on an operation object
  • FIG. 6 is an operation effect diagram of the operation object shown in FIG. 5;
  • FIG. 7 is a schematic diagram of the operation direction of another operation performed by the terminal device of FIG. 4 on an operation object;
  • FIG. 8 is an operation effect diagram of the operation object shown in FIG. 7;
  • FIG. 9 is a schematic diagram of the operation direction of an opening operation performed by the terminal device shown in FIG. 4 on an address book operation object;
  • FIG. 10 is a schematic diagram of the operation of the terminal device shown in FIG.
  • FIG. 11 is a schematic diagram showing a relative positional relationship between a transparent window and an image collection unit according to an embodiment of the present invention;
  • FIG. 12 to FIG. 15 are schematic diagrams of possible relative positional relationships among the light-emitting device, the image collection unit, and the transparent window when a fixedly disposed light-emitting device is provided in an embodiment of the present invention;
  • FIG. 16 is a schematic diagram of the relative positional relationship among the light-emitting device, the image collection unit, and the transparent window in the normal shooting mode when an adjustable light-emitting device is provided in an embodiment of the present invention;
  • FIG. 17 is a schematic diagram of the relative positional relationship among the light-emitting device, the image collection unit, and the transparent window in the positioning mode when an adjustable light-emitting device is provided.
  • the method for controlling the operation of the operation object of the present invention includes:
  • Step 31: Obtain a first operation direction and a second operation direction of an operation object;
  • Step 32: Determine the operation corresponding to the direction combination relationship between the first operation direction and the second operation direction;
  • Step 33: Perform the operation on the operation object.
  • The method simultaneously acquires the first operation direction and the second operation direction of an operation object and determines the corresponding operation according to the combination relationship of the two directions, thereby realizing two operations performed at the same time.
  • the foregoing step 32 determines an operation corresponding to the direction combination relationship between the first operation direction and the second operation direction according to the operation type of the operation object.
  • The operation type of the operation object may include: operations on a stereoscopic operation object, operations on a planar operation object, or operations on a window operation object, but is not limited thereto; operations in each direction may also be defined for other operation objects according to specific requirements, and their implementation is similar to that of these operation types.
  • When the operation type of the operation object is an operation on a stereoscopic operation object, the step of determining, according to the operation type, the operation corresponding to the direction combination relationship of the first operation direction and the second operation direction specifically covers the following situations:
  • When the trajectory lines formed by the first operation direction and the second operation direction are parallel and in the same direction, or are considered parallel by the system (that is, the trajectories of the two directions may form a certain angle but are treated as parallel after system processing), the operation is determined to be a translation operation of the stereoscopic operation object in the first operation direction or the second operation direction.
  • The first operation direction and the second operation direction need not be strictly along the X axis or the Y axis; as long as the two directions are the same, for example both along the bisector of the angle between the X axis and the Y axis, or along the bisector of the angle between that bisector and the X axis or the Y axis, and so on, they constitute a same-direction operation.
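To illustrate, when two slightly off-axis trajectories are treated as parallel and same-direction, the applied translation could simply combine them, for example by averaging the two direction vectors. This is a hypothetical simplification, not the patented implementation:

```python
def combined_translation(d1, d2):
    """Translation applied to the operation object when the two
    trajectory directions are treated as parallel and same-direction:
    the average of the two direction vectors."""
    return ((d1[0] + d2[0]) / 2.0, (d1[1] + d2[1]) / 2.0)

# One finger drifts slightly above the X axis, the other slightly below;
# the small vertical components cancel and a pure horizontal pan results:
print(combined_translation((10, 2), (10, -2)))  # (10.0, 0.0)
```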
  • When the first operation direction and the second operation direction are opposite, the operation is determined to be a rotation operation with the perpendicular of the trajectory line formed by the two directions as the axis; that is, the trajectory lines formed by the first operation direction and the second operation direction are parallel or approximately parallel and opposite in direction, where approximately parallel means the trajectories may form a certain angle but are treated as parallel by the system. The rotation may be performed about the perpendicular of the trajectory line formed by the first operation direction or about that of the second operation direction.
  • For example, the operation may be a rotation of the stereoscopic operation object in the first operation direction centered on the X axis, or a rotation in the second operation direction centered on the Y axis.
  • The stereoscopic operation object may also rotate simultaneously about the perpendicular of the trajectory line formed by the first operation direction and about the perpendicular of the trajectory line formed by the second operation direction, for example rotating in the first operation direction about one perpendicular while rotating in the second operation direction about the other; that is, rotating in the first operation direction centered on the X axis while rotating in the second operation direction centered on the Y axis.
  • [Table 3: mapping of direction combinations to forward and negative rotations about the corresponding axes]
  • The X axis and the Y axis in Table 3 can be defined as the components on the X axis and the Y axis of any two directions of different dimensions.
  • When the trajectory is a curve, the system maps it to a straight line for processing.
  • The specific mapping method may be: determining a straight line from the start point and the end point of the trajectory, or mapping the curve to a plurality of consecutive straight segments processed in sequence.
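The two mapping methods just described can be sketched as follows. This is illustrative only; the function names and the sampling step are assumptions:

```python
def line_from_curve(points):
    """Map a curved trajectory to the straight line determined by its
    start point and end point (the first mapping method)."""
    return points[0], points[-1]

def polyline_from_curve(points, step=2):
    """Alternatively, map the curve onto several consecutive straight
    segments by sampling every `step`-th point, to be processed in turn
    (the second mapping method)."""
    sampled = points[::step]
    if sampled[-1] != points[-1]:
        sampled.append(points[-1])  # always keep the true end point
    return list(zip(sampled, sampled[1:]))

curve = [(0, 0), (1, 1), (2, 1), (3, 0), (4, 0)]
print(line_from_curve(curve))      # ((0, 0), (4, 0))
print(polyline_from_curve(curve))  # [((0, 0), (2, 1)), ((2, 1), (4, 0))]
```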
  • When the operation type of the operation object is an operation on a planar operation object, determining, according to the operation type, the operation corresponding to the direction combination relationship of the first operation direction and the second operation direction specifically covers the following situations:
  • For translation of the planar operation object: if the planar operation object is a webpage or a planar image, when the operation in the first operation direction and the operation in the second operation direction are in the same direction, the webpage moves along the first operation direction and the second operation direction, realizing the function of a scroll bar when the webpage is displayed on the small screen of a handheld device.
  • When the system detects the operation in the first operation direction and the operation in the second operation direction, the webpage is moved along those directions, and any click-trigger operation corresponding to the operation in the first operation direction is ignored, which solves the problem of a webpage link being triggered by mistake.
• the first operation direction is: from the X-axis negative direction to the X-axis positive direction (that is, from the user's point of view, from left to right), and the second operation direction is: from the X-axis positive direction to the X-axis negative direction (that is, from the user's point of view, from right to left), representing the overall magnification of the planar operation object; or
  • the first operating direction is: from the negative Y-axis to the positive Y-axis (ie from bottom to top from the user's point of view), and the second operating direction is: from the positive Y-axis to the negative Y-axis ( That is, from the user's point of view, from top to bottom), representing the overall magnification of the planar operation object; or
• the first operation direction is: a first direction along the diagonal of the X axis and the Y axis (from the user's point of view, from the lower left to the upper right), and the second operation direction is: the second direction along the diagonal of the X axis and the Y axis (from the user's point of view, from the upper right to the lower left), representing the overall magnification operation of the planar operation object;
• the combination of the first operation direction and the second operation direction is not limited to these; as long as the directions are opposite, the overall magnification operation can be represented.
  • determining an operation opposite to an overall enlargement operation direction of the planar operation object is: an overall reduction operation of the planar operation object;
• the first operation direction is: from the X-axis positive direction to the X-axis negative direction (that is, from the user's point of view, from right to left), while the second operation direction is: from the X-axis negative direction to the X-axis positive direction (that is, from the user's point of view, from left to right), representing the overall reduction of the planar operation object;
  • the first operational direction is: from the positive Y-axis to the negative Y-axis (ie from top to bottom from the user's point of view), while the second operational direction is: from the negative Y-axis to the positive Y-axis Toward (ie, from the user's point of view, from bottom to top), representing the overall reduction of the planar operation object;
  • the first operational direction is: a second direction along the diagonal of the X-axis and the Y-axis (from the user's point of view, from the upper right to the lower left), and the second operational direction is: along the X-axis and The first direction of the diagonal of the Y-axis (from the user's point of view, from the lower left to the upper right), represents the overall reduction operation of the planar operation object;
• the reduction operation of the planar operation object is not limited to the above-described combinations of the first operation direction and the second operation direction; all operations in directions opposite to the overall enlargement operation can be defined as the overall reduction operation.
• a rotation operation of the planar operation object in the plane may also be defined; the combination relationship between the first operation direction and the second operation direction corresponding to the rotation operation may be any combination different from the operation directions defined above, and details are not described here again.
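As a rough illustration of how the direction combinations above could be classified, the sketch below inspects only the signed X-axis components of the two operation directions; the function name, the return labels, and the sign convention are assumptions for the example, not part of the disclosure:

```python
def classify_plane_gesture(d1, d2):
    """d1, d2: signed X-axis components of the first and second operation
    directions (positive = left-to-right from the user's point of view)."""
    if d1 > 0 and d2 < 0:
        return "enlarge"     # first: left->right, second: right->left
    if d1 < 0 and d2 > 0:
        return "reduce"      # the exact opposite combination
    if d1 * d2 > 0:
        return "translate"   # same direction: pan and suppress click triggers
    return "none"

print(classify_plane_gesture(+1, -1))  # enlarge
```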
• local enlargement can also be implemented on the planar operation object: an operation in the second operation direction is performed only on the planar operation object, that is, the portion of the planar operation object to be partially enlarged is positioned by the second operation, and movement in the second operation direction is performed, to achieve partial enlargement of the planar operation object.
• the specific implementation process of the local enlargement of the planar operation object is as follows: generating a partial enlargement control point; correspondingly generating a partial enlargement area according to the partial enlargement control point; controlling the position of the partial enlargement control point by the second operation direction; and displaying, enlarged in the partial enlargement area, the display content at the control point.
• in this way, mistaken user operations can be prevented, and the problem of false triggering can be avoided.
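A minimal sketch of the partial-enlargement steps above (control point, derived region, magnified display rectangle); the fixed region size and magnification factor are illustrative assumptions:

```python
def local_magnify_region(control_point, region=40, factor=2.0):
    """Return the source rectangle around the control point and the
    enlarged rectangle in which its content would be displayed."""
    cx, cy = control_point
    half = region // 2
    src = (cx - half, cy - half, cx + half, cy + half)
    big = int(region * factor)
    half_big = big // 2
    dst = (cx - half_big, cy - half_big, cx + half_big, cy + half_big)
    return src, dst

# The control point follows the second operation direction; here it is fixed.
src, dst = local_magnify_region((100, 100))
print(src, dst)  # (80, 80, 120, 120) (60, 60, 140, 140)
```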
• the first operation direction and the second operation direction are both along the X axis in the positive direction, which represents a program-opening operation; of course, the combination relationship between the first operation direction and the second operation direction is not limited thereto, and may be any other two operation directions that are the same.
• opposite to the opening operation of the window defined in 1) above, the first operation direction and the second operation direction are both along the X axis in the negative direction, which represents a window-closing operation; of course, the combination of the first operation direction and the second operation direction is not limited thereto: any operation whose directions are the reverse of those representing the opening operation applies.
• the first operation direction and the second operation direction are both the first movement direction along the diagonal of the X axis and the Y axis (that is, from the user's point of view, both moving obliquely from the lower left to the upper right), representing the window maximization operation.
  • the combination relationship between the first operation direction and the second operation direction is not limited thereto, and may be any other two operation directions having the same direction;
• determining that the operation in the direction opposite to the maximization operation of the window operation object is: a minimization operation of the window operation object;
• the first operation direction and the second operation direction are both the second movement direction along the diagonal of the X axis and the Y axis (that is, from the user's point of view, both moving obliquely from the upper right to the lower left), representing the window minimization operation.
  • the method further includes:
• the implementation process is as shown in FIG. 10: a positioning point 102 of the operation object 101 is selected, and a same-direction movement 103 away from the positioning point is performed, so that the operation object is enlarged relative to the positioning point; an operation 104 opposite to the direction of the enlarging operation is performed, and the operation object is zoomed out relative to the positioning point.
  • the operation object here may be a stereo operation object, a plane operation object, or another operation object such as a window.
  • the stereoscopic operation object may be a stereoscopic object, a street view map, a stereoscopic menu, or the like;
  • the planar operation object may be a planar image in the screen, etc.;
  • the window operation object may be a window corresponding to various applications;
• the above method of the present invention can also automatically determine the operation type of the operation object according to the current scene; for example, if the current window displays a 3D solid object, the meaning of the touch motion is defined according to operation type 1 of the above-mentioned stereoscopic operation object;
• if the current window displays a planar image (such as a picture), the meaning of the touch motion is defined according to operation type 2;
• if the current window displays other content or the Program Manager, the meaning of the touch motion is defined according to operation type 3.
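The scene-based selection of the operation type could be dispatched as in this hedged sketch; the content labels and the mapping function are illustrative assumptions, since the patent only states that the type follows the current window content:

```python
def operation_type_for(content):
    """Pick the operation type from the current window's content."""
    if content == "3d_object":
        return 1   # stereoscopic operation object: rotations per Tables 1-3
    if content in ("picture", "webpage"):
        return 2   # planar operation object: pan / zoom / local magnify
    return 3       # other content or Program Manager: window operations

print(operation_type_for("picture"))  # 2
```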
• an embodiment of the present invention further provides a terminal device 40, including a housing 41 and a first operating unit 42 disposed on the housing 41, and further comprising: a second operating unit 44 disposed on the housing 41;
• the first operation unit 42 is configured to acquire a first operation direction of an operation object 43;
• the second operation unit 44 is configured to acquire a second operation direction of the operation object 43;
• a processing unit (not shown), configured to determine an operation corresponding to the direction combination relationship between the first operation direction and the second operation direction; in a specific implementation, the processing unit may be implemented by the processor of the terminal device;
• an operation execution unit (not shown), configured to perform the operation on the operation object and output an execution result of the operation; in a specific implementation, the operation execution unit may be implemented by the terminal device.
• the first operating unit and the second operating unit are two physically separated operating units; the first operating unit may be disposed at a first position of the housing, and the second operating unit at a second position of the housing opposite the first position; this facilitates user operation and conforms to the user's normal operating habits.
• for example, the first operation unit is a touch screen on the front side (the side facing the user) of the terminal device, and the second operation unit is a touch panel disposed on the back side of the terminal device; of course, the touch panel can also be disposed on the side of the terminal device, as long as it is physically separated from the first operating unit and positioned differently.
  • the terminal device may further include:
• a display module (such as the display screen of the terminal device) is used to display the result of the operation performed on the operation object; if the display screen is a touch screen, the display screen can also serve as the first operation unit in addition to being a display unit.
  • the specific implementation of the processing unit may include:
  • a first processing subunit configured to determine an operation type of the operation object according to an attribute feature of the operation object
  • a second processing sub-unit configured to determine an operation corresponding to a direction combination relationship between the first operation direction and the second operation direction according to an operation type of the operation object.
  • the second processing sub-unit is specifically configured to:
• determining that the operation is: a rotation in the first operation direction or a rotation in the second operation direction about an axis that is the perpendicular of the trajectory line formed by the first operation direction and the second operation direction; or
• determining that the operation is: the stereoscopic operation object performing a rotation in the first operation direction about the perpendicular of the trajectory formed by the first operation direction, while simultaneously rotating in the second operation direction about the perpendicular of the trajectory formed by the second operation direction.
• for example, the stereoscopic operation object rotates centered on the Y axis, with the side facing the user rotating toward the positive X-axis direction; the effect diagram of the three-dimensional operation object rotation is shown in FIG. 6.
  • the drawings merely show the first operational direction and the second operational direction. Depending on the position of the first operational unit and the second operational unit, the first operational direction and the second operational direction may or may not coincide.
• the stereoscopic operation object rotates, in the first operation direction or the second operation direction, about an axis that is the perpendicular of the trajectory formed by that direction; for example, in the coordinate system in which the operation object is located, it rotates centered on the X axis, with the side facing the user rotating toward the positive Y-axis direction;
• the effect diagram of the rotation of the operation object is shown in FIG. 8.
  • FIG. 5 and FIG. 6, or FIG. 7 and FIG. 8 represent only one of the operations, and other operations may be performed according to the definitions of Table 1, Table 2, and Table 3 above, and details are not described herein again.
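For the axis-of-rotation rule used above (rotation about the perpendicular of the trajectory line), a small sketch of the axis computation might look as follows; applying the rotation to the stereoscopic object is left to the rendering layer, and all names are illustrative assumptions:

```python
def rotation_axis_for(direction):
    """direction: (dx, dy) of the operation track in the screen plane.
    Returns the unit vector perpendicular to it, in the same plane."""
    dx, dy = direction
    length = (dx * dx + dy * dy) ** 0.5
    if length == 0:
        return (0.0, 0.0)
    # Rotate the track direction by 90 degrees to obtain the axis.
    return (-dy / length, dx / length)

axis = rotation_axis_for((1, 0))   # a horizontal swipe -> a vertical axis
print(axis)
```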
  • the second processing sub-unit is specifically configured to:
• determining that the operation in the direction opposite to the overall enlargement operation of the planar operation object is: an overall reduction operation of the planar operation object.
• when the first processing sub-unit determines that the operation type of the operation object is: an operation on a planar operation object,
  • the second processing sub-unit is further configured to:
• the terminal only needs a single touch on the touch panel to locate the part of the planar operation object to be partially enlarged, and then movement on the touch panel triggers a partial enlargement of the corresponding part; for example, for a small link text on a webpage, the text area is positioned through the touchpad, and moving on the touchpad enlarges the link text area, with a preset magnification.
• the partially enlarged text area or image may be restored to the original size after being enlarged: the normal webpage or image may be restored by double-clicking on the touch panel to cancel the enlargement; or it may be preset that the partially enlarged text area or image automatically returns to the original size after a few seconds on the screen, or returns to normal when the finger leaves the touchpad; this can effectively solve the problem that text on a handheld device is small and a finger can hardly click the link accurately.
• when the operation object is an operation object having priorities, the first operation unit and the second operation unit are moved in opposite directions, that is, through the first operation direction of the first operation unit and the second operation direction of the second operation unit, the operation object is opened step by step;
  • the specific implementation process is as follows: the operation object has at least one display object, and the display object includes high priority display information and low priority display information;
  • the priority of the displayed information may be three or more levels.
  • the foregoing implementation process may further include:
• the display font of the display information of each display object can be reduced so that the display information of the current-priority display object and of display objects lower than the current priority can be displayed; the display font of the display information can also be kept unchanged, and the operation object displayed page by page.
• each record in the address book is a display object, and each record includes multiple display contents with different priorities: for example, the name identification information is high-priority display information; the phone number information and the like included under the name identification information are medium-priority display information; and the other identification information included under the name identification information, such as fax information and communication address, is low-priority display information;
• initially, the display module only displays the high-priority display content, that is, only the name identification information is displayed; in the interface displaying the name display objects, the received first operation and second operation with opposite directions may be: the operation acquired by the first operation unit moves downward relative to the display screen while the operation acquired by the second operation unit moves upward relative to the display screen; or the operation acquired by the first operation unit moves upward relative to the display screen while the operation acquired by the second operation unit moves downward relative to the display screen;
  • the display module displays the high priority display content and the medium priority display content, that is, displays the identification information.
  • the display module displays high priority display content, medium priority display content, and low priority display content, that is, displays all content.
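The progressive disclosure of the address-book example above could be modeled as below; the priority levels, field names, and values are invented for illustration, and each opposite-direction gesture is assumed to lower the visibility threshold by one level:

```python
HIGH, MEDIUM, LOW = 1, 2, 3

# One address-book record: field -> (priority, value). All values invented.
record = {
    "name": (HIGH, "Alice"),
    "phone": (MEDIUM, "555-0100"),
    "fax": (LOW, "555-0101"),
}

def visible_fields(record, current_level):
    """Show fields whose priority is at or above the current level."""
    return {k: v for k, (p, v) in record.items() if p <= current_level}

print(visible_fields(record, HIGH))    # name only
print(visible_fields(record, MEDIUM))  # name + phone
```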
  • an embodiment of the present invention further provides a terminal device, where the terminal device includes: a storage unit, configured to store an operation object including at least one display object;
  • a processing unit configured to determine a priority of the display object; receive an instruction for the first operation direction and the second operation direction of the display object; and when the first operation direction and the second operation direction are opposite, generate the display Displaying display information of the current priority of the object and an instruction of displaying information lower than the current priority;
  • a display unit configured to display display information of the display object according to the instruction.
• the processing unit receives instructions of the first operation direction and the second operation direction; when the first operation direction and the second operation direction are opposite, the display information of the current priority of the display object and the display information lower than the current priority are displayed.
• the display font of the display information of each display object may be reduced so that the display information of the current-priority display object and of display objects lower than the current priority is displayed; the display font of the display information may also be maintained, and the operation object displayed page by page.
  • the second processing sub-unit is specifically configured to:
  • Determining an operation in a direction opposite to an opening operation of the window operation object is: a closing operation of the window operation object; or
• determining that the operation in the direction opposite to the maximize operation of the window operation object is: a minimization operation of the window operation object.
  • the above terminal device of the present invention may further have the following features:
  • the operation is the first operation, that is, the first operation is performed on the operation object;
  • the operation is a second operation, i.e., a second operation is performed on the operational object.
  • the method may further include:
• a positioning point 102 of the operation object 101 is selected by the first operating unit, and the first operating unit and the second operating unit simultaneously move in the same direction away from the positioning point, performing an operation on the operation object relative to the positioning point.
  • the operation object here may be a stereo operation object, a plane operation object, or other operation object such as a window.
  • the terminal device embodiment of the present invention may also have other applications:
• one positioning point in the planar image may be selected first, and then an overall magnification relative to the positioning point is performed; for example, a positioning point in the planar image is selected by the first operating unit, and then, by moving the first operation unit and the second operation unit in opposite directions, the overall enlargement of the planar image relative to the positioning point can be achieved; the operation opposite to the enlargement direction is an overall reduction.
• the terminal device of the present invention may specifically be a handheld device, and can be operated with one hand.
• the above solution of the present invention provides a new operation mode by adding a second operation unit to the terminal device; the user can simultaneously translate and rotate the operation object, interactively control the screen content more conveniently, and enjoy richer interaction modes; it can be widely used for display and interactive control of street view maps, 3D stereoscopic images, etc. on handheld devices.
• the pointing object can be captured with the existing image collection unit as described below: the image of the object on the surface of the transparent window is analyzed to determine the location of the pointing object.
• a transparent window is disposed on the image collection channel of the image collection unit of the terminal device; the first surface of the transparent window, away from the image collection unit, is spaced apart from the image collection unit by a distance so that a space is formed; the image collection unit collects an image when the pointing object contacts the first surface of the transparent window and sends the image to the processing unit, which calculates a trajectory of the pointing object according to the image and generates a corresponding input command according to the trajectory. Since the pointing object does not slide on the surface of the display, but operates on a transparent window separate from the display, it does not affect the user watching the displayed content.
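As a hedged illustration of the processing unit's trajectory-to-command step, the sketch below reduces a sequence of finger positions (one per captured image) to a coarse gesture; the command names and the dominant-axis rule are assumptions, not the disclosed algorithm:

```python
def trajectory_to_command(positions):
    """positions: list of (x, y) finger positions from successive images."""
    if len(positions) < 2:
        return "tap"
    dx = positions[-1][0] - positions[0][0]
    dy = positions[-1][1] - positions[0][1]
    # Dominant axis of the overall displacement decides the command.
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"

print(trajectory_to_command([(0, 0), (5, 1), (12, 2)]))  # swipe_right
```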
  • the terminal device of the embodiment of the invention includes a main board, a processing unit and an image collection unit disposed in the housing and connected to the main board, wherein:
• a transparent window is disposed on the image collection channel of the image collection unit, and the first surface of the transparent window, away from the image collection unit, is spaced apart from the image collection unit by a distance so that a space is formed;
  • the terminal device further includes:
  • the image collection unit is configured to collect an image when the pointing object contacts the first surface of the transparent window
  • the processing unit is configured to calculate a trajectory of the pointing object according to the image, and generate a corresponding input instruction according to the trajectory.
• the first surface of the transparent window, away from the image collection unit, is spaced apart from the image collection unit by a distance, and a space is formed; the space may include the portion of the image collection channel corresponding to the transparent window, and may also include that portion together with the region between the lower surface of the transparent window and the image collection unit.
• since the image collection unit needs to collect the image when the pointing object contacts the first surface of the transparent window, the space between the first surface of the transparent window and the image collection unit must be illuminated by light; generally speaking, this can be achieved as follows.
• the transparent window is made larger. As shown in FIG. 11, since the surface area of the transparent window is large, the pointing object (the user's finger in FIG. 11) can only contact a part of the transparent window, and light can still pass through the other part of the transparent window not touched by the pointing object, enter the space between the first surface of the transparent window and the image collection unit, and be captured by the image collection unit; thus, analyzing the image captured by the image collection unit can obtain the position of the pointing object.
• the terminal device is further provided with at least one illuminating device, connected to the main board, for emitting light into the space; when the pointing object contacts the first surface of the transparent window, it reflects the light emitted by the illuminating device, and the image collection unit captures the reflected-light image, so analyzing the image captured by the image collection unit can obtain the position of the pointing object.
  • the image collection unit has two working modes: a shooting mode and a positioning mode, and the processing unit needs to activate the lighting device when the image collecting unit operates in the positioning mode.
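In the positioning mode the finger shows up as a bright reflection in the captured frame, so one plausible (illustrative, not disclosed) way to locate it is the centroid of above-threshold pixels; the frame format and threshold are assumptions:

```python
def finger_position(frame, threshold=200):
    """frame: 2D list of grayscale values. Returns the (row, col) centroid
    of bright pixels, or None if no reflection is detected."""
    bright = [(r, c) for r, row in enumerate(frame)
              for c, v in enumerate(row) if v >= threshold]
    if not bright:
        return None
    n = len(bright)
    return (sum(r for r, _ in bright) / n, sum(c for _, c in bright) / n)

frame = [[0, 0, 0],
         [0, 255, 250],
         [0, 0, 0]]
print(finger_position(frame))  # (1.0, 1.5)
```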
  • the transparent window may be a transparent window that is separately provided, or may be a transparent protective layer that the image collection unit (such as a camera) itself has.
• the illuminating device may be fixedly disposed or adjustable; both cases are described below.
  • the light emitting device 21 is fixedly disposed, and the light emitting direction is toward the transparent window.
• the light emitted from the illuminating device 21 is irradiated onto the finger touching the outer surface of the transparent window 23 and reflected into the image collection unit 22, and the image collection unit 22 forms an image in which the position of the finger is recorded.
• the illuminating device 21 is fixedly disposed, and the terminal device further includes an optical device 24 disposed in the light-emitting direction of the illuminating device 21 for guiding the light emitted by the illuminating device 21 to the space.
• the light emitted by the illuminating device 21 is reflected multiple times inside the optical device 24, exits the optical device 24, is irradiated onto the finger touching the outer surface of the transparent window 23, and is reflected into the image collection unit 22; the image collection unit 22 forms an image which records the position of the finger.
• here the illuminating device 21 is disposed under the optical device 24, but the relative position between the two may also take other forms.
• the light-emitting device 21 is fixedly disposed, and the terminal device further includes an optical device 24 disposed in the light-emitting direction of the light-emitting device 21 (here on the left side of the light-emitting device 21) for guiding the light emitted by the light-emitting device 21 to the space.
• the light emitted by the illuminating device 21 is reflected multiple times inside the optical device 24, exits the optical device 24, is irradiated onto the finger touching the outer surface of the transparent window 23, and is reflected into the image collection unit 22; the image collection unit 22 forms an image which records the position of the finger.
• in the above arrangements the illuminating devices are located outside the transparent window, on the same side as the image collection unit, but the illuminating device may also be disposed inside the transparent window, as shown in FIG. 15.
  • the light-emitting device 21 is fixedly disposed in the transparent window 23 at one end of the transparent window 23, and the light-emitting direction is toward the other end of the transparent window 23.
• the light beam emitted from the illuminating device 21 is reflected inside the transparent window 23, that is, on the inner surface of the transparent window 23; since the medium outside the surface of the transparent protective layer is air, when the angle of the incident light satisfies certain conditions, the light is totally reflected at the surface of the transparent protective layer.
• when a substance having a relatively high refractive index (e.g., a finger) contacts the surface, the condition of total reflection at the surface of the transparent window 23 is broken, and part of the light beam is transmitted through the surface and projected onto the surface of the finger.
• the uneven finger surface causes scattering of the light beam (diffuse reflection), and the scattered light passes through the transparent window 23 to reach the image collection unit 22, which forms an image recording the position of the finger.
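The total-reflection condition mentioned above follows from Snell's law; assuming the window behaves like glass (refractive index roughly 1.5, an illustrative value not stated in the disclosure) bounded by air, light striking the inner surface is totally reflected when the angle of incidence exceeds the critical angle:

```latex
\sin\theta_c = \frac{n_{\text{air}}}{n_{\text{window}}}, \qquad
\theta_c \approx \arcsin\!\left(\frac{1}{1.5}\right) \approx 41.8^\circ
```

A touching finger, having a refractive index closer to that of the window than air does, raises the ratio on the right-hand side and breaks this condition locally, which is why part of the beam escapes at the contact point and is scattered back toward the image collection unit.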
• the light-emitting device may be an annular light-emitting body set around the image collection unit.
  • the pointing object can obtain the same effect imaging at any position on the upper surface of the transparent window to meet the needs of subsequent image analysis.
• in the above, the illuminating device is fixedly disposed; however, the illuminating device may also serve the imaging module both in the normal photography mode and in the positioning mode.
• in the first case, the light emitted by the illuminating device should be transmitted to the outer surface through the transparent window as much as possible; in the second case, the light emitted by the illuminating device should be projected into the transparent window as much as possible. Therefore, to meet these different needs, the light-emitting direction of the light-emitting device needs to be adjustable so that the emitted light is irradiated to the predetermined space.
• existing terminal devices such as mobile phones and PDAs are generally configured with a camera and a flash; the camera and flash the terminal device already has can be directly multiplexed to serve as the image collection unit and the light-emitting device. Thus the utilization of the function modules on the existing device is maximized, the hardware cost is not increased, and the application range of the embodiment of the present invention is broadened.
• FIG. 16 is a schematic diagram of the apparatus applied in the conventional camera mode, including a lighting device 25 provided with an angle adjustment module 251 for adjusting the lighting device 25 so that the lighting device 25 transmits light to the space.
• the angle adjustment module 251 is controlled to adjust the illumination device 25 so that the illumination device 25 emits light at a first angle, at which as much light as possible can be transmitted to the outside.
• FIG. 17 is a schematic diagram of the apparatus applied in the positioning mode, including a lighting device 25 provided with an angle adjustment module 251.
• in the positioning mode, the angle adjustment module 251 is controlled to adjust the lighting device 25 so that the lighting device 25 emits light at a second angle; at this second angle, as much of the light as possible is projected into the predetermined space.
• it should also be considered that in the normal camera mode the required illumination intensity is relatively large in order to satisfy the photographing needs, while in the positioning mode only a relatively small area needs to be illuminated and the required intensity is relatively small; therefore, the intensity of the lighting device is preferably adjustable according to these different conditions, so as to fully satisfy the needs.
• the above describes the case where the light-emitting direction of the lighting device itself can be adjusted; it should be understood that the lighting device may instead be fixedly set while the path of the emitted light is adjusted, and the case where the light path is adjustable is described below.
• specifically, the illuminating device includes: a light-emitting body and an optical device; when the image collection unit is set to work in the positioning mode, the optical device adjusts the path of the light emitted by the light-emitting body so that the light-emitting body transmits light into the space described above; otherwise, the path of the light emitted by the light-emitting body is adjusted so that the light-emitting body emits light to the space outside the transparent window.
  • Adjusting the illuminating path path of the illuminating section which may be implemented by adjusting the illuminating optics device by over-adjusting, as in the image-like collection of the image
  • the optical optical device device is located on a path of the light ray path of the illuminating body body, And the light ray line emitted by the illuminating body is emitted between the empty space and the illuminating body, and is in the image
  • the setting of the illuminating device is only set to Figure 1166 and And Figure 1177 is carried out to explain, but, but should be understood properly, the illuminating device is equipped with an adjustable setting method, and its
  • the emitted light ray line may also be passed through the optical optics device to facilitate the projection of the light ray line to a better pre-scheduled projection.
  • the bit position of 2255 is set, and it can also be set to be placed in its other position, and it is not mentioned here---detailed in detail. .
  • the specific position position of the image unit is not limited to the specific position of the image unit. It may be located at each position of the terminal device, such as the surface surface of the upper surface, the back surface, the side surface, or even the corner angle. It may be possible, as long as the user's finger can be used to reach the position that can be reached. .
  • the image of the image is collected into the camera mode, and the camera is taken.
  • Genus The original function of the image collection unit.
  • the illuminating device After the user initiates the positioning process, when the illuminating device is present, the illuminating device is activated, and the light emitted by the illuminating device is irradiated onto the finger touched on the outer surface of the transparent window, and is reflected into the image concentrating unit, and the recording finger is formed by the image concentrating unit.
  • the image of the head position is subjected to a positioning process by the processing unit, that is, a trajectory of the pointing object is calculated according to the image, and a corresponding input command is generated according to the trajectory.
  • the angle should be adjusted so that as much light as possible can be radiated to the predetermined space.
  • the above-mentioned image capturing unit of the pointing object on the transparent window surface and then analyzing to determine the position of the pointing object can be used for the first operating unit of the foregoing terminal device or The second operating unit or both the first operating unit and the second operating unit.
  • the above configuration may be separately applied to any terminal device using the touch screen, or a method of controlling the operation of the operation object simultaneously with the two operations.
  • the terminal device is used in combination, and the embodiments of the present invention are not intended to limit this.

Description

操作对象的操作控制的方法及终端设备 技术领域
本发明涉及手持终端设备,特别是指一种操作对象的操作控制的方法及 终端设备 背景技术
越来越多的手持设备追求轻薄和大屏幕,触摸技术在手持设备上的使用 也越来越广泛。 随着手持设备计算能力的加强, 显示 3D图形已经很常见, 比如街景地图、 立体菜单的显示。
对于这些 3D图形的显示, 平移和旋转是常用的两种操作方式, 可以通 过触摸来进行。 但是在平面的触摸设备上, 在同一时间内通常只支持一种操 作方式, 即只能进行平移操作或者只能进行旋转操作。 如果希望作另一类型 的操作, 必须进行切换, 否则操作会产生歧义。
如图 1所示, 为现有的手持设备中, 利用其触摸屏进行平移操作的效果 图。
如图 2所示, 为现有的手持设备中, 利用其触摸屏进行旋转操作的效果 图。
但平移操作和旋转操作无法同时进行,如果要想达到既平移又旋转的效 果, 必须进行平移操作后, 操作对象平移到预定位置后, 再切换到旋转操作 模式下, 进行旋转操作, 所述操作对象再在所述操作对象平移到的位置进行 旋转。
这样的操作方式非常不方便, 无法满足用户的需求。
并且, 在全触摸控制的触摸屏中, 对于现有的终端设备而言, 由于触摸 屏的面积非常有限, 导致用户进行触摸控制时会遮挡部分的触摸屏, 影响了 用户的观看。 发明内容
本发明要解决的技术问题是提供一种实现两种操作同时进行的操作对 象的操作控制的方法及终端设备, 能够实现两种操作同时被执行的操作效 果。 本发明要解决的技术问题还在于提供一种终端设备及输入方法,在实现 触摸控制时不会影响用户屏幕的显示效果。
为解决上述技术问题,本发明的实施例提供一种操作对象的操作控制的 方法, 包括: 获取一操作对象的第一操作方向和第二操作方向; 确定所述第 一操作方向与所述第二操作方向的方向组合关系对应的操作; 对所述操作对 象执行所述操作。
优选的,确定所述第一操作方向与所述第二操作方向的方向组合关系对 应的操作的步骤具体为: 根据所述操作对象的操作类型, 确定所述第一操作 方向与所述第二操作方向的方向组合关系对应的操作。
优选的, 所述操作对象的操作类型为: 对立体操作对象的操作时, 根据 所述操作对象的操作类型,确定所述第一操作方向与所述第二操作方向的方 向组合关系对应的操作的步骤具体为: 在所述第一操作方向和所述第二操作 方向相同时, 则确定所述操作为: 所述立体操作对象在所述第一操作方向或 者第二操作方向上的平移操作; 或者在所述第一操作方向和所述第二操作方 向相反时, 则以所述第一操作方向和所述第二操作方向形成的轨迹线的垂线 为轴进行旋转操作。
优选的, 所述操作对象的操作类型为: 对平面操作对象的操作时, 根据 所述操作对象的操作类型,确定所述第一操作方向与所述第二操作方向的方 向组合关系对应的操作的步骤具体为: 在所述第一操作方向和所述第二操作 方向相同时, 则确定所述操作为: 所述平面操作对象在所述第一操作方向或 者所述第二操作方向上的平移操作; 或者在所述第一操作方向和所述第二操 作方向相反时, 则确定所述操作为: 所述平面操作对象的整体放大操作。
优选的, 所述操作对象的操作类型为: 对窗口操作对象的操作时, 根据 所述操作对象的操作类型,确定所述第一操作方向与所述第二操作方向的方 向组合关系对应的操作的步骤具体为: 在所述第一操作方向和所述第二操作 方向相同的第一方向时,则确定所述操作为:所述窗口操作对象的打开操作; 或者确定与所述窗口操作对象的打开操作相反的方向的操作为: 所述窗口操 作对象的关闭操作; 或者在所述第一操作方向和所述第二操作方向相同的第 二方向时, 则确定所述操作为: 所述窗口操作对象的最大化操作; 或者确定 与所述窗口操作对象的最大化操作相反方向的操作为: 所述窗口操作对象的 缩小操作; 其中, 所述第一方向和所述第二方向不同。 优选的, 在所述第一操作方向和所述第二操作方向相同时, 则确定所述 操作为第一操作; 在所述第一操作方向和所述第二操作方向相反时, 则确定 所述操作为第二操作。
优选的,上述方法还包括:选定所述操作对象上的一定位点的第三操作; 相对于所述定位点作远离所述定位点的第一操作, 进行相对于所述定位点的 放大。
本发明的实施例还提供一种终端设备, 包括壳体,设置在所述壳体上的 第一操作单元, 还包括: 设置在所述壳体上的第二操作单元; 所述第一操作 单元, 用于获取一操作对象的第一操作方向; 所述第二操作单元, 用于获取 所述操作对象的第二操作方向; 处理单元, 用于确定所述第一操作方向与所 述第二操作方向的方向组合关系对应的操作; 操作执行单元, 用于对所述操 作对象执行所述操作, 并将所述操作的执行结果输出显示。
优选的, 所述处理单元包括: 第一处理子单元, 用于根据所述操作对象 的属性特征, 确定所述操作对象的操作类型; 第二处理子单元, 用于根据所 述操作对象的操作类型,确定所述第一操作方向与所述第二操作方向的方向 组合关系对应的操作。
优选的, 所述第一处理子单元确定所述操作对象的操作类型为: 对立体 操作对象的操作时, 所述第二处理子单元具体用于: 在所述第一操作方向和 所述第二操作方向在同一直线上且相同时, 则确定所述操作为: 所述立体操 作对象在所述第一操作方向或者第二操作方向上的平移操作; 或者在所述第 一操作方向和所述第二操作方向相反时, 则以所述第一操作方向和所述第二 操作方向形成的轨迹线的垂线为轴进行旋转操作。
优选的, 所述第一处理子单元确定所述操作对象的操作类型为: 对平面 操作对象的操作时, 所述第二处理子单元具体用于: 在所述第一操作方向和 所述第二操作方向相同时, 则确定所述操作为: 所述平面操作对象在所述第 一操作方向或者所述第二操作方向上的平移操作; 或者在所述第一操作方向 和所述第二操作方向相反时, 则确定所述操作为: 所述平面操作对象的整体 放大操作。
优选的, 所述第一处理子单元确定所述操作对象的操作类型为: 对窗口 操作对象的操作时, 所述第二处理子单元具体用于: 在所述第一操作方向和 所述第二操作方向相同的第一方向时, 则确定所述操作为: 所述窗口操作对 象的打开操作; 或者确定与所述窗口操作对象的打开操作相反的方向的操作 为: 所述窗口操作对象的关闭操作; 或者在所述第一操作方向和所述第二操 作方向相同的第二方向时, 则确定所述操作为: 所述窗口操作对象的最大化 操作; 或者确定与所述窗口操作对象的最大化操作相反方向的操作为: 所述 窗口操作对象的缩小操作。
优选的, 所述第一操作单元设置在所述壳体的第一位置; 所述第二操作 单元设置在所述壳体的与所述第一位置相对的第二位置。
优选的, 所述第一操作方向与所述第二操作方向相同时, 则所述操作为第一操作; 所述第一操作方向与所述第二操作方向相反时, 则所述操作为第二操作。
优选的, 所述第一操作单元包括: 一图像釆集单元; 以及在所述图像釆 集单元的图像釆集通道上设置的一透明窗, 所述透明窗远离所述图像釆集单 元的第一表面与所述图像釆集单元间隔一定距离, 形成有一空间; 其中, 所 述图像釆集单元用于在指点物接触于所述透明窗的所述第一表面时釆集图 像, 且所述处理单元用于根据所述图像计算所述指点物的轨迹, 并根据所述 轨迹获取所述操作对象的第一操作方向。
优选的, 所述第一操作单元还包括: 一用于将光线照射到所述空间的至 少一个发光设备; 所述发光设备与所述图像釆集单元位于所述第一表面的同 侧。
优选的, 所述图像釆集单元具有拍摄模式或定位模式, 所述图像釆集单 元工作于所述定位模式时, 所述发光设备处于启动状态。
优选的, 所述发光设备固定设置, 发光方向朝向所述透明窗。
优选的, 所述发光设备固定设置, 所述第一操作单元还包括设置于所述 发光设备的发光方向上, 用于将所述发光设备发出的光线导向所述空间的光 学器件。
优选的, 所述发光设备固定设置, 所述发光设备为一环状发光体, 围绕 所述图像釆集单元设置。
优选的, 所述发光设备固定设置于所述透明窗内, 位于透明窗的一端, 且发光方向朝向所述透明窗的另一端。
优选的, 所述发光设备可调, 所述发光设备具体包括: 一发光单元; 角 度调节模块, 与所述发光单元连接, 用于在所述图像釆集单元工作于定位模 式时, 调节所述发光体, 使所述发光体向所述空间发送光线。
优选的, 所述发光设备可调, 所述发光设备具体包括: 一发光单元; 一 光学器件, 用于在所述图像釆集单元工作于定位模式时, 调节所述发光体的 光线路径, 使所述发光体向所述空间发送光线, 在所述图像釆集单元工作于 拍摄模式时, 调节所述发光体的光线路径, 使所述发光体向透明窗之外的空 间发送光线。
优选的, 在所述图像釆集单元工作于定位模式时, 所述光学器件位于所 述发光体的光线路径上, 所述发光体发出的光线经过所述发光体发射到所述 空间, 在所述图像釆集单元工作于拍摄模式时, 所述光学器件位于所述发光 体的光线路径外, 所述发光体发出的光线透过所述透明窗发射到外部。 优选的, 所述第二操作单元包括: 一图像釆集单元, 以及在所述图像釆 集单元的图像釆集通道上设置的一透明窗, 所述透明窗远离所述图像釆集单 元的第一表面与所述图像釆集单元间隔一定距离, 形成有一空间; 其中, 所 述图像釆集单元用于在指点物接触于所述透明窗的所述第一表面时釆集图 像, 并且所述处理单元用于根据所述图像计算所述指点物的轨迹, 并根据所 述轨迹获取所述操作对象的第二操作方向。
本发明的实施例还提供一种操作对象的操作控制的方法,所述操作对象 包括至少一个显示对象, 包括: 确定显示对象的优先级; 接收对所述显示对 象的第一操作方向和第二操作方向的指令; 在所述第一操作方向和第二操作 方向相反时,显示所述显示对象当前优先级的显示信息以及低于当前优先级 的显示信息。
本发明的实施例还提供一种终端设备, 包括: 存储单元, 用于存储包括 有至少一个显示对象的操作对象; 处理单元, 用于确定显示对象的优先级; 接收对所述显示对象的第一操作方向和第二操作方向的指令; 在所述第一操 作方向和第二操作方向相反时,产生显示所述显示对象的当前优先级的显示 信息以及低于当前优先级的显示信息的指令; 显示单元, 用于根据所述指令 显示所述显示对象的显示信息。
本发明的实施例还提供一种终端设备, 包括: 壳体; 设置于壳体内的主 板; 设置在所述壳体上与所述主板连接的操作单元; 和设置在所述壳体内与 所述主板连接的处理单元; 其特征在于, 所述操作单元包括: 一图像釆集单 元; 和一透明窗, 设置在所述图像釆集单元的图像釆集通道上, 所述透明窗 远离所述图像釆集单元的第一表面与所述图像釆集单元间隔一定距离, 形成 有一空间; 其中, 所述图像釆集单元用于在指点物接触于所述透明窗的所述 第一表面时釆集图像; 并且所述处理单元用于根据所述图像计算所述指点物 的轨迹, 并根据所述轨迹生成对应的输入指令。
本发明的实施例还提供一种输入方法, 应用于上述终端设备, 其特征在 于, 包括: 在指点物接触于所述透明窗的第一表面时釆集图像; 根据所述图 像计算所述指点物的轨迹, 并根据所述轨迹生成对应的输入指令。
本发明的上述技术方案的有益效果如下:
上述方案中, 同时获取一操作对象的第一操作方向和第二操作方向, 并 根据该第一操作方向和第二操作方向的方向组合关系, 确定相应的操作, 实 现了两种操作同时进行, 并能够得到两种操作同时被执行的操作效果。
上述方案中, 利用已有的图像釆集单元来拍摄指点物在透明窗表面的图 像, 进而进行分析, 确定指点物位置, 其不用增加额外的设备, 实现成本小, 且对于小型便携式设备非常重要。
上述方案中, 由于指点物并不是在触摸屏的表面进行滑动, 而是在透明 窗上进行操作, 不会影响用户观看显示的内容。 附图说明
图 1为现有手持设备中, 利用其触摸屏进行平移操作的效果图; 图 2为现有手持设备中, 利用其触摸屏进行旋转操作的效果图; 图 3为本发明的操作对象的操作控制的方法流程图;
图 4为本发明的终端设备的结构图;
图 5为图 4所示终端设备对一操作对象进行一种操作的操作方向示意 图;
图 6为图 5所示操作对象的操作效果图;
图 7为图 4所示终端设备对一操作对象进行的另一种操作的操作方向示 意图;
图 8为图 7所示操作对象的操作效果图;
图 9为图 4所示终端设备对通讯录操作对象的操作方向及打开示意图; 图 10为图 4所示终端设备对操作对象的定点放大的操作示意图。 图 11为本发明实施例中, 透明窗与图像釆集单元的相对位置关系的示 意图;
图 12-图 15为本发明实施例中, 设置有固定设置的发光设备时, 发光设 备、 图像釆集单元、 透明窗三者之间的可能相对位置关系的示意图;
图 16为本发明实施例中, 设置有可调节的发光设备时, 在通常成像模 式下,发光设备、 图像釆集单元、透明窗三者之间的相对位置关系的示意图; 图 17为本发明实施例中,设置有可调节的发光设备时, 在定位模式下, 发光设备、 图像釆集单元、 透明窗三者之间的相对位置关系的示意图。 具体实施方式
为使本发明要解决的技术问题、技术方案和优点更加清楚, 下面将结合 附图及具体实施例进行详细描述。
如图 3所示, 本发明的操作对象的操作控制的方法, 包括:
步骤 31 , 获取一操作对象的第一操作方向和第二操作方向;
步骤 32 , 确定所述第一操作方向与所述第二操作方向的方向组合关系 对应的操作;
步骤 33 , 对所述操作对象执行所述操作。
该方法通过同时获取一操作对象的第一操作方向和第二操作方向,并根 据该第一操作方向和第二操作方向的方向组合关系, 确定相应的操作, 实现 了两种操作同时进行。
其中, 上述步骤 32在具体实现时, 根据所述操作对象的操作类型, 确 定所述第一操作方向与所述第二操作方向的方向组合关系对应的操作。
其中, 操作对象的操作类型可以包括: 对立体操作对象的操作、 对平面 操作对象的操作或者对窗口操作对象的操作, 但并不限这些, 还可以根据具 体的需求, 对其它的操作对象进行各个方向的操作的定义, 但实现方式与这 几种操作对象的操作类型相似。
操作类型 1 :
当所述操作对象的操作类型为: 对立体操作对象的操作时,根据所述操 作对象的操作类型,确定所述第一操作方向与所述第二操作方向的方向组合 关系对应的操作的步骤具体为以下几种情况:
1 )在所述第一操作方向和所述第二操作方向相同时(即第一操作方向 和第二操作方向形成的轨迹线平行并且方向相同, 或者该轨迹线被系统认为 是平行的, 也就是说, 第一操作方向和第二操作方向的轨迹可以具有一定的 角度, 但系统处理后是平行的), 则确定所述操作为: 所述立体操作对象在 所述第一操作方向或者第二操作方向上的平移操作;
如在所述立体操作对象所在的坐标系中, 第一操作方向和第二操作方向均沿 X轴方向或者 Y轴方向且方向相同时, 则所述操作是: 在所述第一操作方向或者第二操作方向上的平移操作; 举例说明:
表 1
当然该表 1中仅列举出了在 X轴方向或者 Y轴方向的移动操作,如果第 一操作方向和第二操作方向不是严格意义上的 X轴或者 Y轴方向的移动操 作, 只要该第一操作方向和第二操作方向在同一方向相同即可, 如沿 X轴和 Y轴的角平分线的方向, 以及该角平分线与 X轴的角平分线, 或者该角平分 线与 Y轴的角平分线的方向, 依次类推, 所有方向相同的操作都是可以的。
2 )在所述第一操作方向和所述第二操作方向相反时, 则确定所述操作为: 以所述第一操作方向和所述第二操作方向形成的轨迹线的垂线为轴进行旋转操作(即第一操作方向和第二操作方向形成的轨迹线平行或近似平行, 并且方向相反, 其中近似平行是指第一操作方向和第二操作方向的轨迹可以具有一定的角度, 但系统认为是平行的); 比如, 以所述第一操作方向和所述第二操作方向形成的轨迹线的垂线为轴进行第一操作方向的旋转操作或者第二操作方向的旋转操作。 如在所述立体操作对象所在的坐标系中, 所述第一操作方向和所述第二操作方向均沿 X轴方向或者 Y轴方向且方向不同时, 则所述操作是: 所述立体操作对象以 X轴为中心在所述第一操作方向上的旋转操作或者以 Y轴为中心在所述第二操作方向上的旋转操作; 举例说明:
表 2
当然该表 2中仅列举出了在 X轴方向或者 Y轴方向的旋转操作, 如果第一操作方向和第二操作方向不是严格意义上的 X轴或者 Y轴方向的移动操作, 只要该第一操作方向和第二操作方向的方向相反即可, 如沿 X轴和 Y轴的角平分线的方向, 以及该角平分线与 X轴的角平分线, 或者该角平分线与 Y轴的角平分线的方向, 依次类推, 所有方向相反的操作都是可以的。
3 )在所述第一操作方向和所述第二操作方向不相同也不相反时 (即第 一操作方向和第二操作方向形成的轨迹线相交), 则确定所述操作是: 所述 立体操作对象以第一操作方向形成的轨迹线的垂线和第二操作方向形成的 轨迹线的垂线为轴进行旋转,如以第一操作方向形成的轨迹线的垂线为轴进 行第一操作方向的旋转, 同时以第二操作方向形成的轨迹线的垂线为轴进行 第二操作方向的旋转;
如在所述立体操作对象所在的坐标系中,所述第一操作方向为 X轴方向 且所述第二操作方向为 Y轴方向时, 则所述操作是: 以 X轴为中心在所述第 一操作方向上的旋转, 同时以 Y轴为中心在所述第二操作方向上的旋转; 举 例说明:
第一操作方向为 X轴正向、 第二操作方向为 Y轴正向时: 以 X轴为中心, 面向用户的一面向 Y轴正向旋转; 同时以 Y轴为中心, 面向用户的一面向 X轴正向旋转;
第一操作方向为 X轴负向、 第二操作方向为 Y轴正向时: 以 X轴为中心, 面向用户的一面向 Y轴正向旋转; 同时以 Y轴为中心, 面向用户的一面向 X轴负向旋转;
第一操作方向为 X轴正向、 第二操作方向为 Y轴负向时: 以 X轴为中心, 面向用户的一面向 Y轴负向旋转; 同时以 Y轴为中心, 面向用户的一面向 X轴正向旋转;
第一操作方向为 X轴负向、 第二操作方向为 Y轴负向时: 以 X轴为中心, 面向用户的一面向 Y轴负向旋转; 同时以 Y轴为中心, 面向用户的一面向 X轴负向旋转。
表 3
当然该表 3中的 X轴和 Y轴可以定义为: 在不同维度的任意两个方向分别在 X轴和 Y轴上的分量。
另外,如果第一操作方向形成的轨迹或者第二操作方向形成的轨迹为曲 线时, 则系统将这些曲线映射为直线进行处理。 具体的映射方式可以是: 用 该轨迹的起点和终点来确定一条直线, 或者将该曲线映射为多段连续的直 线, 依次处理。
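上述"同向对应平移、 反向对应绕轨迹垂线旋转、 相交对应双轴旋转"的方向组合判断, 以及把曲线轨迹按起点和终点映射为直线的处理, 可以用如下 Python 草图示意(仅为说明用的假设实现, 函数名、 返回的标签以及"系统认为是平行"的容差角均非本发明限定):

```python
import math

def classify_combination(d1, d2, tol_deg=20.0):
    """按两条轨迹方向向量的夹角, 把方向组合粗分为三类:
    same(同向 -> 平移), opposite(反向 -> 绕轨迹垂线旋转), crossing(相交 -> 双轴旋转)。
    tol_deg 对应"系统认为是平行"的容差角, 具体数值为假设。"""
    n1, n2 = math.hypot(*d1), math.hypot(*d2)
    cos_a = (d1[0] * d2[0] + d1[1] * d2[1]) / (n1 * n2)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    if angle <= tol_deg:
        return "same"
    if angle >= 180.0 - tol_deg:
        return "opposite"
    return "crossing"

def map_curve_to_line(points):
    """将曲线轨迹映射为直线处理: 用轨迹的起点和终点确定方向向量, 对应正文所述的映射方式。"""
    (x0, y0), (x1, y1) = points[0], points[-1]
    return (x1 - x0, y1 - y0)
```

例如两条都向右的轨迹被归为 "same"(平移), 一条向右一条向左被归为 "opposite"(旋转), 一条沿 X轴一条沿 Y轴被归为 "crossing"(双轴旋转)。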
操作类型 2:
当所述操作对象的操作类型为: 对平面操作对象的操作时,根据所述操 作对象的操作类型,确定所述第一操作方向与所述第二操作方向的方向组合 关系对应的操作的步骤具体为以下几种情况:
1 )在所述第一操作方向和所述第二操作方向相同时, 则确定所述操作 为: 所述平面操作对象在所述第一操作方向或者所述第二操作方向上的平移 操作;
该平面操作对象的平移操作的具体情况和上述表 1所述的操作相同,在 此不再赘述。
举例说明平面操作对象的平移:如平面操作对象为一网页或者一平面图 像, 通过上述第一操作方向的操作和第二操作方向的操作同时选定该网页, 同方向移动, 则使网页按照第一操作方向和第二操作方向进行移动, 以实现 手持设备小屏幕上显示网页时, 网页上的滚动条的作用。 系统检测到沿第一 操作方向的操作和第二操作方向的操作时, 沿第一操作方向和第二操作方向 移动网页, 并忽略沿第一操作方向的操作对应的点击触发操作, 从而可以解 决网页链接被误触发的问题。
2 )在所述第一操作方向和所述第二操作方向相反时, 则确定所述操作 为: 所述平面操作对象的整体放大操作;
如第一操作方向为: 从 X轴负向到 X轴正向(即从用户的角度来看是从 左到右), 同时, 第二操作方向为: 从 X轴正向到 X轴负向 (从用户的角度 来看是从右到左), 代表平面操作对象的整体放大; 或者
第一操作方向为: 从 Y轴负向到 Y轴正向(即从用户的角度来看是从下 到上), 同时, 第二操作方向为: 从 Y轴正向到 Y轴负向 (即从用户的角度 来看是从上到下), 代表平面操作对象的整体放大; 或者
第一操作方向为: 沿 X轴和 Y轴的对角线的第一方向(如从用户的角度 来看, 从左下到右上), 同时, 第二操作方向为: 沿 X轴和 Y轴的对角线的 第二方向 (如从用户的角度来看, 从右上到左下), 代表平面操作对象的整 体放大操作;
上述第一操作方向和第二操作方向的方向组合关系并不限于这些,只要 二者方向相反, 均可代表整体放大操作。
3 )确定与所述平面操作对象的整体放大操作方向相反的操作为: 所述 平面操作对象的整体缩小操作;
如在上述 2 ) 中定义的方向组合关系对应的操作的基础上, 同时定义: 第一操作方向为: 从 X轴正向到 X轴负向(从用户的角度来看是从右到 左), 同时, 第二操作方向为: 从 X轴负向到 X轴正向 (即从用户的角度来 看是从左到右), 代表平面操作对象的整体缩小;
同样, 第一操作方向为: 从 Y轴正向到 Y轴负向(即从用户的角度来看 是从上到下), 同时, 第二操作方向为: 从 Y轴负向到 Y轴正向 (即从用户 的角度来看是从下到上 ), 代表平面操作对象的整体缩小; 同样的, 第一操作方向为: 沿 X轴和 Y轴的对角线的第二方向(如从用 户的角度来看, 从右上到左下), 同时, 第二操作方向为: 沿 X轴和 Y轴的 对角线的第一方向 (如从用户的角度来看, 从左下到右上), 代表平面操作 对象的整体缩小操作;
当然,对平面操作对象的缩小操作, 也不限于上述第一操作方向和第二 操作方向的方向组合关系,对于所有可以代表整体放大操作的方向相反的操 作均可以定义为整体缩小操作。
当然也可以定义该平面操作对象在平面上的旋转操作,该旋转操作所对 应的第一操作方向和第二操作方向的组合关系, 只要与上述已经定义过的操 作的方向不同即可, 在此不再赘述。
另外, 除了对平面操作对象的整体放大操作外, 还可以对平面操作对象 实现局部放大: 如可以通过只对所述平面操作对象执行第二操作方向的操 作, 即通过第二操作定位平面操作对象上要局部放大的部分, 并执行第二操 作方向上的移动, 实现平面操作对象的局部放大。 具体来讲: 平面操作对象 局部放大的具体实现过程如下: 生成局部放大控制点; 依据所述局部放大控 制点, 对应生成局部放大区域; 通过第二操作方向控制所述局部放大控制点 的位置; 将所述控制点所在的显示内容放大, 并显示在局部放大区域中。
尤其当沿第二操作方向在非显示区域进行操作时, 可以不遮挡用户操 作, 还可以避免误触发的问题。
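正文给出的局部放大流程(生成局部放大控制点、 依据控制点生成局部放大区域、 按第二操作方向移动控制点、 将控制点处的显示内容放大显示)可以用如下 Python 草图示意(函数名、 参数名及默认的区域半径和放大倍数均为说明用的假设):

```python
def local_magnify_region(img_w, img_h, cx, cy, src_r=40, scale=2.0):
    """依据局部放大控制点 (cx, cy), 计算源区域和放大显示区域。
    返回 (源矩形, 放大矩形), 矩形形式为 (x, y, w, h), 均被裁剪到画面边界内。"""
    # 以控制点为中心取边长 2*src_r 的正方形作为源区域
    sx = max(0, min(img_w - 2 * src_r, cx - src_r))
    sy = max(0, min(img_h - 2 * src_r, cy - src_r))
    # 放大区域为源区域按预先设置的倍数 scale 放大后的矩形
    dst_w = int(2 * src_r * scale)
    dst_h = int(2 * src_r * scale)
    dx = max(0, min(img_w - dst_w, cx - dst_w // 2))
    dy = max(0, min(img_h - dst_h, cy - dst_h // 2))
    return (sx, sy, 2 * src_r, 2 * src_r), (dx, dy, dst_w, dst_h)
```

当控制点随第二操作方向移动时, 反复调用该函数即可得到跟随手指的放大区域; 放大倍数如正文所述可预先设置。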
操作类型 3:
当所述操作对象的操作类型为: 对窗口操作对象的操作时,根据所述操 作对象的操作类型,确定所述第一操作方向与所述第二操作方向的方向组合 关系对应的操作的步骤具体为以下几种情况:
1 )在所述第一操作方向和所述第二操作方向相同的第一方向时, 则确 定所述操作为: 所述窗口操作对象的打开操作;
如第一操作方向和第二操作方向同时沿 X轴负向到 X轴正向,代表打开 程序操作; 当然, 该第一操作方向和第二操作方向的组合关系并不限于此, 还可以是任何其它的方向相同的两个操作方向。
2 )确定与所述窗口操作对象的打开操作相反的方向的操作为: 所述窗 口操作对象的关闭操作;
如在上述 1 )定义的窗口的放大操作的基础上, 第一操作方向和第二操 作方向同时沿 X轴正向到 X轴负向, 代表关闭窗口操作; 当然, 该第一操作 方向和第二操作方向的组合关系并不限于此,对于所有可以代表放大操作的 方向相反的操作均可以定义为关闭操作。
3 )在所述第一操作方向和所述第二操作方向相同的第二方向时, 则确 定所述操作为: 所述窗口操作对象的最大化操作; 其中, 所述第一方向和所 述第二方向不同; 这样可以避免同一种操作对应不同的操作效果;
如第一操作方向和第二操作方向是同时沿 X轴和 Y轴的对角线的第一运 动方向 (即从用户角度来看, 同时做左下斜向右上运动), 代表进行窗口最 大化操作; 当然, 该第一操作方向和第二操作方向的组合关系并不限于此, 还可以是任何其它的方向相同的两个操作方向;
4 )确定与所述窗口操作对象的最大化操作相反方向的操作为: 所述窗 口操作对象的缩小操作;
如在上述 3 )所举实例的基础上, 第一操作方向和第二操作方向是同时 沿 X轴和 Y轴的对角线的第二运动方向(即从用户角度来看, 同时做右上斜 向左下运动), 代表窗口最小化操作。
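上述窗口操作对象的方向组合可以用一张查找表示意, 如下 Python 草图所示(方向标签与操作名均为说明用的假设; 打开/关闭对应沿 X轴的同向操作, 最大化/缩小对应沿对角线的同向操作, 与正文举例一致):

```python
WINDOW_OPS = {
    # (第一操作方向, 第二操作方向) -> 窗口操作
    ("right", "right"): "open",              # 同沿 X轴正向 -> 打开
    ("left", "left"): "close",               # 与打开相反 -> 关闭
    ("up_right", "up_right"): "maximize",    # 同沿对角线(左下->右上) -> 最大化
    ("down_left", "down_left"): "minimize",  # 与最大化相反 -> 缩小/最小化
}

def window_op(dir1, dir2):
    """两个方向相同且已定义时返回对应的窗口操作, 否则不触发(返回 None)。"""
    return WINDOW_OPS.get((dir1, dir2))
```

这样第一方向与第二方向不同的组合不会触发窗口操作, 避免了正文所述"同一种操作对应不同操作效果"的歧义。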
本发明的上述方法中,可以根据第一操作方向和第二操作方向的方向组 合关系, 确定对该操作对象执行什么样的操作, 实现了对同一操作对象同时 执行两种操作的效果, 如同时进行平移和旋转操作。
另外, 对于本发明的上述方法实施例, 还包括:
选定所述操作对象上的一定位点的第三操作;
相对于所述定位点作远离所述定位点的第一操作 ,进行相对于所述定位 点的放大, 相对于所述定位点靠近所述定位点, 进行相对于所述定位点的缩 小; 具体实现过程如图 10所示, 选择所述操作对象 1 01的一定位点 102 , 同 时作远离所述定位点的同方向移动 103 , 对所述操作对象进行相对于所述定 位点的放大; 同时作与放大操作方向相反的操作 104 , 对所述操作对象进行 相对于所述定位点的缩小。 这里的操作对象可以是立体操作对象, 平面操作 对象或者窗口等其它操作对象。
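以定位点为中心的放大/缩小, 在坐标上相当于对操作对象各点做以定位点为中心的比例变换; 缩放倍数可由手指相对定位点的远近变化决定。 如下 Python 草图仅为几何示意, 函数名与灵敏度参数均为假设:

```python
import math

def zoom_about(point, pivot, scale):
    """以定位点 pivot 为中心对 point 做缩放: scale > 1 放大, scale < 1 缩小。"""
    px, py = pivot
    x, y = point
    return (px + (x - px) * scale, py + (y - py) * scale)

def zoom_factor(start, end, pivot, gain=0.01):
    """手指从 start 移到 end: 远离定位点 -> 倍数大于 1(放大), 靠近 -> 小于 1(缩小)。
    gain 为假设的灵敏度系数。"""
    d0 = math.dist(start, pivot)
    d1 = math.dist(end, pivot)
    return 1.0 + gain * (d1 - d0)
```

例如定位点为 (2, 2) 时, 点 (4, 6) 以 2 倍放大后落在 (6.0, 10.0); 反方向的操作(倍数取倒数或小于 1)即得到相对于定位点的缩小。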
本发明的上述实施例中, 立体操作对象可以是立体物体、街景地图或立 体菜单等; 平面操作对象可以是屏幕中的平面图像等; 窗口操作对象可以是 各种应用程序所对应的窗口; 进一步的, 本发明的上述方法还可以根据操作 对象的操作类型的有效性根据当前场景自动判断, 如: 当前窗口显示内容为 3D立体物体, 触摸运动的含义按上述立体操作对象的操作类型 1进行定义; 当前窗口显示内容为平面图像(如图片 ), 触摸运动的含义按操作类型 2进 行定义; 当前窗口显示为其它内容或程序管理器, 触摸运动的含义按操作类 型 3进行定义。
如图 4所示, 本发明的实施例还提供一种终端设备 40 , 包括壳体 41 , 设置在所述壳体 41上的第一操作单元 42 ,还包括:设置在所述壳体 41上的 第二操作单元 44 ;
所述第一操作单元 42 , 用于获取一操作对象 43的第一操作方向; 所述第二操作单元 44 , 用于获取所述操作对象 43的第二操作方向; 处理单元(图中未示出), 用于确定所述第一操作方向与所述第二操作方向的方向组合关系对应的操作; 该处理单元具体实现时, 可以由该终端设备的处理器来实现;
操作执行单元(图中未示出), 用于对所述操作对象执行所述操作, 并 将所述操作的执行结果输出显示; 该操作执行单元具体实现时, 也可以由该 终端设备的处理器来实现。
优选的,上述第一操作单元和第二操作单元是物理上分开的两个操作单 元, 该第一操作单元可以设置在所述壳体的第一位置; 所述第二操作单元设 置在所述壳体的与所述第一位置相对的第二位置; 这样方便用户操作, 并符 合用户的正常操作习惯。
如第一操作单元为终端设备正面(面向用户的一面)的触摸屏, 第二操 作单元为设置在终端设备背面的触摸板; 当然, 该触摸板也可以设置在终端 设备的侧面, 只要是与第一操作单元物理上分开, 位置上不同即可。
优选的, 该终端设备还可包括有:
显示模块(如可以是终端设备的显示屏), 用于显示对该操作对象执行 的操作结果。该显示屏如果是触摸屏的话,该显示屏在作为显示单元的同时, 也可以作为上述第一操作单元。
其中, 所述处理单元具体实现时, 可以包括:
第一处理子单元, 用于根据所述操作对象的属性特征, 确定所述操作对 象的操作类型;
第二处理子单元, 用于根据所述操作对象的操作类型, 确定所述第一操 作方向与所述第二操作方向的方向组合关系对应的操作。 当所述第一处理子单元确定所述操作对象的操作类型为:对立体操作对 象的操作时, 所述第二处理子单元具体用于:
在所述第一操作方向和所述第二操作方向相同时, 则确定所述操作为: 所述立体操作对象在所述第一操作方向或者第二操作方向上的平移操作; 或 者
在所述第一操作方向和所述第二操作方向相反时,则以所述第一操作方 向和所述第二操作方向形成的轨迹线的垂线为轴进行旋转操作, 比如, 以所 述第一操作方向和所述第二操作方向形成的轨迹线的垂线为轴进行, 第一操 作方向的旋转操作或者第二操作方向的旋转操作; 或者
在所述第一操作方向和所述第二操作方向不相同且不相反时,则确定所 述操作是: 所述立体操作对象以第一操作方向形成的轨迹的垂线为轴进行第 一操作方向的旋转, 同时以第二操作方向形成的轨迹的垂线为轴进行第二操 作方向的旋转。
举例说明 1 :
如图 5所示, 其中箭头 51代表第一操作方向, 箭头 52代表第二操作方 向, 该第一操作方向和第二操作方向方向相反, 则该立体操作对象以 Y轴为 中心, 面向用户的一面向 X轴正向旋转; 该立体操作对象旋转的效果图如图 6所示。 附图仅仅表示第一操作方向和第二操作方向的示意, 依据第一操作 单元和第二操作单元设置的位置, 第一操作方向和第二操作方向可以重合, 也可以不重合。
举例说明 2 :
如图 7所示, 其中箭头 71代表第一操作方向, 箭头 72代表第二操作方 向, 该第一操作方向和第二操作方向方向相反, 则该立体操作对象以第一操 作方向和第二操作方向形成的轨迹的垂线为轴进行第一操作方向或者第二 操作方向的旋转, 如在操作对象所在的体系中, 以 X轴为中心, 面向用户的 一面向 Y轴正向旋转; 该立体操作对象旋转的效果图如图 8所示。
当然, 该图 5和图 6 , 或者图 7和图 8代表的只是其中一种操作, 其它 的操作可以按照上述表 1、 表 2和表 3的定义进行, 在此不再赘述。
当所述第一处理子单元确定所述操作对象的操作类型为:对平面操作对 象的操作时, 所述第二处理子单元具体用于:
在所述第一操作方向和所述第二操作方向相同时, 则确定所述操作为: 所述平面操作对象在所述第一操作方向或者所述第二操作方向上的平移操 作; 或者
在所述第一操作方向和所述第二操作方向相反时, 则确定所述操作为: 所述平面操作对象的整体放大操作; 或者
确定与所述平面操作对象的整体放大操作方向相反的操作为:所述平面 操作对象的整体缩小操作。
当然, 所述第一处理子单元确定所述操作对象的操作类型为: 对平面操 作对象的操作时, 所述第二处理子单元还用于:
根据所述第二操作单元选定的所述平面操作对象局部放大的部分,和所 述第二操作单元对选定的部分进行拖动的操作,对所述选定的部分进行预定 倍数的放大。
如果终端设备的第一操作方向是在终端设备的触摸屏上进行,第二操作 方向是在终端设备与所述触摸屏位置相对的第二位置上的触摸板上进行, 则 只需要在触摸板上单指触摸, 定位该平面操作对象要局部放大的部分, 再在 触摸板上移动, 触发相对应要局部放大的部分进行固定倍数的放大, 如网页 上的链接文字较小, 通过触摸板定位该文字区域, 并在触摸板上移动, 实现 该链接文字区域的放大, 放大的倍数可预先设置。
相应的, 局部放大的文字区域或者图像, 在放大后, 要恢复到原来的大 小, 则可以通过在触摸板上双击代表取消放大, 恢复正常网页或者图像; 或 者预先设定该局部放大的文字区域或者图像在屏幕上停留几秒钟后自动恢 复到原来的大小,或者离开触摸板时,放大的文字区域或者图像就恢复正常; 这样可以有效解决手持设备上文字较小, 手指难以准确点击想打开的链接的 问题。
除了上述对网页上的文字链接区域的局部放大或者图像的局部放大,还 可以具有如下应用:
如图 9所示,操作对象为具有优先级的操作对象, 通过第一操作单元和 第二操作单元, 作相反方向的移动, 即通过第一操作单元的第一操作方向与 通过第二操作单元的第二操作方向相反时, 对所述操作对象进行逐级打开; 具体实现过程如下: 该操作对象具有至少一个显示对象, 显示对象包括 高优先级显示信息和低优先级显示信息;
确定显示对象的优先级; 接收对所述显示对象的第一操作方向和第二操作方向的指令; 在所述第一操作方向和第二操作方向相反时,显示所述显示对象当前优 先级的显示信息以及低于当前优先级的显示信息。
其中, 显示信息的优先级可以为三级或者三级以上。
另外, 上述实现流程还可以包括:
当增加显示对象后, 若由多个显示对象组成的操作对象进行信息显示 时, 可以缩小每个显示对象的显示信息的显示字体, 使当前优先级的显示对 象和低于当前优先级的显示对象的显示信息均被显示; 也可以保持显示信息 的显示字体, 将所述操作对象分页显示。
以通讯录操作对象为例, 通讯录中的每一条记录为一显示对象,每一条 记录包括多项优先级不同的显示内容: 如姓名标识信息为高优先级; 而该姓 名标识信息所包括电话信息等为中优先级显示信息; 而该姓名标识信息包括 的其它标识信息为低优先级显示信息, 如传真信息, 通讯地址等;
通常情况下, 显示模块只显示高优先级显示内容, 即只显示姓名标识信 息; 在该姓名显示对象所显示的界面中, 接收方向相反的第一操作和第二操 作具体可以是: 第一操作单元获取的操作为相对显示屏向下运动, 第二操作 单元获取的操作为相对显示屏向上运动; 或者第一操作单元获取的操作为相 对显示屏向上运动, 第二操作单元获取的操作为相对显示屏在通讯录上向下 运动;
显示模块显示高优先级显示内容和中优先级显示内容,即显示标识信息
(姓名)和电话信息 (主要内容);
进一步接收方向相反的第一操作和第二操作;
显示模块显示高优先级显示内容、中优先级显示内容和低优先级显示内 容, 即显示全部内容。
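上述按优先级逐级展开显示信息的过程, 可以用如下 Python 草图示意(字段名与优先级划分均为说明用的假设, 数字越小优先级越高):

```python
PRIORITY = {"姓名": 1, "电话": 2, "传真": 3, "通讯地址": 3}  # 假设的优先级划分

def visible_fields(record, level):
    """返回在展开级别 level 下应显示的信息: 当前优先级及更高优先级的字段。"""
    return {k: v for k, v in record.items() if PRIORITY.get(k, 3) <= level}

record = {"姓名": "张三", "电话": "123456", "传真": "654321", "通讯地址": "北京"}
```

每接收到一次方向相反的第一操作和第二操作, 就把 level 加 1, 从只显示姓名逐级展开到显示全部内容。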
相应的, 本发明的实施例还提供一种终端设备, 该终端设备包括: 存储单元, 用于存储包括有至少一个显示对象的操作对象;
处理单元, 用于确定显示对象的优先级; 接收对所述显示对象的第一操 作方向和第二操作方向的指令; 在所述第一操作方向和第二操作方向相反 时,产生显示所述显示对象的当前优先级的显示信息以及低于当前优先级的 显示信息的指令;
显示单元, 用于根据所述指令显示所述显示对象的显示信息。 具体来讲, 该处理单元接收第一操作方向和第二操作方向的指令; 在所 述第一操作方向和第二操作方向相反时,显示所述显示对象当前优先级的显 示信息以及低于当前优先级的显示信息。
优选的, 当增加显示对象后, 若由多个显示对象组成的操作对象进行信 息显示时, 可以缩小每个显示对象的显示信息的显示字体, 使当前优先级的 显示对象和低于当前优先级的显示对象的显示信息均被显示; 也可以保持显 示信息的显示字体, 将所述操作对象分页显示。
当所述第一处理子单元确定所述操作对象的操作类型为:对窗口操作对 象的操作时, 所述第二处理子单元具体用于:
在所述第一操作方向和所述第二操作方向相同的第一方向时,则确定所 述操作为: 所述窗口操作对象的打开操作; 或者
确定与所述窗口操作对象的打开操作相反的方向的操作为:所述窗口操 作对象的关闭操作; 或者
在所述第一操作方向和所述第二操作方向相同的第二方向时,则确定所 述操作为: 所述窗口操作对象的最大化操作; 或者
确定与所述窗口操作对象的最大化操作相反方向的操作为:所述窗口操 作对象的缩小操作。
另外, 本发明的上述终端设备, 还可以具体有以下特征:
通过第一操作单元操作的第一操作方向与通过第二操作单元操作的第 二操作方向相同时, 则所述操作为第一操作, 即对所述操作对象执行第一操 作;
通过第一操作单元操作的第一操作方向与通过第二操作单元操作的第二操作方向相反时, 则所述操作为第二操作, 即对所述操作对象执行第二操作。
优选的, 在执行上述第一操作时, 还可以包括:
如图 10所示, 通过所述第一操作单元选择所述操作对象 101的一定位 点 102 , 通过所述第一操作单元和所述第二操作单元同时作远离所述定位点 的同方向移动 103 , 对所述操作对象进行相对于所述定位点的放大; 通过所 述第一操作单元和所述第二操作单元同时作与放大操作方向相反的操作 104 , 对所述操作对象进行相对于所述定位点的缩小。 这里的操作对象可以 是立体操作对象, 平面操作对象或者窗口等其它操作对象。
当然, 本发明的该终端设备实施例还可以具有其它方面的应用: 对平面图像的整体放大时,还可以先选定平面图像中一个定位点, 然后 再进行相对于该定位点的整体放大,如通过第一操作单元选定平面图像中的 一个定位点,然后,再通过第一操作单元和第二操作单元作相反方向的运动, 可以实现该平面图像相对于该定位点的整体放大, 与该放大的方向相反的操 作为整体缩小。
同样的操作还可以应用于网页这样的平面操作对象,操作方法与上述相 同, 不再赘述。
需要说明的是:上述方法实施例的所有特征均适用于该终端设备的实施 例, 能达到与上述方法相同的技术效果。
本发明的终端设备具体可以是手持设备, 具体可以是支持单手操作的 multi-touch 设备。 用户在握持手持设备时, 通常拇指在上, 四指在下。 用户使用此方案提出的手持设备, 可用拇指在正面进行触摸操作, 食指在背面进行触摸操作。
本发明的上述方案, 通过在终端设备上增设第二操作单元,提供了一种 新的操作方式, 用户可用其同时对操作对象进行平移和旋转, 并能实现操作 对象同时平移和旋转的操作效果, 更方便的对屏幕内容进行交互控制, 为用 户提供更丰富的交互方式; 可广泛应用于手持设备上街景地图、 3D立体画面 等的显示和交互控制。
但是, 上面已经提到, 在全触摸控制的触摸屏中, 对于现有的终端设备 而言, 由于触摸屏的面积非常有限, 导致用户进行触摸控制时会遮挡部分的 触摸屏, 影响了用户的观看。
因此, 对于上述终端设备中的第一操作单元或第二操作单元, 例如, 设 置在终端设备后侧或侧面的触摸板, 可以釆用如下所述的用已有的图像釆集 单元来拍摄指点物在透明窗表面的图像进而进行分析以确定指点物位置的 配置。
本发明实施例中, 在终端设备的图像釆集单元的图像釆集通道上, 设置 一透明窗, 所述透明窗远离所述图像釆集单元的第一表面与所述图像釆集单 元间隔一定距离, 形成有一空间; 而该图像釆集单元在指点物接触于所述透 明窗的第一表面时釆集图像, 交给处理单元, 根据所述图像计算所述指点物 的轨迹, 并根据所述轨迹生成对应的输入指令。 由于指点物并不是在显示屏 的表面进行滑动, 而是在不同于显示屏的透明窗上进行操作, 不会影响用户 观看显示的内容。
本发明实施例的终端设备, 包括主板、 设置于所述壳体内, 且与所述主 板连接的处理单元和图像釆集单元, 其中:
在所述图像釆集单元的图像釆集通道上, 设置有一透明窗, 所述透明窗 远离所述图像釆集单元的第一表面与所述图像釆集单元间隔一定距离, 形成 有一空间;
所述终端设备还包括:
所述图像釆集单元用于在指点物接触于所述透明窗的第一表面时釆集 图像;
所述处理单元用于根据所述图像计算所述指点物的轨迹, 并根据所述轨 迹生成对应的输入指令。
在本发明的具体实施例中, 所述透明窗远离所述图像釆集单元的第一表 面与所述图像釆集单元间隔一定距离, 形成有一空间, 该空间可以是包括该 透明窗对应于图像釆集通道的一部分, 也可以是包括该透明窗对应于图像釆 集通道的一部分和透明窗的下表面与图像釆集单元之间的部分。
对于如何利用图像来计算指点物的轨迹, 并根据所述轨迹生成对应的输 入指令属于现有技术的范畴, 在本发明的具体实施例中不再详细描述。
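虽然由图像计算指点物轨迹并据此生成输入指令属于现有技术范畴, 这里仍给出一个极简的 Python 草图, 示意"位置序列 → 整体运动方向 → 粗粒度输入指令"的处理思路(函数名、 阈值与指令名均为说明用的假设):

```python
import math

def trajectory_to_command(points, min_dist=10.0):
    """由指点物的位置序列估计整体运动, 生成粗粒度输入指令(示意)。
    points: [(x, y), ...], 由逐帧图像分析得到; min_dist: 判定为移动的最小位移。"""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) < min_dist:
        return "tap"  # 位移很小 -> 视为点击
    if abs(dx) >= abs(dy):
        return "move_right" if dx > 0 else "move_left"
    return "move_down" if dy > 0 else "move_up"
```

实际实现中还需要先从每帧图像中定位指点物(如取亮斑质心), 此处从略。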
由于图像釆集单元需要在指点物接触于所述透明窗的第一表面时釆集 图像, 所以需要该透明窗的第一表面与所述图像釆集单元之间的空间有光线 照明, 一般而言, 可以通过如下方式来实现。
将透明窗设置得大一些, 如图 11所示, 由于透明窗的表面积较大, 此 时指点物 (图 11中为用户手指)仅能接触透明窗的一部分, 而光线还是能 够透过透明窗其他未被指点物接触的部分进入该透明窗的第一表面与所述 图像釆集单元之间的空间, 被图像釆集单元捕捉, 因此分析图像釆集单元拍 摄的图像就能够得到指点物的位置。
然而, 上述的实现方式在外界环境非常暗, 或者完全黑暗、 或者指点物 接触面积大于透明窗的表面积时, 会导致成像效果非常差, 为了避免上述情 况的发生, 在本发明的另一实现方式中, 该终端设备中还设置一用于将光线 照射到所述空间的至少一个发光设备, 与所述主板连接, 指点物接触于所述 透明窗的第一表面时, 会将该发光设备发出的光线反射, 而图像釆集单元捕 捉该发射的光线成像, 因此分析图像釆集单元拍摄的图像就能够得到指点物 的位置。
在本发明的具体实施例中, 该图像釆集单元具有两种工作模式: 拍摄模 式和定位模式, 处理单元在所述图像釆集单元工作于定位模式时, 需要启动 所述发光设备。
在本发明的具体实施例中, 该透明窗可以是单独设置的透明窗, 也可以 是图像釆集单元(如摄像头)本身就具有的透明保护层。
下面对上述的设置发光设备的方式进行进一步详细说明。
在本发明的具体实施例中, 该所述发光设备固定设置或可调, 分别说明 ^口下。
如图 12所示, 所述发光设备 21固定设置, 且发光方向朝向所述透明窗
23。
发光设备 21发出的光线照射到触摸在透明窗 23外表面的手指上, 并反 射到图像釆集单元 22中, 由图像釆集单元 22形成记录手指头位置的图像。
如图 13所示, 所述发光设备 21固定设置, 所述终端设备还包括设置于 所述发光设备 21的发光方向上, 用于将所述发光设备 21发出的光线导向所 述空间的光学器件 24。
发光设备 21发出的光线在光学器件 24内多次反射后射出光学器件 24, 并照射到触摸在透明窗 23外表面的手指上, 并反射到图像釆集单元 22中, 由图像釆集单元 22形成记录手指头位置的图像。
图 13中, 光学器件 24设置于光学器件 24的下方, 但二者之间的相对 位置也可是其他的形式, 如图 14所示, 所述发光设备 21固定设置, 所述终 端设备还包括设置于所述发光设备 21的发光方向上(光学器件 24设置于光 学器件 24的左边)、 用于将所述发光设备 21发出的光线导向所述空间的光 学器件 24。
发光设备 21发出的光线在光学器件 24内多次反射后射出光学器件 24, 并照射到触摸在透明窗 23外表面的手指上, 并反射到图像釆集单元 22中, 由图像釆集单元 22形成记录手指头位置的图像。
图 12到图 14的各种实现方式中, 所述发光设备都位于透明窗外部, 与 所述图像釆集单元位于透明窗的同侧, 但发光设备也可以设置于透明窗内 部, 如图 15所示, 所述发光设备 21固定设置于所述透明窗 23内, 位于透 明窗 23的一端, 且发光方向朝向所述透明窗 23的另一端。 发光设备 21发出的光束从透明窗 23截面照向其内部, 即透明窗 23的 内表面后, 将产生反射。 如果透明保护层表层是空气, 当入射光的角度满足 一定条件时, 光就会在透明保护层表面完全反射。 但是如果有个折射率比较 高的物质 (例如手指)压住透明窗 23的外表面, 透明窗 23表面全反射的条件 就会被打破, 部分光束透过表面, 投射到手指表面。 凹凸不平的手指表面导 致光束产生散射 (漫反射), 散射光透过透明窗 23后到达图像釆集单元 22 , 由图像釆集单元 22形成记录手指头位置的图像。
在上述的各种实现方式中, 为了保证指点物在透明窗上表面的任意位置 都能得到效果相同的成像, 所述发光设备都可以选择环状发光体, 围绕所述 图像釆集单元设置,使得指点物在透明窗上表面的任意位置都能得到效果相 同的成像, 以满足后续的图像分析的需要。
在上述图 12到图 15的各种实现方式中, 该发光设备 21都是固定设置 的, 但是考虑到该发光设备既可以用于成像模块在通常的照相模式下应用, 也可以在成像模块在定位模式下应用, 在这种第一种情况下, 应该是将发光 设备发出的光线尽可能通过该透明窗, 传送到外表面, 而在第二种情况下, 应该是将发光设备发出的光线尽可能投射到该透明窗内, 因此, 在这两种情 况下, 为了满足不同的需求, 需要调整该发光设备的发光方向, 以使得发光 设备发出的光线照射到预定空间。
一般而言, 现有的终端设备(如手机、 PDA等)都已经将摄像头作为标准配置, 而且同时配置了闪光灯, 在本发明的具体实施例中, 能够直接利用这些终端设备已经具有的摄像头和闪光灯进行复用, 来作为图像釆集单元和发光设备, 因此, 最大化现有设备上的功能模块的利用率, 同时也不用增加硬件成本, 提高了本发明实施例的应用范围。 如图 16所示, 为本发明实施例提供的终端设备应用于通常照相模式下的示意图, 其中包括一设置有角度调节模块 251的发光设备 25 , 该角度调节模块用于调节所述发光设备 25 , 使所述发光设备 25向所述空间发送光线。
在通常照相模式下, 控制该角度调节模块 251 , 使其调节发光设备 25 , 使发光设备 25以第一角度发射光线, 此时尽可能多的光线能够通过该透明窗, 传送到外部。 如图 17所示, 为本发明实施例提供的终端设备应用于定位模式下的示意图, 其中包括一设置有角度调节模块 251的发光设备 25 , 在定位模式下, 控制该角度调节模块 251 , 使其调节发光设备 25 , 使发光设备 25以第二角度发射光线, 此时尽可能多的光线能够照射到预定空间。
当然, 考虑到该发光设备在通常摄像模式时, 需要的发光强度较大, 以满足需求, 但在定位模式下, 其仅需要照亮一个相对较小的区域, 此时, 发光强度可调, 以便于根据不同的情况进行发光强度的调节, 满足需要。
当然, 上述的终端设备, 所述发光设备可调, 应当理解的是, 是发光设备本身可调, 或者其发出的光线的光线路径可调, 在光线路径可调时, 所述发光设备具体包括:
一发光单元;
一光学器件, 用于在所述图像釆集单元工作于定位模式时, 调节所述发光体的光线路径, 使所述发光体向所述空间发送光线, 在所述图像釆集单元工作于拍摄模式时, 调节所述发光体的光线路径, 使所述发光体向透明窗之外的空间发送光线。
调节其发光路径, 可以通过调节光学器件来实现, 如在所述图像釆集单元工作于定位模式时, 所述光学器件位于所述发光体的光线路径上, 所述发光体发出的光线经过所述发光体发射到所述空间, 在所述图像釆集单元工作于拍摄模式时, 所述光学器件位于所述发光体的光线路径外, 所述发光体发出的光线透过所述透明窗发射到外部。
当然, 应该理解的是, 虽然可调方式设置的发光设备的设置方式仅以图 16和图 17进行了说明, 但应当理解的是, 发光设备以可调方式设置时, 其发出的光线也可以通过光学器件进行传递, 以便于将光线更好的投射到预定的位置, 其也可以是设置于其他的位置, 在此不一一详细说明。
在本发明的具体实施例中, 并不限定该图像釆集单元的具体位置, 其可以位于终端设备的各个位置, 如上表面、 背面、 侧面, 甚至转角处都可能, 只要用户手指能够达到的位置均可。
在用户启动拍摄流程后, 图像釆集单元进入摄像模式, 进行拍摄, 这属于图像釆集单元的本来的功能。 在用户启动定位流程后, 在存在发光设备时, 启动发光设备, 发光设备发出的光线照射到触摸在透明窗外表面的手指上, 并反射到图像釆集单元中, 由图像釆集单元形成记录手指头位置的图像, 交由处理单元进行定位处理, 也就是根据所述图像计算所述指点物的轨迹, 并根据所述轨迹生成对应的输入指令。
当然, 在发光设备可调时, 还应该调整其角度, 使其发出的尽可能多的 光线能够照射到预定空间。
当然, 本领域技术人员可以理解, 上述用已有的图像釆集单元来拍摄指 点物在透明窗表面的图像进而进行分析以确定指点物位置的配置可以用于 前述终端设备的第一操作单元或第二操作单元或第一操作单元和第二操作 单元两者。 此外, 除了前述终端设备的第一操作单元和第二操作单元以外, 上述配置可以单独应用于釆用触摸屏的任意终端设备中, 或者与实现两种操 作同时进行的操作对象的操作控制的方法及终端设备结合地使用, 本发明的 实施例并不意在对此进行限制。
以上所述是本发明的优选实施方式, 应当指出, 对于本技术领域的普通 技术人员来说, 在不脱离本发明所述原理的前提下, 还可以作出若干改进和 润饰, 这些改进和润饰也应视为本发明的保护范围。

Claims

权利要求书
1. 一种操作对象的操作控制的方法, 其特征在于, 包括:
获取一操作对象的第一操作方向和第二操作方向;
确定所述第一操作方向与所述第二操作方向的方向组合关系对应的操 作;
对所述操作对象执行所述操作。
2. 根据权利要求 1所述的方法, 其特征在于, 确定所述第一操作方向 与所述第二操作方向的方向组合关系对应的操作的步骤具体为:
根据所述操作对象的操作类型,确定所述第一操作方向与所述第二操作 方向的方向组合关系对应的操作。
3. 根据权利要求 2所述的方法, 其特征在于, 所述操作对象的操作类 型为: 对立体操作对象的操作时, 根据所述操作对象的操作类型, 确定所述 第一操作方向与所述第二操作方向的方向组合关系对应的操作的步骤具体 为:
在所述第一操作方向和所述第二操作方向相同时, 则确定所述操作为: 所述立体操作对象在所述第一操作方向或者第二操作方向上的平移操作; 或 者
在所述第一操作方向和所述第二操作方向相反时, 则确定所述操作为: 所述立体操作对象以所述第一操作方向和所述第二操作方向形成的轨迹线 的垂线为轴进行旋转操作。
4. 根据权利要求 2所述的方法, 其特征在于, 所述操作对象的操作类 型为: 对平面操作对象的操作时, 根据所述操作对象的操作类型, 确定所述 第一操作方向与所述第二操作方向的方向组合关系对应的操作的步骤具体 为:
在所述第一操作方向和所述第二操作方向相同时, 则确定所述操作为: 所述平面操作对象在所述第一操作方向或者所述第二操作方向上的平移操 作; 或者
在所述第一操作方向和所述第二操作方向相反时, 则确定所述操作为: 所述平面操作对象的整体放大操作。
5. 根据权利要求 2所述的方法, 其特征在于, 所述操作对象的操作类 型为: 对窗口操作对象的操作时, 根据所述操作对象的操作类型, 确定所述 第一操作方向与所述第二操作方向的方向组合关系对应的操作的步骤具体 为:
在所述第一操作方向和所述第二操作方向相同的第一方向时,则确定所 述操作为: 所述窗口操作对象的打开操作; 或者
确定与所述窗口操作对象的打开操作相反的方向的操作为:所述窗口操 作对象的关闭操作; 或者
在所述第一操作方向和所述第二操作方向相同的第二方向时,则确定所 述操作为: 所述窗口操作对象的最大化操作; 或者
确定与所述窗口操作对象的最大化操作相反方向的操作为:所述窗口操 作对象的缩小操作; 其中, 所述第一方向和所述第二方向不同。
6. 根据权利要求 1所述的方法, 其特征在于, 在所述第一操作方向和 所述第二操作方向相同时, 则确定所述操作为第一操作;
在所述第一操作方向和所述第二操作方向相反时,则确定所述操作为第 二操作。
7. 根据权利要求 1所述的方法, 其特征在于, 还包括:
选定所述操作对象上的一定位点的第三操作;
相对于所述定位点作远离所述定位点的所述第一操作,进行相对于所述 定位点的放大。
8. 一种终端设备, 包括壳体, 设置在所述壳体上的第一操作单元, 其 特征在于, 还包括: 设置在所述壳体上的第二操作单元;
所述第一操作单元, 用于获取一操作对象的第一操作方向;
所述第二操作单元, 用于获取所述操作对象的第二操作方向;
处理单元,用于确定所述第一操作方向与所述第二操作方向的方向组合 关系对应的操作;
操作执行单元, 用于对所述操作对象执行所述操作, 并将所述操作的执 行结果输出显示。
9. 根据权利要求 8所述的终端设备, 其特征在于, 所述处理单元包括: 第一处理子单元, 用于根据所述操作对象的属性特征, 确定所述操作对 象的操作类型;
第二处理子单元, 用于根据所述操作对象的操作类型, 确定所述第一操 作方向与所述第二操作方向的方向组合关系对应的操作。
10. 根据权利要求 9所述的终端设备, 其特征在于, 所述第一处理子单 元确定所述操作对象的操作类型为: 对立体操作对象的操作时, 所述第二处 理子单元具体用于:
在所述第一操作方向和所述第二操作方向相同时, 则确定所述操作为: 所述立体操作对象在所述第一操作方向或者第二操作方向上的平移操作; 或 者
在所述第一操作方向和所述第二操作方向相反时,则以所述第一操作方 向和所述第二操作方向形成的轨迹线的垂线为轴进行旋转操作。
11. 根据权利要求 9所述的终端设备, 其特征在于, 所述第一处理子单 元确定所述操作对象的操作类型为: 对平面操作对象的操作时, 所述第二处 理子单元具体用于:
在所述第一操作方向和所述第二操作方向相同时, 则确定所述操作为: 所述平面操作对象在所述第一操作方向或者所述第二操作方向上的平移操 作; 或者
在所述第一操作方向和所述第二操作方向相反时, 则确定所述操作为: 所述平面操作对象的整体放大操作。
12. 根据权利要求 9所述的终端设备, 其特征在于, 所述第一处理子单 元确定所述操作对象的操作类型为: 对窗口操作对象的操作时, 所述第二处 理子单元具体用于:
在所述第一操作方向和所述第二操作方向相同的第一方向时,则确定所 述操作为: 所述窗口操作对象的打开操作; 或者
确定与所述窗口操作对象的打开操作相反的方向的操作为:所述窗口操 作对象的关闭操作; 或者
在所述第一操作方向和所述第二操作方向相同的第二方向时,则确定所 述操作为: 所述窗口操作对象的最大化操作; 或者
确定与所述窗口操作对象的最大化操作相反方向的操作为:所述窗口操 作对象的缩小操作。
1 3. 根据权利要求 8 - 12任一项所述的终端设备, 其特征在于, 所述第 一操作单元设置在所述壳体的第一位置; 所述第二操作单元设置在所述壳体 的与所述第一位置相对的第二位置。
14. 根据权利要求 8所述的终端设备, 其特征在于, 所述第一操作方向 与所述第二操作方向相同时, 则所述操作为第一操作;
所述第一操作方向与所述第二操作方向相反时, 则所述操作为第二操作。
15. 根据权利要求 8所述的终端设备, 其特征在于, 所述第一操作单元 包括:
一图像釆集单元, 以及
在所述图像釆集单元的图像釆集通道上设置的一透明窗, 所述透明窗远 离所述图像釆集单元的第一表面与所述图像釆集单元间隔一定距离, 形成有 一空间;
其中,
所述图像釆集单元用于在指点物接触于所述透明窗的所述第一表面时 釆集图像;
所述处理单元用于根据所述图像计算所述指点物的轨迹,并根据所述轨 迹获取所述操作对象的第一操作方向。
16. 根据权利要求 15所述的终端设备,其特征在于,所述第一操作单元 还包括:
一用于将光线照射到所述空间的至少一个发光设备;
所述发光设备与所述图像釆集单元位于所述第一表面的同侧。
17. 根据权利要求 16所述的终端设备,其特征在于,所述图像釆集单元 具有拍摄模式或定位模式, 所述图像釆集单元工作于所述定位模式时, 所述 发光设备处于启动状态。
18. 根据权利要求 16所述的终端设备,其特征在于,所述发光设备固定 设置, 发光方向朝向所述透明窗。
19. 根据权利要求 16所述的便携式电子设备,其特征在于,所述发光设 备固定设置, 所述第一操作单元还包括设置于所述发光设备的发光方向上, 用于将所述发光设备发出的光线导向所述空间的光学器件。
20. 根据权利要求 16所述的终端设备,其特征在于,所述发光设备固定 设置, 所述发光设备为一环状发光体, 围绕所述图像釆集单元设置。
21. 根据权利要求 16所述的终端设备,其特征在于,所述发光设备固定 设置于所述透明窗内, 位于透明窗的一端, 且发光方向朝向所述透明窗的另 一端。
22. 根据权利要求 16所述的终端设备,其特征在于,所述发光设备可调, 所述发光设备具体包括:
一发光单元;
角度调节模块, 与所述发光单元连接, 用于在所述图像釆集单元工作于 定位模式时, 调节所述发光体, 使所述发光体向所述空间发送光线。
23. 根据权利要求 16所述的终端设备,其特征在于,所述发光设备可调, 所述发光设备具体包括:
一发光单元;
一光学器件, 用于在所述图像釆集单元工作于定位模式时, 调节所述发 光体的光线路径, 使所述发光体向所述空间发送光线, 在所述图像釆集单元 工作于拍摄模式时, 调节所述发光体的光线路径, 使所述发光体向透明窗之 外的空间发送光线。
24. 根据权利要求 23所述的终端设备,其特征在于,在所述图像釆集单 元工作于定位模式时, 所述光学器件位于所述发光体的光线路径上, 所述发 光体发出的光线经过所述发光体发射到所述空间,在所述图像釆集单元工作 于拍摄模式时, 所述光学器件位于所述发光体的光线路径外, 所述发光体发 出的光线透过所述透明窗发射到外部。
25. 根据权利要求 22、 23或 24所述的便携式电子设备, 其特征在于,
26. 根据权利要求 8所述的终端设备, 其特征在于, 所述第二操作单元 包括:
一图像釆集单元, 以及
在所述图像釆集单元的图像釆集通道上设置的一透明窗, 所述透明窗远 离所述图像釆集单元的第一表面与所述图像釆集单元间隔一定距离, 形成有 一空间;
其中,
所述图像釆集单元用于在指点物接触于所述透明窗的所述第一表面时 釆集图像;
所述处理单元用于根据所述图像计算所述指点物的轨迹,并根据所述轨 迹获取所述操作对象的第二操作方向。
27. 一种操作对象的操作控制的方法, 所述操作对象包括至少一个显示 对象, 其特征在于, 包括:
确定显示对象的优先级;
接收对所述显示对象的第一操作方向和第二操作方向的指令;
在所述第一操作方向和第二操作方向相反时,显示所述显示对象当前优 先级的显示信息以及低于当前优先级的显示信息。
28. 一种终端设备, 该终端设备包括:
存储单元, 用于存储包括有至少一个显示对象的操作对象;
处理单元, 用于确定显示对象的优先级; 接收对所述显示对象的第一操 作方向和第二操作方向的指令; 在所述第一操作方向和第二操作方向相反 时,产生显示所述显示对象的当前优先级的显示信息以及低于当前优先级的 显示信息的指令;
显示单元, 用于根据所述指令显示所述显示对象的显示信息。
29. 一种终端设备, 包括:
壳体,
设置于壳体内的主板,
设置在所述壳体上与所述主板连接的操作单元, 和
设置在所述壳体内与所述主板连接的处理单元,
其特征在于, 所述操作单元包括:
一图像釆集单元; 和
一透明窗, 设置在所述图像釆集单元的图像釆集通道上, 所述透明窗远 离所述图像釆集单元的第一表面与所述图像釆集单元间隔一定距离, 形成有 一空间;
其中, 所述图像釆集单元用于在指点物接触于所述透明窗的所述第一表 面时釆集图像; 并且
所述处理单元用于根据所述图像计算所述指点物的轨迹,并根据所述轨 迹生成对应的输入指令。
30. 一种输入方法, 应用于权利要求 29所述的终端设备, 其特征在于, 包括:
在指点物接触于所述透明窗的第一表面时釆集图像;
根据所述图像计算所述指点物的轨迹,并根据所述轨迹生成对应的输入 指令。
PCT/CN2010/079507 2009-12-07 2010-12-07 操作对象的操作控制的方法及终端设备 WO2011069435A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/513,948 US9836139B2 (en) 2009-12-07 2010-12-07 Method and terminal device for operation control of operation object

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN200910241771.9 2009-12-07
CN200910241771.9A CN102087571B (zh) 2009-12-07 2009-12-07 操作对象的操作控制的方法及终端设备
CN200910242423.3 2009-12-11
CN200910242423.3A CN102096517B (zh) 2009-12-11 2009-12-11 一种便携式电子设备及输入方法

Publications (1)

Publication Number Publication Date
WO2011069435A1 true WO2011069435A1 (zh) 2011-06-16

Family

ID=44145106

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2010/079507 WO2011069435A1 (zh) 2009-12-07 2010-12-07 操作对象的操作控制的方法及终端设备

Country Status (2)

Country Link
US (1) US9836139B2 (zh)
WO (1) WO2011069435A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103500062A (zh) * 2013-09-17 2014-01-08 华为技术有限公司 一种终端、终端控制方法及装置
CN104714635B (zh) * 2013-12-16 2018-07-06 联想(北京)有限公司 信息处理的方法及电子设备
CN105335116B (zh) * 2014-07-30 2018-11-09 联想(北京)有限公司 一种显示控制方法及电子设备

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1926500A (zh) * 2004-03-05 2007-03-07 诺基亚公司 控制器和控制器配置
CN101441541A (zh) * 2007-11-19 2009-05-27 乐金显示有限公司 多点触摸平板显示模块
CN101533649A (zh) * 2008-03-10 2009-09-16 创新科技有限公司 使用动作变化模拟按键操作的方法和便携媒体播放器

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11341114A (ja) 1998-05-27 1999-12-10 Mitsubishi Electric Corp 携帯型電話機
US7508377B2 (en) 2004-03-05 2009-03-24 Nokia Corporation Control and a control arrangement
KR100667817B1 (ko) * 2005-09-26 2007-01-11 삼성전자주식회사 직하발광형 백라이트 유닛 및 이를 적용한 칼라 필터를사용하지 않는 액정표시장치
JP4567028B2 (ja) * 2006-09-26 2010-10-20 エルジー ディスプレイ カンパニー リミテッド マルチタッチ感知機能を有する液晶表示装置とその駆動方法
WO2009076503A1 (en) * 2007-12-11 2009-06-18 Firstpaper Llc Touch-sensitive illuminated display apparatus and method of operation thereof
US20100177053A2 (en) * 2008-05-09 2010-07-15 Taizo Yasutake Method and apparatus for control of multiple degrees of freedom of a display
US8477139B2 (en) * 2008-06-09 2013-07-02 Apple Inc. Touch screen device, method, and graphical user interface for manipulating three-dimensional virtual objects
US8130207B2 (en) * 2008-06-18 2012-03-06 Nokia Corporation Apparatus, method and computer program product for manipulating a device using dual side input devices
EP2294375B8 (en) * 2008-06-19 2024-02-14 Massachusetts Institute of Technology Tactile sensor using elastomeric imaging
US8345014B2 (en) * 2008-07-12 2013-01-01 Lester F. Ludwig Control of the operating system on a computing device via finger angle using a high dimensional touchpad (HDTP) touch user interface
JP2012504817A (ja) * 2008-10-02 2012-02-23 ネクスト ホールディングス リミティド タッチ検出システムにおいてマルチタッチを解像するステレオ光センサ
US8669945B2 (en) * 2009-05-07 2014-03-11 Microsoft Corporation Changing of list views on mobile device
KR101604692B1 (ko) * 2009-06-30 2016-03-18 LG Electronics Inc. Mobile terminal and control method thereof
US8497884B2 (en) * 2009-07-20 2013-07-30 Motorola Mobility Llc Electronic device and method for manipulating graphic user interface elements
JP2011028345A (ja) * 2009-07-22 2011-02-10 Olympus Imaging Corp Condition changing device, camera, portable apparatus, and program
US20110055753A1 (en) * 2009-08-31 2011-03-03 Horodezky Samuel J User interface methods providing searching functionality
WO2011037558A1 (en) * 2009-09-22 2011-03-31 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8839128B2 (en) * 2009-11-25 2014-09-16 Cooliris, Inc. Gallery application for content viewing

Also Published As

Publication number Publication date
US20120242611A1 (en) 2012-09-27
US9836139B2 (en) 2017-12-05

Similar Documents

Publication Publication Date Title
CN107566721B (zh) Information display method, terminal, and computer-readable storage medium
US9264694B2 (en) Hand-held imaging apparatus and storage medium storing program
US9430045B2 (en) Special gestures for camera control and image processing operations
WO2021115479A1 (zh) Display control method and electronic device
CN107317993B (zh) Video call method and mobile terminal
CN109891874A (zh) Panoramic photographing method and apparatus
EP2189835A1 (en) Terminal apparatus, display control method, and display control program
WO2019227281A1 (zh) Photographing method and electronic device
WO2014133278A1 (en) Apparatus and method for positioning image area using image sensor location
JP6561141B2 (ja) Method for adjusting photographing focal length of a mobile terminal using a touchpad, and mobile terminal
CN106687887B (zh) Projected interactive virtual desktop
US8310504B2 (en) System and method for panning and selecting on large displays using mobile devices without client software
CN106657455A (zh) Electronic device with rotatable camera
CN206323415U (zh) Electronic device with rotatable camera
US7359003B1 (en) Display, input and form factor for portable instruments
US9088720B2 (en) Apparatus and method of displaying camera view area in portable terminal
JP2002539742A (ja) Apparatus for interaction
KR20240004839A (ko) Photographing method and apparatus, and electronic device
JP4190935B2 (ja) Mobile terminal device
TW202004432A (zh) Electronic device and operation control method of the electronic device
WO2011069435A1 (zh) Method and terminal device for operation control of an operation object
CN112637495B (zh) Photographing method and apparatus, electronic device, and readable storage medium
CN210323721U (zh) Pan-tilt camera
TW202018486A (zh) Multi-screen operation method and electronic system using the same
TW201714074A (zh) Photographing method, system, and electronic device using gestures

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 10835470; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 13513948; Country of ref document: US)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 10835470; Country of ref document: EP; Kind code of ref document: A1)