WO2012119548A1 - Control method, control device, display device, and electronic device - Google Patents

Control method, control device, display device, and electronic device

Info

Publication number
WO2012119548A1
Authority
WO
WIPO (PCT)
Prior art keywords
distance
display
parameter
detection distance
operating body
Prior art date
Application number
PCT/CN2012/072033
Other languages
English (en)
French (fr)
Inventor
刘俊峰
王茜莺
郜远
Original Assignee
联想(北京)有限公司
北京联想软件有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201110054512.2A external-priority patent/CN102681702B/zh
Priority claimed from CN201110061129.XA external-priority patent/CN102681750B/zh
Application filed by 联想(北京)有限公司, 北京联想软件有限公司
Priority to US14/003,687 (US10345912B2)
Publication of WO2012119548A1

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup

Definitions

  • Control method, control device, display device, and electronic device
  • Embodiments of the invention relate to the field of electronic devices. More particularly, the present invention relates to a control method and corresponding control device applied to an electronic device, a method of moving a display object applied to the electronic device, and a corresponding display device, and an electronic device. Background technique
  • a display screen and a touch sensing unit are usually stacked to form a touch display screen.
  • the touch sensing unit is configured to receive input from a user to facilitate user operations.
  • the touch sensing unit may include a touch sensing area that may be comprised of a sensor element such as a capacitive touch sensor or a resistive touch sensor.
  • the user of such an electronic device contacts the touch display screen through a certain operating body such as a fingertip or a pen tip.
  • the user can perform actions such as clicking, double-clicking, dragging, etc. on the touch control area of the electronic device to implement the corresponding control functions.
  • the electronic device senses the contact of the operating body with the touch display screen through the touch sensing unit, and performs a corresponding operation in response to such contact.
  • the user needs to move a display object (such as a slider) from a first position on the touch display screen to a second position.
  • the stroke of the sliding gesture is equal to the distance between the second position and the first position. This is a relatively long distance.
  • it does not meet the user's operating habits, which is likely to cause user fatigue and result in poor user experience.
  • in order to see content that is not currently displayed on the touch display screen, in one implementation the user needs to keep making swipe gestures until the desired content is displayed on the touch display screen. In the case of a long stroke, this implementation is also prone to user fatigue, resulting in a poor user experience.
  • in another implementation, a quick swipe gesture is required to cause the content displayed on the touch display screen to scroll quickly and then gradually stop scrolling with a certain negative acceleration. In this implementation, the user cannot intuitively recognize the relationship between the speed of the gesture and the scrolling speed of the content while making the gesture, so the desired content cannot be accurately displayed on the touch display screen.
  • One embodiment of the present invention provides a control method for use in an electronic device.
  • the electronic device includes a sensing unit, wherein the sensing unit has a first detection distance and a second detection distance. Specifically, the first detection distance and the second detection distance are distances with respect to the electronic device, and the second detection distance is smaller than the first detection distance.
  • the control method includes: detecting a three-dimensional motion trajectory when the relative distance of the operating body with respect to the electronic device is less than the first detection distance, the three-dimensional motion trajectory being a continuous motion trajectory with one end located between the first detection distance and the second detection distance and the other end at a distance equal to or smaller than the second detection distance; generating a state change signal when the operating body reaches or leaves the second detection distance of the sensing unit; dividing the three-dimensional trajectory according to the state change signal to obtain a first trajectory for the portion where the relative distance is greater than the second detection distance and a second trajectory for the portion where the relative distance is equal to or smaller than the second detection distance; and executing a corresponding control command according to the first trajectory and the second trajectory.
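  • For illustration only, the division of the three-dimensional trajectory at the second detection distance described above might be sketched as follows; the names TrackPoint, D1, D2 and the sample values are assumptions made for the sketch, not identifiers taken from the disclosure.

```python
# Illustrative sketch; TrackPoint, D1, D2 and the sample trajectory are assumptions.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TrackPoint:
    x: float   # projected position on the sensing area
    y: float
    z: float   # relative distance of the operating body from the electronic device

def split_trajectory(points: List[TrackPoint], d2: float) -> Tuple[List[TrackPoint], List[TrackPoint]]:
    """Divide the 3D trajectory at the second detection distance: the first trajectory is
    the in-air portion (distance above d2), the second is the portion at or below d2."""
    first = [p for p in points if p.z > d2]
    second = [p for p in points if p.z <= d2]
    return first, second

def state_change_indices(points: List[TrackPoint], d2: float) -> List[int]:
    """Indices where the operating body reaches or leaves the second detection distance."""
    return [i for i in range(1, len(points))
            if (points[i - 1].z > d2) != (points[i].z > d2)]

# Usage with D1 = 10 mm and D2 = 0 (contact), as in one example given later in the text.
D1, D2 = 10.0, 0.0
trajectory = [TrackPoint(0, 0, 8), TrackPoint(2, 0, 4), TrackPoint(3, 0, 0), TrackPoint(3, 0, 0)]
trajectory = [p for p in trajectory if p.z < D1]           # only within the first detection distance
first_track, second_track = split_trajectory(trajectory, D2)
print(state_change_indices(trajectory, D2), len(first_track), len(second_track))
```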
  • Another embodiment of the present invention provides a control device. The control device includes: a sensing unit having a first detection distance and a second detection distance, configured to detect a three-dimensional motion trajectory when the relative distance of the operating body with respect to the electronic device is less than the first detection distance, wherein the first detection distance and the second detection distance are distances with respect to the electronic device, the second detection distance is smaller than the first detection distance, and the three-dimensional motion trajectory is a continuous motion trajectory with one end located between the first detection distance and the second detection distance and the other end at a distance equal to or smaller than the second detection distance;
  • a state detecting unit, configured to generate a state change signal when the operating body reaches or leaves the second detection distance of the sensing unit;
  • a track dividing unit, configured to divide the three-dimensional trajectory according to the state change signal to obtain a first trajectory for the portion where the relative distance is greater than the second detection distance and a second trajectory for the portion where the relative distance is equal to or smaller than the second detection distance; and an execution unit, configured to execute a corresponding control command according to the first trajectory and the second trajectory.
  • Another embodiment of the present invention provides an electronic device including: a sensing unit and a processing unit.
  • the sensing unit has a first detection distance and a second detection distance, and is configured to detect a three-dimensional motion trajectory when the relative distance of the operating body with respect to the electronic device is less than the first detection distance, wherein the first detection distance and the second detection distance are distances with respect to the electronic device, the second detection distance is smaller than the first detection distance, and the three-dimensional motion trajectory is a continuous motion trajectory with one end located between the first detection distance and the second detection distance and the other end at a distance equal to or smaller than the second detection distance.
  • the processing unit includes: a state detecting module, configured to generate a state change signal when the operating body reaches or leaves the second detection distance of the sensing unit; a track dividing module, configured to divide the three-dimensional trajectory according to the state change signal to obtain a first trajectory for the portion where the relative distance is greater than the second detection distance and a second trajectory for the portion where the relative distance is equal to or smaller than the second detection distance; and an execution module, configured to execute a corresponding control command according to the first trajectory and the second trajectory.
  • with the above arrangements, the three-dimensional trajectory of the operating body while it approaches or leaves the electronic device can be detected, and the control command is executed by combining this trajectory with the operation performed by the operating body on the sensing unit or at a predetermined height from the electronic device.
  • the present invention further provides a method of moving a display object, a display device, and an electronic device, which allow a user to operate with a simple gesture conforming to the user's operating habits, thereby improving the user experience.
  • According to another embodiment of the present invention, a method of moving a display object is provided, which is applied to an electronic device. The electronic device includes a display unit, and the display area of the display unit includes an object display area for displaying the display object, the display object and the object display area being in a first relative positional relationship. The method comprises: sensing a first operation of an operating body relative to the electronic device; obtaining, according to the first operation, a first distance between the operating body and the electronic device; determining whether the first distance is less than or equal to a first detection distance; when it is determined that the first distance is less than or equal to the first detection distance, obtaining, according to the first operation, a first operating parameter and a second operating parameter of the operating body, the first operating parameter being different from the second operating parameter; changing, based on the first operating parameter and the second operating parameter, the relationship between the display object and the object display area from the first relative positional relationship to a second relative positional relationship; sensing a second operation of the operating body relative to the electronic device; obtaining, according to the second operation, a second distance between the operating body and the electronic device; determining whether the second distance is greater than the first detection distance and less than or equal to a second detection distance; when it is determined that the second distance is greater than the first detection distance and less than or equal to the second detection distance, obtaining, according to the second operation, a third operating parameter of the operating body different from the first operating parameter; and changing, based on the first operating parameter and the third operating parameter, the relationship between the display object and the object display area from the second relative positional relationship to a third relative positional relationship.
  • According to another embodiment of the present invention, there is provided a display device, wherein a display area of the display device includes an object display area for displaying a display object, the display object and the object display area being in a first relative positional relationship. The display device includes: a first sensing unit that senses a first operation of an operating body relative to the display device; a first obtaining unit that obtains, according to the first operation, a first distance between the operating body and the display device; a first determining unit that determines whether the first distance is less than or equal to a first detection distance; a second obtaining unit that, when it is determined that the first distance is less than or equal to the first detection distance, obtains, according to the first operation, a first operating parameter and a second operating parameter of the operating body, the first operating parameter being different from the second operating parameter; a first changing unit that, based on the first operating parameter and the second operating parameter, changes the relationship between the display object and the object display area from the first relative positional relationship to a second relative positional relationship; and a second sensing unit that senses a second operation of the operating body relative to the display device.
  • According to still another embodiment of the present invention, an electronic device includes: a display whose display area includes an object display area, the object display area being for displaying a display object, the display object and the object display area being in a first relative positional relationship; a first sensor sensing a first operation of an operating body relative to the electronic device; and a second sensor sensing a second operation of the operating body relative to the electronic device.
  • the electronic device further includes a processor configured to: obtain a first distance between the operating body and the electronic device according to the first operation; determine whether the first distance is smaller than a first detection distance; when it is determined that the first distance is smaller than the first detection distance, obtain, according to the first operation, a first operating parameter and a second operating parameter of the operating body, the first operating parameter being different from the second operating parameter; change, based on the first operating parameter and the second operating parameter, the relative positional relationship between the display object and the object display area to a second relative positional relationship; obtain, according to the second operation, a second distance between the operating body and the electronic device; and determine whether the second distance is greater than the first detection distance and less than or equal to a second detection distance.
  • with the method, the display device, and the electronic device according to the embodiments of the present invention, by combining contact detection and proximity detection, the user can conveniently control the display device or the electronic device to perform the appropriate operations with a simple gesture that conforms to the user's operating habits, thereby improving the user experience.
  • FIG. 1 is a flow chart describing a control method according to an embodiment of the present invention;
  • FIGS. 2a-2d are explanatory diagrams showing exemplary situations in which a finger approaches or leaves an electronic device according to an embodiment of the present invention;
  • FIG. 3 is a flow chart illustrating the steps of a method for moving a display object according to an embodiment of the present invention;
  • FIG. 4A is an explanatory diagram showing an exemplary case of detecting the movement trajectory of an operating body between the first detection distance and the second detection distance by an infrared sensing element;
  • FIG. 4B is a schematic diagram illustrating the proximity detection principle of a capacitive touch screen;
  • FIGS. 5A-5C are schematic diagrams showing display on an electronic device according to an embodiment of the present invention;
  • FIGS. 5D and 5E are schematic views respectively illustrating a boundary state and an over-boundary state according to an embodiment of the present invention;
  • FIGS. 6A-6C are diagrams showing display on an electronic device according to an embodiment of the present invention;
  • FIGS. 6D and 6E are schematic views respectively illustrating a boundary state and an over-boundary state according to an embodiment of the present invention;
  • FIG. 7 is a block diagram showing an exemplary structure of a control device according to an embodiment of the present invention;
  • FIG. 8 is a block diagram showing an exemplary configuration of a control device according to another embodiment of the present invention;
  • FIG. 9 is a block diagram illustrating a main configuration of a display device according to an embodiment of the present invention;
  • FIG. 10 is a block diagram illustrating a more detailed configuration of a display device according to an embodiment of the present invention;
  • FIG. 11 is a block diagram showing an exemplary structure of an electronic device according to an embodiment of the present invention; and
  • FIG. 12 is a block diagram illustrating a configuration of an electronic device according to another embodiment of the present invention.
  • an electronic device refers to a device that is capable of communicating with other devices.
  • Specific forms of electronic devices include, but are not limited to, mobile phones, personal computers, digital cameras, personal digital assistants, portable computers, tablet computers, gaming machines, and the like.
  • the electronic device includes a sensing unit.
  • the sensing unit can include a proximity sensor.
  • the sensing unit can include one or more of an inductive sensing element such as an infrared sensing element, an ultrasonic sensing element, a capacitive sensing element, and the like.
  • the sensing unit has a first detection distance and a second detection distance, the first detection distance and the second detection distance being distances with respect to the electronic device, and the second detection distance being smaller than the first detection distance.
  • for example, the first detection distance can be set to 10 mm and the second detection distance can be set to 0. That is, the first detection distance is a height of 10 mm with respect to the electronic device, the second detection distance is the plane where the touch sensing area of the electronic device is located, and the second detection distance is reached when the user's finger touches the touch sensing area.
  • the sensing unit may include a first sensing unit and a second sensing unit to respectively detect the movement trajectory of the operating body when it is in contact with the electronic device and the movement trajectory of the operating body when it is not in contact with the electronic device.
  • the first sensing unit may include a touch sensing area composed of various known sensor elements such as a capacitive touch sensor or a resistive touch sensor.
  • the second sensing unit can detect a motion trajectory of the operating body between the first distance and the second distance.
  • the second sensing unit may include a plurality of ultrasonic sensing elements, infrared sensing elements or image acquisition elements disposed at different locations of the electronic device to determine the position of the operating body.
  • FIG. 4(A) is an explanatory diagram showing an exemplary case of detecting a movement trajectory of the operating body between the first detection distance and the second detection distance by the infrared sensing element.
  • the first sensing module (not shown) of the electronic device 400 includes infrared emitting ends 410 and infrared receiving ends 420 respectively disposed on the left and right sides (alternatively, the upper and lower sides) of the electronic device.
  • infrared light is emitted from the infrared emitting ends 410, which are arranged at predetermined intervals along the left edge of the electronic device 400, toward the infrared receiving ends 420. Therefore, when the operating body enters the detection range of the first sensing module, the motion trajectory of the operating body can be determined according to the light beams occluded by the operating body.
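  • As a rough illustration of how the occluded beams could be turned into a position estimate, the following sketch assumes a fixed beam spacing and a centroid over the blocked beams; these particular choices are not specified in the text.

```python
# Illustrative sketch; the beam spacing and the centroid rule are assumptions.
from typing import List, Optional

BEAM_SPACING_MM = 5.0  # assumed spacing between adjacent emitter/receiver pairs

def occluded_position(beam_received: List[bool]) -> Optional[float]:
    """Estimate the position (mm along the edge) of the operating body from blocked beams."""
    blocked = [i for i, received in enumerate(beam_received) if not received]
    if not blocked:
        return None
    return sum(blocked) / len(blocked) * BEAM_SPACING_MM  # centroid of the blocked beams

# Each frame is the receive state of every beam; successive frames give the motion trajectory.
frames = [
    [True, True, False, True],   # beam 2 blocked
    [True, False, False, True],  # beams 1 and 2 blocked
    [False, True, True, True],   # beam 0 blocked
]
print([occluded_position(f) for f in frames])  # [10.0, 7.5, 0.0]
```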
  • the second sensing module can detect the motion trajectory of the operating body at the second detection distance.
  • the second sensing module can include a touch sensitive element such as a pressure sensitive, electrostatic touch pad or capacitive touch pad disposed on the electronic device.
  • the second sensing module is arranged to overlap with the display area of the electronic device.
  • the projection of the detection area of the first sensing module on the electronic device is the same as the detection area of the second sensing module on the electronic device.
  • Fig. 4(B) is an explanatory diagram showing an exemplary case of detecting an operating body approaching the electronic device through the capacitive touch screen 400'.
  • the electric field coupling of the capacitive touch screen 400' covers the space between the touch screen and a distance L1 above it, thereby forming an effective detection area in the space within the distance L1 from the touch screen.
  • when a finger enters this effective detection area, a detectable coupling current I of sufficient magnitude is generated.
  • the controller of the capacitive touch screen 400' can calculate the precise position of the finger and the distance of the finger from the screen according to the current values detected at the electrodes. Further, the controller is also capable of detecting whether the finger touches the surface of the screen.
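  • Purely to illustrate the idea that per-electrode coupling currents yield both a position and a height, a toy estimator is sketched below; the weighted-centroid rule and the inverse current-to-height relation are simplifying assumptions, not the controller's actual algorithm.

```python
# Toy sketch; the centroid rule and the constant k are assumptions.
from typing import List, Tuple

def estimate_position_and_height(currents: List[float],
                                 electrode_x_mm: List[float],
                                 k: float = 100.0) -> Tuple[float, float]:
    total = sum(currents)
    if total <= 0:
        raise ValueError("no coupling detected")
    x = sum(i * x_mm for i, x_mm in zip(currents, electrode_x_mm)) / total  # weighted centroid
    height = k / total  # assumed: total coupling current grows as the finger approaches
    return x, height

print(estimate_position_and_height([0.25, 0.5, 0.25], [0.0, 5.0, 10.0]))  # (5.0, 100.0)
```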
  • FIG. 1 is a flow chart depicting a control method 100 in accordance with an embodiment of the present invention.
  • the control method 100 can be used for an electronic device including the above-described sensing unit. As shown in FIG. 1, in step S101, a three-dimensional motion trajectory when the relative distance of the operating body with respect to the electronic device is smaller than the first detection distance is detected.
  • the three-dimensional motion trajectory of the operating body is a continuous motion trajectory in which one end is located between the first detection distance and the second detection distance and the other end is equal to or smaller than the second detection distance.
  • for example, the operating body may move from the first detection distance to the second detection distance, travel a trajectory at the second detection distance, and then return from the second detection distance to the first detection distance.
  • the start point and the end point of the three-dimensional motion trajectory can be determined according to the requirements of the current operation. That is, depending on the current operation, the three-dimensional motion trajectory may be the trajectory of the operating body reaching the second detection distance from the first detection distance and then traveling at the second detection distance, or the trajectory of the operating body traveling at the second detection distance and then returning from the second detection distance to the first detection distance.
  • accordingly, in step S101, when the operating body reaches the first detection distance, detection of its three-dimensional motion trajectory starts and continues until the operating body reaches the second detection distance and then leaves it; or, when the operating body is at or below the second detection distance, its three-dimensional motion trajectory is detected until the operating body returns from the second detection distance to the first detection distance.
  • for example, the position where the user's finger touches the touch sensing area of the electronic device can be used as a reference point. When the projection, on the touch sensing area, of the finger's in-air position before it touches the reference point lies on the left side of the touch point (i.e., the user approaches the reference point from the left side), this may correspond to a command to page forward; when the projection lies on the right side of the touch point, this may correspond to a command to page backward.
  • for example, the second detection distance may be set to zero; the end of the trajectory located between the first detection distance and the second detection distance, before the operating body reaches the second detection distance, is the starting point of the three-dimensional motion trajectory, and the position at which the motion terminates at the second detection distance, or the position at which the operating body leaves the second detection distance after traveling a trajectory at the second detection distance, is the end point of the three-dimensional motion trajectory.
  • as another example, the position where the user's finger touches the touch sensing area of the electronic device may be taken as a reference point. When the projection, on the touch sensing area, of the finger's in-air position after it leaves the reference point lies above the reference point, this may correspond to a command to reduce the currently displayed image; and when the projection lies below the reference point, this may correspond to a command to enlarge the currently displayed image.
  • as another example, the second detection distance may be set to zero, the position at which the operating body starts to touch at the second detection distance is the starting point of the three-dimensional motion trajectory, and the position at which it reaches the first detection distance is the end point of the three-dimensional motion trajectory.
  • for example, in a handwriting application, when the finger is raised after touching, the last stroke can be rendered according to the projected position of the finger in the display area and according to the distance between the finger and the electronic device.
  • that is, the user's finger gradually lifts away from the touch sensing area while writing the last stroke, and the last stroke becomes thinner as the height of the finger above the touch sensing area increases.
  • a reference time may also be set.
  • alternatively, the thickness of the entire character written before may be thinned as the finger moves away from the electronic device; when the finger stays at a height lower than the first detection distance for longer than the reference time, the thickness of the entire character is fixed at the thickness corresponding to the height of the finger at that time.
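  • A minimal sketch of the stroke-tapering idea described above is given below; the linear height-to-thickness mapping and the pixel values are illustrative assumptions.

```python
# Illustrative sketch; the linear taper and the thickness values are assumptions.
def stroke_thickness(height_mm: float, d1_mm: float = 10.0,
                     base_px: float = 6.0, min_px: float = 1.0) -> float:
    """Full thickness at contact (height 0), thinning toward min_px at the first detection distance."""
    ratio = max(0.0, min(1.0, height_mm / d1_mm))
    return base_px - (base_px - min_px) * ratio

print([stroke_thickness(h) for h in (0.0, 5.0, 10.0)])  # [6.0, 3.5, 1.0]
```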
  • as yet another example, the second detection distance may be set to zero, the starting position of the operating body at the second detection distance is the starting point of the three-dimensional motion trajectory, and the end position of the operating body between the first detection distance and the second detection distance is the end point; or, in the case where the operating body moves all the way upward and its distance from the electronic device exceeds the first detection distance, the position of the operating body at the first detection distance is the end point of the three-dimensional motion trajectory.
  • the first detection distance may be set in advance by the device provider or the user, and alternatively, the first detection distance may be set according to the operation of the user. Specifically, when it is detected that the operating body is in a fixed position for more than a predetermined time, the distance of the fixed position with respect to the electronic device can be set as the first detection distance.
  • the second detection distance may be set in advance by the device provider or the user, and alternatively, the second detection distance may be set according to the operation of the user. Specifically, when it is detected that the time when the operating body is located at a fixed position relative to the electronic device that is smaller than the first detecting distance exceeds the predetermined time, the distance of the fixed position with respect to the electronic device is set as the second detecting distance. As described above, the second detection distance can be set to zero. That is, the electronic device can be touched when the operating body is at the second detection distance.
  • in step S102, a state change signal is generated when the operating body reaches or leaves the second detection distance of the sensing unit.
  • in step S103, the three-dimensional trajectory is divided according to the state change signal generated in step S102, to obtain a first trajectory for the portion where the relative distance is greater than the second detection distance and a second trajectory for the portion where the relative distance is equal to or smaller than the second detection distance.
  • the second trajectory may be a single point located at the second detection distance with respect to the electronic device, or a continuous motion trajectory composed of a plurality of points at distances equal to or lower than the second detection distance.
  • in step S104, a corresponding control command is executed in accordance with the first trajectory and the second trajectory.
  • the control command group can be determined based on the second trajectory divided in step S103.
  • a corresponding control command is then selected in the determined set of control commands based on the first trajectory, and the selected control command is executed.
  • the second trajectory when the user touches the volume control button on the touch screen may correspond to the volume adjustment command group.
  • the volume adjustment command group can include increasing the volume, decreasing the volume, muting, and maximizing the volume.
  • when the projection, on the touch sensing area, of the finger's in-air position after it leaves the volume control button displayed on the touch screen lies above the volume control button, the first trajectory may correspond to maximizing the volume; when the projection lies below the volume control button, the first trajectory may correspond to muting; when the projection lies on the left side of the volume control button, the first trajectory may correspond to decreasing the volume; and when the projection lies on the right side of the volume control button, the first trajectory may correspond to increasing the volume.
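  • The volume-control example might be expressed, very roughly, as the sketch below; the command names, the coordinate convention (y increasing upward), and the simple four-way classification of the in-air end point are assumptions made for the sketch.

```python
# Illustrative sketch: the second trajectory (touched control) selects a command group,
# and the first trajectory (in-air portion) selects a command within that group.
from typing import Dict, List, Tuple

Point = Tuple[float, float]  # projected (x, y) on the touch sensing area, y increasing upward

VOLUME_COMMANDS: Dict[str, str] = {
    "above": "maximize volume",
    "below": "mute",
    "left":  "decrease volume",
    "right": "increase volume",
}

def classify(button: Point, air_end: Point) -> str:
    dx, dy = air_end[0] - button[0], air_end[1] - button[1]
    if abs(dy) >= abs(dx):
        return "above" if dy > 0 else "below"
    return "right" if dx > 0 else "left"

def execute(touched_control: str, button_pos: Point, first_trajectory: List[Point]) -> str:
    if touched_control != "volume_button":             # second trajectory -> command group
        return "no-op"
    side = classify(button_pos, first_trajectory[-1])  # first trajectory -> command in group
    return VOLUME_COMMANDS[side]

print(execute("volume_button", (50.0, 50.0), [(50.0, 70.0), (50.0, 80.0)]))  # maximize volume
```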
  • with the control method of this embodiment, by detecting the three-dimensional trajectory of the operating body while it approaches or leaves the electronic device, and combining that trajectory with the operation performed by the operating body on the sensing unit or at a predetermined height from the electronic device to execute the control command, a more precise touch operation is enabled, resulting in a better interactive experience.
  • FIGS. 2a-2d are explanatory diagrams showing an exemplary situation in which a finger approaches or leaves an electronic device.
  • exemplary situations in which a finger approaches or leaves an electronic device according to an embodiment of the present invention will be described with reference to Figs. 2a to 2d. It is assumed that the second detection distance is set to zero in the embodiments shown in Figs. 2a-2d.
  • in Figs. 2a and 2b, the three-dimensional motion trajectories have different first trajectories and the same second trajectory.
  • as shown in Fig. 2a, the finger first descends below the first detection distance at point A1, curves in a clockwise direction, and then drops vertically to touch point B1 on the electronic device 200a, where the motion terminates.
  • a state change signal is generated when the finger touches point B1.
  • accordingly, the first trajectory is the trajectory that curves in the clockwise direction from the first detection distance and then descends vertically until reaching point B1 (depending on design needs, the first trajectory may include the critical point at which the finger reaches or leaves the second detection distance), and since the finger does not move on the electronic device 200a after reaching it, the second trajectory is the point B1.
  • for example, rotating the currently displayed image in the clockwise direction can be determined according to the first trajectory shown in Fig. 2a.
  • as shown in Fig. 2b, the finger first descends below the first detection distance at point A2, curves in a counterclockwise direction, and then drops vertically to touch point B2 on the electronic device 200b, where the motion terminates.
  • a state change signal is generated when the finger touches point B2.
  • accordingly, the first trajectory is the trajectory that curves in the counterclockwise direction from the first detection distance and then descends vertically until reaching point B2 (depending on design needs, the first trajectory may include the critical point at which the finger reaches or leaves the second detection distance), and the second trajectory is the point B2.
  • for example, rotating the currently displayed image in the counterclockwise direction can be determined according to the first trajectory shown in Fig. 2b.
  • in Figs. 2c and 2d, the three-dimensional motion trajectory is one whose relative distance to the electronic device monotonically decreases or monotonically increases. That is, it is a continuous motion trajectory in which the operating body approaches from the first detection distance and reaches a distance equal to or smaller than the second detection distance, or leaves from a distance equal to or smaller than the second detection distance and reaches the first detection distance.
  • Fig. 2c shows a continuous motion trajectory in which the finger approaches from the first detection distance and reaches the second detection distance. As shown in Fig. 2c, the finger moves from point A3 in the air to point B3 on the electronic device 200c in the direction indicated by the arrow. According to the method shown in Fig. 1, a state change signal is generated when the finger touches point B3.
  • the first trajectory is the vector from point A3 to point B3, and since the finger does not move on the electronic device 200c after reaching it, the second trajectory is the point B3.
  • for example, the direction and distance by which to move the currently displayed image can be determined based on the vector between points A3 and B3.
  • Fig. 2d shows a continuous motion trajectory in which the finger leaves from the second detection distance and reaches the first detection distance.
  • as shown in Fig. 2d, the finger moves from point B4 on the electronic device 200d to point A4 in the air in the direction indicated by the arrow.
  • a state change signal is generated when the finger leaves point B4.
  • the first trajectory is the vector between points B4 and A4, and the second trajectory is the point B4.
  • for example, the group of operation commands corresponding to the point B4 on the electronic device 200d may be a group of commands for moving the currently displayed image, and the direction in which to move the currently displayed image is determined based on the vector between points B4 and A4.
  • although the second detection distance of zero is described as an example in the above embodiments, in an alternative embodiment of the present invention the second detection distance may be greater than zero, that is, there is a certain distance between the second detection distance and the electronic device, and the operating body does not have to touch the electronic device.
  • FIG. 3 illustrates a method 300 for moving a display object in accordance with an embodiment of the present invention.
  • the method 300 is applied to an electronic device.
  • the electronic device is such as a portable mobile terminal, a personal computer, or the like.
  • the electronic device includes a display unit, and the display area of the display unit includes an object display area for displaying a display object.
  • the display object includes, for example, a web page, a picture, a list, or other various display controls (eg, a slider).
  • the area of the object display area may be smaller than the area of the display area, so that background information other than the object display area is displayed in the display area.
  • the area of the object display area may also be equal to the area of the display area. For example, in the schematic diagrams shown in FIGS. 5A-5E and 6A-6E, the rectangular area S1 drawn in solid lines is the display area of the electronic device, and the rectangular area S2 drawn in dotted lines inside the display area is the object display area in the display area.
  • the list object in FIG. 6A includes "Contact 1", "Contact 2" ... "Contact 10", ten items in total, but since its area is larger than the area of the object display area, only three of its items, "Contact 4", "Contact 5", and "Contact 6", are displayed in the object display area.
  • the electronic device can include both a touch sensor and a proximity sensor.
  • the contact sensor is for sensing contact of an operating body with the electronic device
  • the proximity sensor is for sensing proximity of an operating body to the electronic device.
  • the electronic device may also include only proximity sensors.
  • the proximity sensor is designed to both sense the proximity of the operating body to the electronic device, and can also be used to sense the contact of the operating body with the electronic device.
  • hereinafter, they will be collectively referred to as a "sensor" without particularly distinguishing between the two.
  • the display unit and the sensing unit composed of the sensor may be stacked one above another, or may not be stacked, as long as there is a mapping relationship between the display unit and the sensing unit.
  • the display object and the object display area are in a first relative positional relationship.
  • in one case, the first relative positional relationship between the display object and the object display area means that the display object is displayed at a first position of the object display area.
  • a shaded rectangle shows a slider object, with the position of its upper left vertex being the position of the display object.
  • the display object is located at the first position P1 of the object display area.
  • any other point in the display object can be set as the position of the display object.
  • in another case, the first relative positional relationship between the display object and the object display area means that a first portion of the display object is displayed in the object display area.
  • for example, as shown in FIG. 6A, only the first portion of the list object of ten items, namely the three items "Contact 4", "Contact 5", and "Contact 6", is shown in the object display area.
  • in step S301, the method senses a first operation of an operating body with respect to the electronic device.
  • the operating body is, for example, a fingertip of a finger operated by a user, a pen tip of a stylus, or the like.
  • specifically, the method senses, by using a sensor included in the electronic device, a first series of track points composed of a plurality of track points formed by the operating body on the electronic device, thereby sensing the first operation performed by the operating body on the electronic device.
  • in other words, the operating body performs the first operation as a sliding operation.
  • in step S302, the method obtains the first distance between the operating body and the electronic device according to the sensing result of the sensor.
  • in step S303, the method determines whether the first distance is less than or equal to the first detection distance.
  • the first detection distance is a threshold set by the method to identify the first operation as a contact operation. Its value can be appropriately set by those skilled in the art as needed.
  • for example, the first detection distance can be set to the thickness of the film encapsulating the sensor.
  • alternatively, the first detection distance may be set to the thickness of a protective layer.
  • when it is judged that the first distance is greater than the first detection distance, the method ends. On the other hand, when it is judged that the first distance is less than or equal to the first detection distance, the method proceeds to step S304.
  • in step S304, the method obtains the first operating parameter and the second operating parameter of the operating body according to the first operation.
  • the method is capable of obtaining a first series of trajectory points of the first operation by the sensor. Through the first series of track points, the method can obtain a first operational parameter and a second operational parameter of the operational body.
  • the method may obtain the moving direction of the first operation as the first operating parameter by using the positions of the first track point and the last track point in the first series of track points.
  • alternatively, the method may obtain the real-time moving direction of the first operation as the first operating parameter by using any two (adjacent) track points of the first series of track points.
  • the method may also obtain a second operational parameter that is different from the first operational parameter.
  • for example, the method may obtain the moving distance of the first operation as the second operating parameter by using the positions of the first track point and the last track point of the first series of track points.
  • the method may obtain an operation time of the first operation as the second operation parameter by timing of a system clock in the electronic device.
  • further alternatively, the method may obtain the speed of the first operation as the second operating parameter by using the positions of the first track point and the last track point in the first series of track points and the timing of the system clock; or the method may obtain the real-time speed of the first operation as the second operating parameter by using any two (adjacent) track points of the first series of track points and the timing of the system clock of the electronic device.
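  • The parameters listed above (moving direction, moving distance, operation time, speed) might be derived from the sensed track points as in the sketch below; the TrackSample structure and the use of only the first and last points are assumptions.

```python
# Illustrative sketch; TrackSample and the endpoint-only computation are assumptions.
import math
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class TrackSample:
    x: float
    y: float
    t: float  # timestamp from the system clock, in seconds

def operation_parameters(track: List[TrackSample]) -> Dict[str, float]:
    first, last = track[0], track[-1]
    dx, dy = last.x - first.x, last.y - first.y
    distance = math.hypot(dx, dy)            # moving distance
    duration = last.t - first.t              # operation time
    direction = math.atan2(dy, dx)           # moving direction (the first operating parameter)
    speed = distance / duration if duration > 0 else 0.0
    return {"direction": direction, "distance": distance, "time": duration, "speed": speed}

print(operation_parameters([TrackSample(0, 0, 0.0), TrackSample(30, 0, 0.2)]))
```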
  • in step S305, the method changes the first relative positional relationship between the display object and the object display area to a second relative positional relationship based on the first operating parameter and the second operating parameter.
  • the method first converts the first operating parameter and the second operating parameter into a first moving parameter and a second moving parameter of the display object, respectively.
  • the method converts a moving direction of the first operation into a moving direction of the display object as the first moving parameter.
  • the method may set a direction that coincides with or is opposite to a moving direction of the first operation as a moving direction of the display object.
  • the moving direction of the display object is the same as the moving direction of the first operation.
  • the method converts the moving distance of the first operation into a first moving distance of the display object as the second moving parameter.
  • for example, the method may obtain the first moving distance of the display object by adding a coefficient to the moving distance of the first operation.
  • the coefficient can be appropriately set by those skilled in the art as needed.
  • the coefficient can be set to zero.
  • the above conversion method is only an example. Those skilled in the art can understand that the method may also obtain the first moving distance of the display object as the second movement parameter by converting the moving distance of the first operation by other means (such as multiplying by a coefficient, or according to a piecewise function or the like).
  • the method converts an operation time of the first operation into a first movement time of the display object as the second movement parameter.
  • the method sets the operation time of the first operation to the first movement time of the display object.
  • alternatively, the first moving time of the display object may be obtained by a person skilled in the art through an arithmetic operation on the operation time of the first operation, which is not specifically limited here.
  • further alternatively, the method converts the speed of the first operation into the first moving speed of the display object as the second movement parameter.
  • for example, the method may preset a piecewise function: when the speed of the first operation is in [0, v1), the method sets the first moving speed of the display object to V1; when the speed of the first operation is in [v1, v2), the method sets the first moving speed of the display object to V2; ...; and when the speed of the first operation is in [v(n-1), vn), the method sets the first moving speed of the display object to Vn.
  • the values of v1, v2, ..., vn and V1, V2, ..., Vn can be appropriately set by those skilled in the art as needed.
  • of course, those skilled in the art can understand that the method may also obtain the first moving speed of the display object as the second movement parameter by converting the speed of the first operation by other means (such as multiplying by a coefficient, or adding a coefficient, etc.).
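  • The two conversions mentioned above (adding a coefficient to the moving distance, and a piecewise mapping from operation speed to the first moving speed) are sketched below with placeholder thresholds and output speeds.

```python
# Illustrative sketch; the coefficient, the thresholds v1..v3 and the speeds V1..V3 are placeholders.
from bisect import bisect_right
from typing import Sequence

def convert_distance(op_distance: float, coefficient: float = 0.0) -> float:
    return op_distance + coefficient  # coefficient 0 makes the object follow the operation exactly

def convert_speed(op_speed: float,
                  thresholds: Sequence[float] = (50.0, 150.0, 400.0),   # v1 < v2 < v3
                  speeds: Sequence[float] = (100.0, 300.0, 800.0)) -> float:
    # op_speed in [0, v1) -> speeds[0]; [v1, v2) -> speeds[1]; [v2, v3) and above -> speeds[2]
    return speeds[min(bisect_right(thresholds, op_speed), len(speeds) - 1)]

print(convert_distance(120.0), convert_speed(200.0))  # 120.0 800.0
```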
  • the method moves the display object based on the first movement parameter and the second movement parameter, such that the display object and the object display area change from a first relative positional relationship to a second relative Positional relationship.
  • the method can compare the area of the display object with the size of the area of the object display area.
  • the method is based on a moving direction of the display object, and is further based on a first moving distance, a first moving time, and a first movement of the display object At least one of the speeds, moving the display object from the first position to a second position of the object display area, wherein the second position is different from the first position.
  • for example, the method moves the display object from the first position (P1) shown in FIG. 5A to the position (P2) shown in FIG. 5B, based on the obtained first movement parameter (to the right) and the second movement parameter.
  • the method is based on the moving direction of the display object, and is further based on the first moving distance of the display object, the first movement
  • the display object is moved at least one of the time and the first moving speed to display the second portion of the display object in the object display area.
  • the second portion at most partially overlaps with the first portion.
  • for example, when the first moving distance is long, the first moving time is long, or the first moving speed is fast, the second portion and the first portion may not overlap; when the first moving distance is short or the first moving speed is slow, the second portion partially overlaps the first portion.
  • for example, as shown in FIG. 6B, the method moves the display object based on the obtained first movement parameter (upward) and the second movement parameter, so that the second portion of the display object ("Contact 5", "Contact 6", and "Contact 7") is displayed in the object display area.
  • the two items "Contact 5" and "Contact 6" are included in the first portion of the display object shown in FIG. 6A and also in the second portion of the display object shown in FIG. 6B; that is, the first portion and the second portion partially overlap.
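  • The contact-list example above can be pictured with the small sketch below; the item height and the three-item window are assumed values used only to show how the visible portion changes and overlaps as the object is moved.

```python
# Illustrative sketch; the item height and window size are assumptions.
ITEM_HEIGHT_PX = 40   # assumed height of one list item
WINDOW_ITEMS = 3      # the object display area shows three items at a time

contacts = [f"Contact {i}" for i in range(1, 11)]  # the ten-item list object

def visible_items(offset_px: int):
    first_index = offset_px // ITEM_HEIGHT_PX
    return contacts[first_index:first_index + WINDOW_ITEMS]

print(visible_items(120))  # ['Contact 4', 'Contact 5', 'Contact 6']  (first portion)
print(visible_items(160))  # ['Contact 5', 'Contact 6', 'Contact 7']  (second portion, overlapping)
```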
  • in step S306, the method senses a second operation of the operating body with respect to the electronic device. Specifically, the method senses, by using a sensor (specifically, a proximity sensor), a second series of track points formed by the operating body over the electronic device, thereby sensing the second operation performed by the operating body on the electronic device. In other words, in the operation of step S306, the operating body is not in contact with the electronic device but is only in proximity to it.
  • in step S307, the method obtains the second distance between the operating body and the electronic device according to the sensing result of the proximity sensor.
  • the method obtains a second distance between the operating body and the electronic device at a predetermined timing.
  • the predetermined timing is appropriately determined by those skilled in the art as needed.
  • the predetermined timing may be set to 3ms, 5ms, or the like.
  • the method obtains in real time the real-time distance between the operating body and the electronic device in the second operation.
  • in step S308, the method determines whether the second distance is greater than the first detection distance and less than or equal to the second detection distance.
  • the second detection distance is a threshold set by the method to identify the second operation as a proximity operation.
  • the value is appropriately set by a person skilled in the art according to the accuracy of the electronic device, and is not specifically limited herein.
  • when it is judged that the second distance is greater than the second detection distance, the method ends. Similarly, when it is judged that the second distance is less than or equal to the first detection distance, the method ends.
  • alternatively, in the latter case, the method may continue with corresponding processing that treats the second operation as a contact operation according to the prior art, which will not be described here.
  • in other words, while the second distance is less than or equal to the first detection distance or greater than the second detection distance, the method keeps the relative positional relationship between the display object and the object display area unchanged.
  • in step S309, the method obtains a third operating parameter of the operating body, different from the first operating parameter, according to the second operation.
  • the method can obtain, by the proximity sensor, a second series of track points formed by the second operation being projected on the electronic device. Through the second series of track points, the method can obtain a third operational parameter of the operating body.
  • the method may obtain the moving distance of the second operation as the third operating parameter by using the positions of the first track point and the last track point in the second series of track points.
  • the method may obtain an operation time of the second operation as the third operation parameter by timing of a system clock in the electronic device.
  • the method may further obtain the speed of the second operation as the third operating parameter by using the moving distance of the second operation obtained and the operating time of the second operation.
  • in step S310, the method changes the second relative positional relationship between the display object and the object display area to a third relative positional relationship based on the first operating parameter and the third operating parameter.
  • the method first converts the third operational parameter into a third mobile parameter of the display object.
  • the method converts the moving distance of the second operation into the second moving distance of the display object as the third moving parameter.
  • for example, the method may obtain the second moving distance of the display object by adding a coefficient to the moving distance of the second operation.
  • the coefficient can be appropriately set by those skilled in the art as needed.
  • the coefficient can be set to be greater than zero.
  • the user can move the display object by a relatively long distance by an operation of a relatively short moving distance.
  • the above conversion method is only an example. Those skilled in the art can understand that the method may also obtain the second moving distance of the display object as the third movement parameter by converting the moving distance of the second operation by other means (such as multiplying by a coefficient, or according to a piecewise function or the like).
  • the method converts the operation time of the second operation into the second movement time of the display object as the third movement parameter.
  • the method sets the operation time of the second operation to the second movement time of the display object.
  • alternatively, the second moving time of the display object may be obtained by a person skilled in the art through an arithmetic operation on the operation time of the second operation, which is not specifically limited here.
  • further alternatively, the method converts the speed of the second operation into the second moving speed of the display object as the third movement parameter. Similar to the above, those skilled in the art can understand that the method may obtain the second moving speed of the display object as the third movement parameter by converting the speed of the second operation by other means (for example, by multiplying by a coefficient, adding a coefficient, or according to a piecewise function or the like).
  • the method moves the display object based on the first movement parameter and the third movement parameter, such that the display object and the object display area change from a second relative positional relationship to a third relative position Positional relationship.
  • the method may refer to the relationship between the area of the display object obtained at step S305 and the area of the object display area.
  • the method is based on a moving direction of the display object, and is further based on the second moving distance, the second moving time, and the second movement of the display object At least one of the speeds moves the display object from the second position to a third position of the object display area, wherein the third position is different from the second position.
  • for example, as shown in FIG. 5C, the method moves the display object from the second position (P2) shown in FIG. 5B to the position (P3) shown by the solid line, based on the obtained first movement parameter (to the right) and the third movement parameter.
  • When the area of the display object is larger than that of the object display area, the method moves the display object, based on the moving direction of the display object and on at least one of the second moving distance, the second moving time, and the second moving speed of the display object, so as to display a third portion of the display object in the object display area.
  • It should be noted that the third portion at most partially overlaps the second portion.
  • Specifically, when the moving distance is long, the moving time is long, or the moving speed is fast, the third portion and the second portion may not overlap at all; when the moving distance is short, the moving time is short, or the moving speed is slow, the third portion partially overlaps the second portion. For example, as shown in FIG. 6C,
  • the method moves the display object based on the obtained first movement parameter (upward) and the third movement parameter, so that a third portion of the display object ("Contact 7", "Contact 8", and "Contact 9") is displayed in the object display area.
  • the item "contact 7" is included in the second part of the display object shown in FIG. 6B, and is also included in the third part of the display object shown in FIG. 6C, that is, the second part and the Partial overlap between the third part.
  • the second series of trajectory points of the second operation sensed by the proximity sensor comprises a plurality of trajectory points.
  • the operating body performs the second operation in a manner of sliding over the electronic device.
  • Of course, the second series of trajectory points sensed by the method may also include only a single trajectory point; for example, the operating body may simply hover above the electronic device.
  • In this case, the method moves the display object at a predetermined unit moving speed, based on the moving direction of the display object obtained as described above and the second moving time of the display object, so that the relationship between the display object and the object display area changes from the second relative positional relationship to the third relative positional relationship.
  • the method may also obtain the second distance and control a second moving speed of the display object based on the second distance.
  • For example, the method may preset a piecewise function: when the second distance falls within (d1, δ1], the method sets the second moving speed of the display object to V'1; when the second distance falls within (δ1, δ2], the method sets the second moving speed of the display object to V'2; ...; and when the second distance falls within (δn-1, d2], the method sets the second moving speed of the display object to V'n. The interval bounds δ1, ..., δn-1 and the speeds V'1, V'2, ..., V'n can be appropriately selected by those skilled in the art as needed. It should be noted that, in the above, d1 corresponds to the first detection distance and d2 corresponds to the second detection distance. Of course, similar to the above, those skilled in the art will understand that the method may equally convert the second distance by other means (such as multiplication by a coefficient) to obtain the second moving speed of the display object.
  • Thereby, the user can control the moving speed of the display object by adjusting the distance between the operating body and the electronic device, and can intuitively recognize the relationship between that distance and the moving speed of the display object while adjusting it, so that the desired content can be displayed in the object display area accurately and quickly.
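As a concrete illustration of such a piecewise function, the sketch below maps a measured hover distance to a scrolling speed. The breakpoints, the speed values, and the "closer hover means faster scrolling" direction of the mapping are all hypothetical choices made only to show the shape of the technique, not values prescribed by the patent.

```python
from typing import Optional

D1 = 1.0    # first detection distance (contact threshold), mm -- illustrative value
D2 = 30.0   # second detection distance (upper proximity limit), mm -- illustrative value

# (upper bound of hover-distance interval in mm, resulting speed in px/s)
SPEED_TABLE = [
    (10.0, 900.0),   # D1 < distance <= 10 mm  -> fast scrolling (assumed mapping)
    (20.0, 450.0),   # 10 mm < distance <= 20 mm
    (D2,   150.0),   # 20 mm < distance <= D2  -> slow scrolling
]

def second_moving_speed(distance_mm: float) -> Optional[float]:
    """Return the display object's second moving speed for a hover distance,
    or None when the distance lies outside the proximity range (D1, D2]."""
    if distance_mm <= D1 or distance_mm > D2:
        return None
    for upper_bound, speed in SPEED_TABLE:
        if distance_mm <= upper_bound:
            return speed
    return None

print(second_moving_speed(5.0))    # 900.0
print(second_moving_speed(25.0))   # 150.0
```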
  • In addition, the method may determine, based on the moving distance, whether the display object would be in an over-boundary state after the movement. If so, the display object is moved only until it is in a boundary state, and this is taken as the third relative positional relationship between the display object and the object display area.
  • Specifically, in one case, the over-boundary state is a state in which a first object boundary of the display object corresponding to its moving direction extends beyond the object display area, and the boundary state is a state in which that first object boundary coincides with a first display boundary of the object display area corresponding to the moving direction.
  • For example, as shown in FIG. 5D, the over-boundary state is a state in which the right boundary of the slider object (shown by a thick solid line in the drawing) extends beyond the right boundary of the object display area S2. In this case, the method moves the display object only until the boundary state shown in FIG. 5E is reached, at which point the right boundary of the slider object coincides with the right boundary of the object display area S2.
  • In another case, the over-boundary state is a state in which a second object boundary of the display object is located inside the object display area, and the boundary state is a state in which that second object boundary coincides with a second display boundary of the object display area that is opposite to the moving direction.
  • For example, as shown in FIG. 6D, the over-boundary state is a state in which the lower boundary of the list object (shown by a thick solid line in the drawing) is located inside the object display area. In this case, the method moves the display object only until the boundary state shown in FIG. 6E is reached, at which point the lower boundary of the list object coincides with the lower boundary of the object display area S2.
  • In an unlock operation of some electronic devices, for example, the user needs to move a display object (such as a slider) from one position on the touch display screen to another. With the method according to an embodiment of the present invention, the second moving distance of the display object obtained by the conversion allows the user to move the display object a longer distance with a shorter hover-sliding distance of the operating body. This shortens the overall moving distance of the operating body required for the operation and shortens the operation time, which better conforms to the user's operating habits and improves the user experience.
  • Similarly, with a method according to an embodiment of the present invention, in order to see content that is not currently displayed on the touch display screen, the user makes a relatively short contact sliding operation to start the display object scrolling in one direction; after that, only a hover-and-pause operation is required, and the display object continues to scroll in that direction until the desired content is displayed in the object display area. The user can thus move the display object a longer distance by means of a short contact sliding operation followed by a convenient hover-and-pause operation.
  • In addition, the user can also control the moving speed of the display object by adjusting the distance between the operating body and the electronic device, and can intuitively recognize the relationship between that distance and the moving speed of the display object while adjusting it, so that the desired content can be displayed in the object display area accurately and quickly, which improves the user experience.
  • FIG. 7 is a block diagram showing an exemplary structure of a control device 700 according to an embodiment of the present invention.
  • The control device 700 can be applied to an electronic device.
  • the control device 700 of the present embodiment includes a sensing unit 710, a state detecting unit 720, a track dividing unit 730, and an executing unit 740.
  • The respective units of the control device 700 perform the respective steps/functions of the control method of FIG. 1 described above; therefore, for brevity, they are not described in detail here.
  • the sensing unit 710 has a first detection distance and a second detection distance.
  • The sensing unit 710 can detect a three-dimensional motion trajectory of the operating body when its relative distance from the electronic device is less than the first detection distance, wherein the first detection distance and the second detection distance are distances relative to the electronic device, the second detection distance is smaller than the first detection distance, and the three-dimensional motion trajectory is a continuous motion trajectory one end of which lies between the first detection distance and the second detection distance while the other end is equal to or smaller than the second detection distance.
  • In some cases, the operating body may move from the first detection distance to the second detection distance, travel along a trajectory at the second detection distance, and then return from the second detection distance to the first detection distance.
  • In this case, the start point and the end point of the three-dimensional motion trajectory can be determined according to the requirements of the current operation; that is, it can be determined according to the requirements of the current operation whether the three-dimensional motion trajectory should be the trajectory of the operating body moving from the first detection distance to the second detection distance and traveling at the second detection distance, or the trajectory of the operating body traveling at the second detection distance and returning from the second detection distance to the first detection distance.
  • That is, when the operating body reaches the first detection distance, the sensing unit 710 starts detecting the three-dimensional motion trajectory of the operating body until the operating body reaches the second detection distance and then leaves it; or, when the operating body is at or below the second detection distance, the sensing unit 710 starts detecting the three-dimensional motion trajectory of the operating body until the operating body returns from the second detection distance to the first detection distance.
  • The first detection distance of the sensing unit 710 may be preset by the device provider or the user. Alternatively, the sensing unit 710 may set the first detection distance according to the user's operation: when it is detected that the operating body has stayed at a fixed position for more than a predetermined time, the sensing unit 710 can set the distance of that fixed position relative to the electronic device as the first detection distance.
  • The second detection distance of the sensing unit 710 may be preset by the device provider or the user. Alternatively, the sensing unit 710 may set the second detection distance according to the user's operation: when it is detected that the operating body has stayed, for more than a predetermined time, at a fixed position whose distance from the electronic device is less than the first detection distance, the sensing unit 710 sets the distance of that fixed position relative to the electronic device as the second detection distance. As described above, the second detection distance of the sensing unit can be set to zero; that is, the operating body touches the electronic device when it is at the second detection distance.
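The dwell-time rule for setting the detection distances can be sketched as follows. This is an assumed outline (the dwell threshold, the position tolerance, the polling rate, and the callback name are all hypothetical), showing only the idea that a position held for longer than a predetermined time becomes the first or second detection distance.

```python
import time

DWELL_SECONDS = 1.5        # predetermined time; illustrative value
POSITION_TOLERANCE = 2.0   # mm of jitter still counted as a "fixed position"

def calibrate_detection_distance(read_distance_mm) -> float:
    """Watch the operating body's distance (read via the supplied callback) and
    return the distance of the first position held longer than DWELL_SECONDS.
    The caller stores the result as the first or the second detection distance."""
    anchor = read_distance_mm()
    anchor_time = time.monotonic()
    while True:
        current = read_distance_mm()
        if abs(current - anchor) > POSITION_TOLERANCE:
            # The body moved: restart the dwell timer at the new position.
            anchor, anchor_time = current, time.monotonic()
        elif time.monotonic() - anchor_time >= DWELL_SECONDS:
            return anchor
        time.sleep(0.01)   # poll roughly every 10 ms
```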
  • In addition, the three-dimensional motion trajectory may be a trajectory whose relative distance from the electronic device monotonically decreases or monotonically increases; that is, it is a continuous motion trajectory along which the operating body approaches from the first detection height and reaches a distance equal to or smaller than the second detection distance, or a continuous motion trajectory along which the operating body leaves from a distance equal to or smaller than the second detection distance and reaches the first detection height.
  • The state detecting unit 720 may generate a state change signal when the operating body reaches the second detection distance of the sensing unit or leaves from the second detection distance.
  • the trajectory dividing unit 730 may divide the three-dimensional trajectory according to the state change signal generated by the state detecting unit 720 to obtain a first trajectory when the relative distance is equal to or smaller than the second detecting distance and a second trajectory when the relative distance is higher than the second detecting distance.
  • Here, the second trajectory may be a single point located at the second detection distance relative to the electronic device, or a continuous motion trajectory composed of a plurality of points at or below the second detection distance.
  • Execution unit 740 can execute a corresponding control command according to the first trajectory and the second trajectory.
  • The control device of this embodiment detects the three-dimensional trajectory of the operating body while it approaches or leaves the electronic device, and executes control commands by combining this three-dimensional trajectory with the operation performed by the operating body on the sensing unit or at a predetermined height above the electronic device. This enables more precise touch operations and thus a better interactive experience.
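A compact way to picture the division performed by the state detecting unit 720 and the trajectory dividing unit 730 is to split the sampled trajectory at the second detection distance, following the division described a few paragraphs above (at-or-below versus above that distance). The list-of-(x, y, z)-samples data model below is an assumption made for illustration only.

```python
# Split a sampled 3-D trajectory into the part at or below the second detection
# distance and the part above it, mirroring the state change signal generated
# when the operating body crosses the second detection distance.

def divide_trajectory(samples, second_detection_distance):
    """samples: iterable of (x, y, z) points, where z is the distance from the device."""
    first_trajectory, second_trajectory = [], []
    for x, y, z in samples:
        if z <= second_detection_distance:
            first_trajectory.append((x, y, z))
        else:
            second_trajectory.append((x, y, z))
    return first_trajectory, second_trajectory

samples = [(0, 0, 8.0), (1, 0, 4.0), (2, 0, 0.0), (3, 0, 0.0)]
first, second = divide_trajectory(samples, second_detection_distance=0.0)
print(first)    # the portion at/below the second detection distance (contact here)
print(second)   # the in-air portion of the trajectory
```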
  • FIG. 8 is a block diagram showing an exemplary structure of a control device 800 according to another embodiment of the present invention.
  • Control device 800 can be applied to electronic devices.
  • the control device 800 of the present embodiment includes a sensing unit 810, a state detecting unit 820, and a track dividing unit 830.
  • the sensing unit 810 has a first detection distance and a second detection distance.
  • The sensing unit 810 can detect a three-dimensional motion trajectory of the operating body when its relative distance from the electronic device is less than the first detection distance, wherein the first detection distance and the second detection distance are distances relative to the electronic device, the second detection distance is smaller than the first detection distance, and the three-dimensional motion trajectory is a continuous motion trajectory one end of which lies between the first detection distance and the second detection distance while the other end is equal to or smaller than the second detection distance.
  • The state detecting unit 820 may generate a state change signal when the operating body reaches the second detection distance of the sensing unit or leaves from the second detection distance.
  • the trajectory dividing unit 830 may divide the three-dimensional trajectory according to the state change signal generated by the state detecting unit 820 to obtain a first trajectory when the relative distance is equal to or smaller than the second detecting distance and a second trajectory when the relative distance is higher than the second detecting distance.
  • Control device 800 also includes an execution unit 840.
  • the execution unit 840 can include a command group determination module 841, a command selection module 842, and a command execution module 843.
  • the command group determining module 841 can determine the control command group according to the second trajectory.
  • The command selection module 842 can select, based on the first trajectory, a corresponding control command from the control command group determined by the command group determining module 841.
  • the command execution module 843 can execute the control commands selected by the command selection module.
  • For example, when the second trajectory corresponds to the user touching the volume control button on the touch screen, the command group determining module 841 can determine that the volume adjustment command group applies.
  • the volume adjustment command group can include increasing the volume, decreasing the volume, muting, and maximizing the volume.
  • When, after the user's finger leaves the position of the volume control button displayed on the touch screen, the projection of the finger's in-air position onto the touch-sensitive area lies above the volume control button, the command selection module 842 may determine that the first trajectory corresponds to maximizing the volume; when the projection lies below the volume control button, it may determine that the first trajectory corresponds to muting; when the projection lies to the left of the volume control button, it may determine that the first trajectory corresponds to decreasing the volume; and when the projection lies to the right of the volume control button, it may determine that the first trajectory corresponds to increasing the volume.
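The four cases above amount to comparing the projection of the finger's in-air position with the on-screen position of the volume control button. The following sketch assumes simple screen coordinates (y grows downward) and a rectangular button; the command strings, thresholds, and names are illustrative, not the patent's interface.

```python
# Select a command from the volume adjustment command group based on where the
# finger's in-air position projects relative to the volume control button.

def select_volume_command(projected_x, projected_y, button_rect):
    """button_rect = (left, top, right, bottom) of the volume control button."""
    left, top, right, bottom = button_rect
    if projected_y < top:
        return "maximize_volume"   # projection above the button
    if projected_y > bottom:
        return "mute"              # projection below the button
    if projected_x < left:
        return "decrease_volume"   # projection to the left of the button
    if projected_x > right:
        return "increase_volume"   # projection to the right of the button
    return None                    # projection still over the button itself

print(select_volume_command(40, 220, button_rect=(100, 200, 160, 240)))  # decrease_volume
```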
  • the sensing unit may include one or more of an infrared sensing element, an ultrasonic sensing element, and a capacitive sensing element.
  • a three-dimensional motion trajectory when the relative distance of the operating body relative to the electronic device is less than the first detection distance can be detected by the capacitive sensing element.
  • the first response capacitance value corresponding to the first detection distance and the second response capacitance value corresponding to the second detection distance may be determined according to the method in the foregoing embodiment.
  • the sensing unit can perform corresponding operations according to the value of the response capacitance generated by the operating body.
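With a capacitive sensing element, the comparison against the detection distances can be carried out directly on the response capacitance, since a closer operating body produces a larger coupling response. The sketch below assumes pre-calibrated response values C1 (at the first detection distance) and C2 (at the second detection distance); the numbers and names are placeholders, not measured values.

```python
# Classify the operating body's position from the measured response capacitance.
# C1/C2 are assumed calibrated values at the first/second detection distances.

C1 = 0.8   # response capacitance (arbitrary units) at the first detection distance
C2 = 3.0   # response capacitance at the second detection distance (contact)

def classify_response(capacitance: float) -> str:
    if capacitance < C1:
        return "out_of_range"   # farther away than the first detection distance
    if capacitance < C2:
        return "proximity"      # between the first and second detection distances
    return "contact"            # at or beyond the second detection distance

print(classify_response(1.6))   # proximity
```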
  • the three-dimensional motion trajectory is a continuous motion trajectory in which one end is located between the first detection distance and the second detection distance and the other end is equal to the second detection distance.
  • Alternatively, the sensing unit may include a first sensing module and a second sensing module to detect, respectively, the motion trajectory of the operating body between the first detection distance and the second detection distance, and its motion trajectory at the second detection distance. That is, the first sensing module can detect a motion trajectory of the operating body between the first detection distance and the second detection distance.
  • the first sensing module can include a plurality of ultrasonic sensing elements, infrared sensing elements, or imaging devices disposed at different locations of the electronic device to determine the position of the operating body.
  • FIGS. 9 and 10 illustrate, respectively, the main configuration and a more detailed configuration of a display device that implements the method for moving a display object, in accordance with an embodiment of the present invention.
  • The display device is, for example, a portable mobile terminal, a personal computer, or the like.
  • the display device includes a display unit, and an object display area for displaying a display object is included in a display area thereof.
  • the display object includes, for example, a web page, a picture, a list, or other various display controls (e.g., sliders).
  • the display device may include both a touch sensor and a proximity sensor.
  • the contact sensor is for sensing contact of an operating body with the display device
  • the proximity sensor is for sensing proximity of an operating body to the display device.
  • the display device may also include only proximity sensors.
  • In that case, the proximity sensor is designed both to sense the proximity of the operating body to the display device and to sense the contact of the operating body with the display device. In the following, the two arrangements are collectively referred to as the "sensor" without particularly distinguishing between them.
  • the display object and the object display area are in a first relative positional relationship.
  • In one case, the first relative positional relationship between the display object and the object display area means that the display object is displayed at a first position of the object display area. In another case, it means that a first portion of the display object is displayed in the object display area.
  • The display device 900 includes a first sensing unit 901, a first obtaining unit 902, a first determining unit 903, a second obtaining unit 904, a first changing unit 905, a second sensing unit 906, a third obtaining unit 907, a second determining unit 908, a fourth obtaining unit 909, and a second changing unit 910.
  • the first sensing unit 901 is, for example, the above-described contact sensor, which senses a first operation of an operating body with respect to the display device.
  • the operating body is, for example, a fingertip of a finger operated by a user, a pen tip of a stylus, or the like.
  • Specifically, the first sensing unit 901 senses a first series of track points, composed of a plurality of track points, formed by the operating body on the display device, thereby sensing the first operation performed by the operating body on the display device.
  • the first obtaining unit 902 obtains a first distance between the operating body and the display device according to the sensing result of the first sensing unit 901.
  • the first determining unit 903 determines whether the first distance is less than or equal to the first detection distance.
  • The first detection distance is a threshold value that the display device sets in order to identify the first operation as a contact operation; its value can be appropriately set by those skilled in the art as needed.
  • The second obtaining unit 904 obtains a first operating parameter and a second operating parameter of the operating body according to the first operation, where the first operating parameter is different from the second operating parameter.
  • the first sensing unit 901 can obtain the first series of track points of the first operation. Through the first series of track points, the second obtaining unit 904 can obtain the first operating parameter and the second operating parameter of the operating body.
  • the second obtaining unit 904 may obtain a moving direction of the first operation by using a position of the first track point and the last track point in the first series of track points, as the first operating parameter .
  • Alternatively, the second obtaining unit 904 may obtain a real-time moving direction of the first operation from any two (adjacent) track points of the first series of track points, as the first operating parameter.
  • the second obtaining unit 904 can also obtain a second operating parameter that is different from the first operating parameter.
  • the second obtaining unit 904 may obtain the moving distance of the first operation as the second operating parameter by using the positions of the first track point and the last track point in the first series of track points.
  • the second obtaining unit 904 may obtain an operation time of the first operation as the second operation parameter by timing of a system clock in the display device.
  • the second obtaining unit 904 may obtain the speed of the first operation by using the positions of the first track point and the last track point in the first series of track points and the timing of the system clock, as The second operating parameter.
  • Alternatively, the second obtaining unit 904 may obtain a real-time speed of the first operation, as the second operating parameter, by using any two (adjacent) track points of the first series of track points together with the timing of the system clock of the display device.
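The operating parameters described above can be derived from the first series of track points with a few lines of arithmetic. The sketch below assumes each track point carries a screen position and a system-clock timestamp; the point format, units, and names are illustrative assumptions.

```python
import math

# Derive the moving direction, moving distance, operation time, and speed of
# the first operation from a series of (x, y, timestamp_seconds) track points.

def operation_parameters(track_points):
    (x0, y0, t0), (x1, y1, t1) = track_points[0], track_points[-1]
    dx, dy = x1 - x0, y1 - y0
    distance = math.hypot(dx, dy)                 # moving distance (second operating parameter)
    direction = math.degrees(math.atan2(dy, dx))  # moving direction (first operating parameter)
    duration = max(t1 - t0, 1e-6)                 # operation time from the system clock
    speed = distance / duration                   # speed of the first operation
    return direction, distance, duration, speed

points = [(10, 300, 0.00), (60, 300, 0.05), (130, 300, 0.12)]
print(operation_parameters(points))   # (0.0, 120.0, 0.12, 1000.0)
```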
  • the first changing unit 905 changes the display object and the object display area from the first relative positional relationship to the second relative positional relationship based on the first operational parameter and the second operational parameter.
  • As shown in FIG. 10, the first changing unit 905 may include a first converting unit 1001 and a first moving unit 1002.
  • the first conversion unit 1001 converts the moving direction of the first operation into the moving direction of the display object as the first moving parameter.
  • Specifically, the first converting unit 1001 may set a direction that coincides with, or is opposite to, the moving direction of the first operation as the moving direction of the display object. In the following description, it is assumed that the moving direction of the display object is the same as the moving direction of the first operation.
  • the first converting unit 1001 converts the moving distance of the first operation into the first moving distance of the display object as the second moving parameter.
  • For example, the first converting unit 1001 may obtain the first moving distance of the display object by adding a coefficient to the moving distance of the first operation. The coefficient can be appropriately set by those skilled in the art as needed.
  • the coefficient can be set to zero.
  • It should be noted that the above conversion method is only an example; those skilled in the art will understand that the first converting unit 1001 may equally convert the moving distance of the first operation by other means (such as multiplying by a coefficient, or according to a piecewise function or the like) to obtain the first moving distance of the display object as the second movement parameter.
  • the first converting unit 1001 converts the operation time of the first operation into the first moving time of the display object as the second moving parameter. In order to comply with the user's operating habits, preferably, the first converting unit 1001 sets the operating time of the first operation to the first moving time of the display object.
  • Alternatively, the first moving time of the display object may be obtained by performing an arithmetic operation on the operation time of the first operation; this is not specifically limited here.
  • the first conversion unit 1001 converts the speed of the first operation into a first moving speed of the display object as the second movement parameter.
  • Specifically, the first converting unit 1001 may preset a piecewise function: when the speed of the first operation falls within [0, v1), the first converting unit 1001 sets the first moving speed of the display object to V1; when the speed of the first operation falls within [v1, v2), it sets the first moving speed of the display object to V2; and so on, until, when the speed of the first operation falls within [vn-1, vn), it sets the first moving speed of the display object to Vn. The values of v1, v2, ..., vn and V1, V2, ..., Vn can be appropriately set by a person skilled in the art as needed. Of course, those skilled in the art will understand that the first converting unit 1001 may equally convert the speed of the first operation by other means (for example, by multiplying by a coefficient, or adding a coefficient) to obtain the first moving speed of the display object as the second movement parameter.
  • The first moving unit 1002 moves the display object based on the first movement parameter and the second movement parameter, so that the relationship between the display object and the object display area changes from the first relative positional relationship to the second relative positional relationship.
  • Specifically, the first moving unit 1002 can compare the size of the area of the display object with the size of the area of the object display area.
  • When the area of the display object is smaller than that of the object display area, the first moving unit 1002 moves the display object from the first position to a second position of the object display area, based on the moving direction of the display object and on at least one of the first moving distance, the first moving time, and the first moving speed of the display object, wherein the second position is different from the first position.
  • When the area of the display object is larger than that of the object display area, the first moving unit 1002 moves the display object, based on the moving direction of the display object and on at least one of the first moving distance, the first moving time, and the first moving speed, so as to display a second portion of the display object in the object display area. It should be noted that the second portion at most partially overlaps the first portion.
  • The second sensing unit 906 is, for example, the above-described proximity sensor, which senses a second operation of the operating body relative to the display device. Specifically, the second sensing unit 906 senses a second series of track points formed by the operating body over the display device, thereby sensing the second operation performed by the operating body with respect to the display device. The second sensing unit 906 and the first sensing unit 901 may be provided separately as two units, or may be combined into one unit.
  • the third obtaining unit 907 obtains a second distance between the operating body and the display device according to the second operation. Further, preferably, the third obtaining unit 907 obtains a second distance between the operating body and the display device at a predetermined timing.
  • The predetermined timing can be appropriately determined by a person skilled in the art as needed; for example, it may be set to 3 ms, 5 ms, or the like. Thereby, the third obtaining unit 907 obtains, in real time, the distance between the operating body and the display device during the second operation.
  • the second determining unit 908 determines whether the second distance is greater than the first detection distance and less than or equal to the second detection distance.
  • the second detection distance is a threshold set to identify the second operation as a proximity operation. The value is appropriately set by a person skilled in the art according to the accuracy of the display device, and is not specifically limited herein.
  • The display device may further include a holding unit 1003, as shown in FIG. 10.
  • When the third obtaining unit 907, which continues to obtain the second distance at the predetermined timing, finds that the second determining unit 908 determines the second distance to be less than or equal to the first detection distance, or greater than the second detection distance, the holding unit 1003 can maintain the relative positional relationship between the display object and the object display area at that moment.
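One way to picture the interplay of the third obtaining unit 907, the second determining unit 908, and the holding unit 1003 is a periodic update step, sketched below. In this display-device embodiment the second detection distance is larger than the first; the polling period, distances, and callback names are assumptions made for illustration.

```python
FIRST_DETECTION_MM = 1.0    # first detection distance (contact threshold) -- illustrative
SECOND_DETECTION_MM = 30.0  # second detection distance (proximity limit) -- illustrative
POLL_PERIOD_S = 0.005       # the predetermined timing, e.g. 5 ms

def hover_update_step(second_distance_mm, current_offset, hover_speed_for):
    """One periodic update during the second operation: if the measured distance
    falls outside (first, second], hold the current relative positional
    relationship (the holding unit's behaviour); otherwise keep moving."""
    if not (FIRST_DETECTION_MM < second_distance_mm <= SECOND_DETECTION_MM):
        return current_offset                       # hold the relationship as it is
    speed = hover_speed_for(second_distance_mm)     # e.g. the piecewise mapping sketched earlier
    return current_offset + speed * POLL_PERIOD_S   # continue moving in the fixed direction

# Example with a constant-speed callback:
print(hover_update_step(12.0, current_offset=0.0, hover_speed_for=lambda d: 400.0))  # 2.0
```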
  • the fourth obtaining unit 909 obtains a third operating parameter of the operating body different from the first operating parameter according to the second operation.
  • the second sensing unit 906 can obtain a second series of track points formed by the second operation being projected on the display device.
  • the fourth obtaining unit 909 can obtain the third operating parameter of the operating body.
  • For example, the fourth obtaining unit 909 may obtain the moving distance of the second operation, as the third operating parameter, by using the positions of the first track point and the last track point in the second series of track points.
  • the fourth obtaining unit 909 may obtain an operation time of the second operation as the third operation parameter by timing of a system clock in the display device.
  • the fourth obtaining unit 909 can further obtain the speed of the second operation as the third operating parameter by using the moving distance of the second operation obtained above and the operating time of the second operation.
  • the second changing unit 910 changes the display object and the object display area from a second relative positional relationship to a third relative positional relationship based on the first operational parameter and the third operational parameter.
  • As shown in FIG. 10, the second changing unit 910 may include a second converting unit 1004 that converts the third operating parameter into a third movement parameter of the display object, and a second moving unit 1005 that moves the display object based on the first movement parameter and the third movement parameter so as to change the relationship between the display object and the object display area from the second relative positional relationship to the third relative positional relationship.
  • the second converting unit 1004 first converts the third operating parameter into a third moving parameter of the display object.
  • the second converting unit 1004 converts the moving distance of the second operation into the second moving distance of the display object as the third moving parameter.
  • For example, the second converting unit 1004 may obtain the second moving distance of the display object by adding a coefficient to the moving distance of the second operation. The coefficient can be appropriately set by those skilled in the art as needed.
  • the coefficient can be set greater than zero.
  • the user can move the display object by a longer distance by an operation of a relatively short moving distance.
  • It should be noted that the above conversion method is only an example; those skilled in the art will understand that the second converting unit 1004 may equally convert the moving distance of the second operation by other means (such as multiplication by a coefficient, or according to a piecewise function or the like) to obtain the second moving distance of the display object as the third movement parameter.
  • the second conversion unit 1004 converts the operation time of the second operation into the second movement time of the display object as the third movement parameter.
  • the second converting unit 1004 sets the operating time of the second operation to the second moving time of the display object.
  • Alternatively, the second moving time of the display object may be obtained by performing an arithmetic operation on the operation time of the second operation; this is not specifically limited here.
  • Alternatively, the second converting unit 1004 converts the speed of the second operation into the second moving speed of the display object as the third movement parameter. Similar to the above, those skilled in the art will understand that the second converting unit 1004 may convert the speed of the second operation in various ways (such as multiplying by a coefficient, adding a coefficient, or applying a piecewise function) to obtain the second moving speed of the display object as the third movement parameter.
  • The second moving unit 1005 moves the display object based on the first movement parameter and the third movement parameter, so that the relationship between the display object and the object display area changes from the second relative positional relationship to the third relative positional relationship.
  • Here, the second moving unit 1005 can refer to the size relationship between the area of the display object and the area of the object display area already obtained by the first moving unit 1002.
  • When the area of the display object is smaller than that of the object display area, the second moving unit 1005 moves the display object from the second position to a third position of the object display area, based on the moving direction of the display object and on at least one of the second moving distance, the second moving time, and the second moving speed of the display object, wherein the third position is different from the second position.
  • When the area of the display object is larger than that of the object display area, the second moving unit 1005 moves the display object, based on the moving direction of the display object and on at least one of the second moving distance, the second moving time, and the second moving speed, so as to display a third portion of the display object in the object display area. It should be noted that the third portion at most partially overlaps the second portion.
  • the second series of track points sensed by the second sensing unit 906 includes a plurality of track points.
  • the operating body performs the second operation in a manner of sliding over the display device.
  • Of course, the second series of track points sensed by the second sensing unit 906 may also include only a single track point; for example, the operating body may simply hover above the display device.
  • In this case, the second moving unit 1005 moves the display object based on the moving direction of the display object obtained as described above and the second moving time of the display object, so that the relationship between the display object and the object display area changes from the second relative positional relationship to the third relative positional relationship.
  • the fourth obtaining unit 909 may also obtain the second distance, and the second changing unit 910 controls the second moving speed of the display object based on the second distance.
  • For example, the second changing unit 910 may preset a piecewise function: when the second distance falls within (d1, δ1], the second changing unit 910 sets the second moving speed of the display object to V'1; when the second distance falls within (δ1, δ2], it sets the second moving speed of the display object to V'2; ...; and when the second distance falls within (δn-1, d2], it sets the second moving speed of the display object to V'n. The interval bounds δ1, ..., δn-1 and the speeds V'1, V'2, ..., V'n can be appropriately set by a person skilled in the art as needed. It should be noted that d1 corresponds to the first detection distance and d2 corresponds to the second detection distance. Of course, those skilled in the art will understand that the second changing unit 910 may equally convert the second distance by other means (such as multiplying by a coefficient) to obtain the second moving speed of the display object. Thereby, the user can control the moving speed of the display object by adjusting the distance between the operating body and the display device, and can intuitively recognize the relationship between that distance and the moving speed of the display object while adjusting it, so that the desired content can be displayed in the object display area accurately and quickly.
  • Alternatively, the second changing unit 910 may include: a determining unit (not shown) that determines a moving distance and a moving direction of the display object based on the first operating parameter and the third operating parameter; a third determining unit (not shown) that determines, based on the moving distance, whether the moved display object would be in an over-boundary state; and a third moving unit (not shown) that, when the third determining unit determines that the moved display object would be in the over-boundary state, moves the display object only until the display object is in a boundary state, which is taken as the third relative positional relationship between the display object and the object display area.
  • In the display device according to this embodiment of the present invention, contact sensing is performed by the first sensing unit and proximity sensing is performed by the second sensing unit, and corresponding processing is carried out according to the results of the contact sensing and the proximity sensing. The user can therefore conveniently control the display device to perform the corresponding operation with simple gestures that conform to the user's operating habits (specifically, a gesture of first sliding in contact and then hovering, or of first sliding in contact and then sliding while hovering), which improves the user experience.
  • FIG. 11 is a block diagram showing an exemplary structure of an electronic device 1100 according to an embodiment of the present invention.
  • The electronic device 1100 of this example includes a sensing unit 1110 and a processing unit 1120.
  • the sensing unit 1110 has a first detection distance and a second detection distance.
  • The sensing unit may include one or more proximity sensing elements, such as an infrared sensing element, an ultrasonic sensing element, a capacitive sensing element, and the like.
  • the sensing unit 1110 can detect a three-dimensional motion trajectory when the relative distance of the operating body relative to the electronic device is less than the first detection distance.
  • the first detection distance and the second detection distance are distances relative to the electronic device, and the second detection distance is less than the first detection distance.
  • the three-dimensional motion trajectory is a continuous motion trajectory in which one end is located between the first detection distance and the second detection distance and the other end is equal to or smaller than the second detection distance.
  • In some cases, the operating body may move from the first detection distance to the second detection distance, travel along a trajectory at the second detection distance, and then return from the second detection distance to the first detection distance.
  • In this case, the start point and the end point of the three-dimensional motion trajectory can be determined according to the requirements of the current operation; that is, it can be determined according to the requirements of the current operation whether the three-dimensional motion trajectory should be the trajectory of the operating body moving from the first detection distance to the second detection distance and traveling at the second detection distance, or the trajectory of the operating body traveling at the second detection distance and returning from the second detection distance to the first detection distance.
  • That is, when the operating body reaches the first detection distance, the sensing unit 1110 starts detecting the three-dimensional motion trajectory of the operating body until the operating body reaches the second detection distance and then leaves it; or, when the operating body is at or below the second detection distance, the sensing unit 1110 starts detecting the three-dimensional motion trajectory of the operating body until the operating body returns from the second detection distance to the first detection distance.
  • Here, the second trajectory may be a single point located at the second detection distance relative to the electronic device, or a continuous motion trajectory composed of a plurality of points at or below the second detection distance.
  • the first detection distance of the sensing unit 1110 may be preset by the device provider or the user. Alternatively, the sensing unit 1110 may also set the first detection distance according to the user's operation. Specifically, when it is detected that the operating body is in a fixed position for more than a predetermined time, the sensing unit 1110 can set the distance of the fixed position relative to the electronic device to the first detecting distance.
  • The second detection distance of the sensing unit 1110 may be preset by the device provider or the user. Alternatively, the sensing unit 1110 may set the second detection distance according to the user's operation: when it is detected that the operating body has stayed, for more than a predetermined time, at a fixed position whose distance from the electronic device is less than the first detection distance, the sensing unit 1110 sets the distance of that fixed position relative to the electronic device as the second detection distance. As described above, the second detection distance of the sensing unit can be set to zero; that is, the operating body touches the electronic device when it is at the second detection distance.
  • In addition, the three-dimensional motion trajectory may be a motion trajectory whose relative distance from the electronic device monotonically decreases or monotonically increases; that is, it is a continuous motion trajectory along which the operating body approaches from the first detection height and reaches a distance equal to or smaller than the second detection distance, or a continuous motion trajectory along which the operating body leaves from a distance equal to or smaller than the second detection distance and reaches the first detection height.
  • the processing unit 1120 can include a state detection module 1121, a trajectory division module 1122, and an execution module 1123.
  • The state detecting module 1121 may generate a state change signal when the operating body reaches the second detection distance of the sensing unit or leaves from the second detection distance.
  • the trajectory dividing module 1122 may divide the three-dimensional trajectory according to the state change signal generated by the state detecting module 1121 to obtain a first trajectory when the relative distance is equal to or smaller than the second detecting distance and a second trajectory when the relative distance is higher than the second detecting distance.
  • the execution module 1123 can execute a corresponding control command according to the first trajectory and the second trajectory.
  • The electronic device of this embodiment detects the three-dimensional trajectory of the operating body while it approaches or leaves the electronic device, and executes control commands by combining this three-dimensional trajectory with the operation performed by the operating body on the sensing unit or at a predetermined height above the electronic device. This enables more precise touch operations and thus a better interactive experience.
  • As shown in FIG. 12, the electronic device 1200 includes a first sensor 1201, a second sensor 1202, a processor 1203, and a display 1204.
  • the first sensor 1201, the second sensor 1202, the processor 1203, and the display 1204 are communicatively coupled.
  • the display area of the display 1204 includes an object display area for displaying a display object, and the display object and the object display area are in a first relative positional relationship.
  • the first sensor 1201 is, for example, a contact sensor that senses a first operation of an operating body with respect to the electronic device.
  • the second sensor 1202 is, for example, a proximity sensor that senses a second operation of the operating body relative to the electronic device.
  • the first sensor 1201 and the second sensor 1202 may be separately provided as two units, or may be combined into one unit.
  • The processor 1203 is, for example, a central processing unit or a microprocessor, and is configured to: obtain a first distance between the operating body and the electronic device according to the first operation; determine whether the first distance is smaller than the first detection distance; when it is determined that the first distance is smaller than the first detection distance, obtain, according to the first operation, a first operating parameter and a second operating parameter of the operating body, where the first operating parameter is different from the second operating parameter; change the relationship between the display object and the object display area from the first relative positional relationship to a second relative positional relationship based on the first operating parameter and the second operating parameter; obtain a second distance between the operating body and the electronic device according to the second operation; determine whether the second distance is greater than the first detection distance and smaller than the second detection distance; when it is determined that the second distance is greater than the first detection distance and smaller than the second detection distance, obtain, according to the second operation, a third operating parameter of the operating body that is different from the first operating parameter; and change the relationship between the display object and the object display area from the second relative positional relationship to a third relative positional relationship based on the first operating parameter and the third operating parameter.
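The overall decision flow that the processor 1203 is configured to carry out can be summarized in a short Python sketch. Everything below (the names, the event dictionary, the helper function, and the thresholds) is an assumed outline intended only to make the sequence of checks easier to follow, not the patent's implementation.

```python
FIRST_DETECTION = 1.0    # contact threshold (illustrative units)
SECOND_DETECTION = 30.0  # proximity threshold; larger than the contact threshold here

def move_object(state, amount):
    """Toy update of the relative positional relationship along the stored direction."""
    sign = 1 if state.get("direction") == "right" else -1
    return state.get("offset", 0.0) + sign * amount

def handle_operation(distance, operation, state):
    """operation: sensed parameters of the current gesture (assumed dict);
    state: keeps the first operating parameter and the current offset."""
    if distance <= FIRST_DETECTION:
        # First (contact) operation: take its direction and distance as the first
        # and second operating parameters, move to the second relationship.
        state["direction"] = operation["direction"]
        state["offset"] = move_object(state, operation["distance"])
    elif FIRST_DETECTION < distance <= SECOND_DETECTION:
        # Second (proximity) operation: its distance is the third operating parameter;
        # the direction of the first operation is reused, move to the third relationship.
        state["offset"] = move_object(state, operation["distance"])
    return state

state = handle_operation(0.5, {"direction": "right", "distance": 80.0}, {})
state = handle_operation(12.0, {"distance": 50.0}, state)
print(state)   # {'direction': 'right', 'offset': 130.0}
```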
  • In the electronic device according to this embodiment of the present invention, contact sensing is performed by the first sensor and proximity sensing is performed by the second sensor, and corresponding processing is carried out according to the results of the contact sensing and the proximity sensing. The user can therefore conveniently control the electronic device to perform the corresponding operation with simple gestures that conform to the user's operating habits (specifically, a gesture of first sliding in contact and then hovering, or of first sliding in contact and then sliding while hovering), thereby improving the user experience.
  • A control method and a corresponding control device, a method for moving a display object and a corresponding display device, and an electronic device according to embodiments of the present invention have been described above with reference to FIGS. 1 through 12.
  • the unit/module can be implemented in software for execution by various types of processors.
  • An identified executable code module can comprise one or more physical or logical blocks of computer instructions, which can be constructed, for example, as an object, procedure, or function. Nonetheless, the executable code of an identified module need not be physically located together, but may comprise different instructions stored in different locations which, when logically combined, constitute the unit/module and achieve the stated purpose of the unit/module.
  • Where a unit/module can be implemented in software, then, considering the level of existing hardware technology, a person skilled in the art could, cost considerations aside, also build a corresponding hardware circuit to implement the same function.
  • the hardware circuit includes conventional Very Large Scale Integration (VLSI) circuits or gate arrays and existing semiconductors such as logic chips, transistors, or other discrete components. Modules can also be implemented with programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices, and the like.

Abstract

A control method and a control device, a method for moving a display object and a display device, and an electronic device are provided. The control method is used in an electronic device that includes a sensing unit, where the sensing unit has a first detection distance and a second detection distance relative to the electronic device, and the second detection distance is smaller than the first detection distance. The method includes: detecting a three-dimensional motion trajectory of an operating body when its relative distance from the electronic device is less than the first detection distance, the trajectory being a continuous motion trajectory one end of which lies between the first and second detection distances while the other end is equal to or smaller than the second detection distance; generating a state change signal when the operating body reaches the second detection distance of the sensing unit or leaves from the second detection distance; dividing the three-dimensional trajectory according to the state change signal to obtain a first trajectory for relative distances equal to or smaller than the second detection distance and a second trajectory for relative distances greater than the second detection distance; and executing a corresponding control command according to the first trajectory and the second trajectory.

Description

控制方法、 控制装置、 显示装置以及电子设备 技术领域
本发明实施例涉及电子设备的领域。 更具体地, 本发明涉及一种应用于 电子设备的控制方法和相应的控制装置、 应用于电子设备的移动显示对象的 方法和相应显示装置、 以及电子设备。 背景技术
目前, 例如蜂窝电话、 便携式音乐播放器、 便携式计算机的具有触摸显 示屏的各种电子装备已经被广泛应用。 在这些便携式电子装备中, 通常将显 示屏与触摸感应单元层叠放置以形成触摸显示屏。 所述触摸感应单元用于接 收用户的输入以方便用户的操作。 触摸感应单元可包括可由电容式触摸传感 器或者电阻式触摸传感器之类传感器元件构成的触摸感应区域。 这种电子设 备的用户通过一定的操作体(如指尖、 笔尖)接触所述触摸显示屏。 用户可 在电子设备的触摸控制区域上进行诸如单击、 双击、 拖曳等动作来实现相应 的控制功能。 换句话说, 所述电子设备通过触摸感应单元感测到操作体与所 述触摸显示屏的接触, 并响应于这样的接触来执行相应的操作。
随着技术的发展, 处理器的处理能力提高, 便携式电子设备可为用户提 供的功能不断增多。 然而, 现有的具有触摸感应单元的电子设备大多仅支持 接触操作。 也就是说, 所述电子设备仅在感测到操作体与所述触摸显示屏接 触时才执行相应的操作。 但是, 以上例如单击、 双击、 拖拽之类的触摸操作 已经不能满足用户越来越多样化的操作需要。 这在一些情况下带来了不便。
例如, 在一些电子设备的解锁操作中, 用户需要将一显示对象(如滑块) 从触摸显示屏上的第一位置移动到第二位置。 为此, 用户不得不做出滑动手 势, 并且滑动手势的行程等于第二位置与第一位置之间的距离。 这是相对较 长的一段距离, 在单手操作的情况下不符合用户的操作习惯, 容易造成用户 疲劳, 导致用户体验较差。
又例如, 在电子设备中常见的列表或网页滚动操作中, 为了看到触摸显 示屏上当前未显示的内容, 在一种实现方式中, 用户需要一直做出滑动手势, 直到所期望的内容显示在触摸显示屏上。 在行程较长的情况下, 这种实现方 式同样容易造成用户疲劳, 导致用户体验较差。 在另一种实现方式中, 用户 需要做出快速滑动 (fling )手势, 使得触摸显示屏上显示的内容快速滚动, 并以一定的负加速度逐渐停止滚动。 在这种实现方式中, 用户不能在做出该 手势的同时直观地识别该手势的速度与内容的滚动速度之间的关系, 从而无 法精确地将所期望的内容显示在触摸显示屏上。 发明内容
本发明实施例的目的在于提供一种应用于电子设备控制方法和相应的控 制装置、 以及电子设备, 以解决上述问题。
本发明的一个实施例提供了一种控制方法, 应用于电子设备。 所述电子 设备包括传感单元, 其中传感单元具有第一检测距离和第二检测距离。 具体 地, 第一检测距离和第二检测距离是相对于电子设备的距离, 且第二检测距 离小于第一检测距离。 所述控制方法包括: 检测操作体相对于电子设备的相 对距离小于第一检测距离时的三维运动轨迹, 其中三维运动轨迹是一端位于 第一检测距离和第二检测距离之间而另一端等于或小于第二检测距离的连续 运动轨迹; 当操作体到达接近传感单元的第二检测距离或从第二检测距离离 开时, 生成状态改变信号; 根据状态改变信号划分三维轨迹, 以获得相对距 离等于或小于第二检测距离时的第一轨迹和相对距离高于第二检测距离时的 第二轨迹; 以及根据第一轨迹和第二轨迹执行相应的控制命令。
本发明的另一实施例提供了一种应用于电子设备的控制装置。 所述控制 装置包括: 传感单元, 具有第一检测距离和第二检测距离, 配置来检测操作 体相对于电子设备的相对距离小于第一检测距离时的三维运动轨迹, 其中第 一检测距离和第二检测距离是相对于电子设备的距离, 且第二检测距离小于 第一检测距离, 三维运动轨迹是一端位于第一检测距离和第二检测距离之间 而另一端等于或小于第二检测距离的连续运动轨迹; 状态检测单元, 配置来 当操作体到达接近传感单元的第二检测距离或从第二检测距离离开时, 生成 状态改变信号; 轨迹划分单元, 配置来根据状态改变信号划分三维轨迹, 以 获得相对距离等于或小于第二检测距离时的第一轨迹和相对距离高于第二检 测距离时的第二轨迹; 以及执行单元, 配置来根据第一轨迹和第二轨迹执行 相应的控制命令。
本发明的另一实施例提供了一种电子设备, 包括:传感单元和处理单元。 传感单元具有第一检测距离和第二检测距离, 并且配置来检测操作体相对于 电子设备的相对距离小于第一检测距离时的三维运动轨迹, 其中第一检测距 离和第二检测距离是相对于电子设备的距离, 且第二检测距离小于第一检测 距离, 三维运动轨迹是一端位于第一检测距离和第二检测距离之间而另一端 等于或小于第二检测距离的连续运动轨迹。 处理单元包括: 状态检测模块, 配置来当操作体到达接近传感单元的第二检测距离或从第二检测距离离开 时, 生成状态改变信号; 轨迹划分模块, 配置来根据状态改变信号划分三维 轨迹, 以获得相对距离等于或小于第二检测距离时的第一轨迹和相对距离高 于第二检测距离时的第二轨迹; 以及执行模块, 配置来根据第一轨迹和第二 轨迹执行相应的控制命令。
通过上述本发明实施例提供的方案, 可检测操作体在接近或离开电子设 备过程中的三维轨迹, 并将操作体在接近或离开电子设备过程中的三维轨迹 与操作体在传感单元上或在距离电子设备预定高度处进行的操作结合来执行 控制命令。 通过有效地利用在接近或离开电子设备过程中的例如速度、 加速 度、 立体轨迹、 降落角、 起飞角之类的操作体运动信息, 能够实现更精准的 触控操作, 从而带来更好的交互体验。
根据本发明的另一方面, 本发明提供了一种用于移动显示对象的方法、 显示装置和电子设备, 其允许用户以简单的、符合操作习惯的手势进行操作, 从而改进了用户体验。
根据本发明一实施例, 提供了一种用于移动显示对象的方法, 应用于电 子设备中, 所述电子设备包括显示单元, 所述显示对象与所述显示单元的显 示区域中用于显示所述显示对象的对象显示区域之间处于第一相对位置关 系, 所述方法包括: 感测一操作体相对于所述电子设备的第一操作; 根据所 述第一操作, 获得所述操作体与所述电子设备之间的第一距离; 判断所述第 一距离是否小于等于第一检测距离; 当判断所述第一距离小于等于第一检测 距离时, 根据所述第一操作, 获得所述操作体的第一操作参数和第二操作参 数, 其中, 所述第一操作参数与所述第二操作参数不同; 基于所述第一操作 参数和所述第二操作参数, 将所述显示对象与所述对象显示区域之间从第一 相对位置关系改变为第二相对位置关系; 感测所述操作体相对于所述电子设 备的第二操作; 根据所述第二操作, 获得所述操作体与所述电子设备之间的 第二距离; 判断所述第二距离是否大于第一检测距离并小于等于第二检测距 离; 当判断所述第二距离大于第一检测距离并小于等于第二检测距离时, 根 据所述第二操作, 获得所述操作体的不同于所述第一操作参数的第三操作参 数; 以及基于所述第一操作参数和所述第三操作参数, 将所述显示对象与所 述对象显示区域之间从第二相对位置关系改变为第三相对位置关系。
根据本发明另一实施例, 提供了一种显示装置, 其显示区域中包括对象 显示区域, 所述对象显示区域用于显示显示对象, 所述显示对象与所述对象 显示区域之间处于第一相对位置关系, 所述显示装置包括: 第一感测单元, 感测一操作体相对于所述显示装置的第一操作; 第一获得单元, 根据所述第 一操作, 获得所述操作体与所述显示装置之间的第一距离; 第一判断单元, 判断所述第一距离是否小于等于第一检测距离; 第二获得单元, 当判断所述 第一距离小于等于第一检测距离时, 根据所述第一操作, 获得所述操作体的 第一操作参数和第二操作参数, 其中, 所述第一操作参数与所述第二操作参 数不同; 第一改变单元, 基于所述第一操作参数和所述第二操作参数, 将所 述显示对象与所述对象显示区域之间从第一相对位置关系改变为第二相对位 置关系; 第二感测单元, 感测所述操作体相对于所述显示装置的第二操作; 第三获得单元, 根据所述第二操作, 获得所述操作体与所述显示装置之间的 第二距离; 第二判断单元, 判断所述第二距离是否大于第一检测距离并小于 等于第二检测距离; 第四获得单元, 当判断所述第二距离大于第一检测距离 并小于等于第二检测距离时, 根据所述第二操作, 获得所述操作体的不同于 所述第一操作参数的第三操作参数; 以及第二改变单元, 基于所述第一操作 参数和所述第三操作参数, 将所述显示对象与所述对象显示区域之间从第二 相对位置关系改变为第三相对位置关系。
根据本发明另一实施例, 提供了一种电子设备, 包括: 显示器, 其显示 区域中包括对象显示区域, 所述对象显示区域用于显示显示对象, 所述显示 对象与所述对象显示区域之间处于第一相对位置关系; 第一传感器, 感测一 操作体相对于所述电子设备的第一操作; 第二传感器, 感测所述操作体相对 于所述电子设备的第二操作; 以及处理器; 其中, 所述处理器被配置为: 根 据所述第一操作, 获得所述操作体与所述电子设备之间的第一距离; 判断所 述第一距离是否小于第一检测距离; 当判断所述第一距离小于第一检测距离 时, 根据所述第一操作, 获得所述操作体的第一操作参数和第二操作参数, 其中, 所述第一操作参数与所述第二操作参数不同; 基于所述第一操作参数 和所述第二操作参数, 将所述显示对象与所述对象显示区域之间从第一相对 位置关系改变为第二相对位置关系; 根据所述第二操作, 获得所述操作体与 所述电子设备之间的第二距离; 判断所述第二距离是否大于第一检测距离并 小于第二检测距离; 当判断所述第二距离大于第一检测距离并小于第二检测 距离时, 根据所述第二操作, 获得所述操作体的不同于所述第一操作参数的 第三操作参数; 以及基于所述第一操作参数和所述第三操作参数, 将所述显 示对象与所述对象显示区域之间从第二相对位置关系改变为第三相对位置关 系。
在根据本发明该实施例的方法、 显示装置和电子设备中, 通过结合接触 检测和接近检测两者, 使得用户能够以简单的、 符合操作习惯的手势方便地 控制所述显示装置或电子设备来执行相应的操作, 从而改进了用户体验。 附图说明
为了更清楚地说明本发明实施例的技术方案, 下面将对实施例的描述中 所需要使用的附图作简单地介绍。 下面描述中的附图仅仅是本发明的示例性 实施例。
图 1是描述了根据本发明实施例的控制方法的流程图。
图 2A-图 2D是示出了手指接近或离开电子设备的示例情形的说明图。 图 3是图示根据本发明实施例的用于移动显示对象的方法的步骤的流程 图;
图 4A是示出了通过红外传感元件检测操作体在第一检测距离和第二检 测距离之间的运动轨迹的一种示例情形的说明图;
图 4B是图示电容式触摸屏的接近检测原理的示意图;
图 5A-图 5C是图示根据本发明实施例的电子设备上的显示示意图,并且 图 5D和图 5E是分别图示根据本发明实施例的过边界状态和边界状态的示意 图;
图 6A-图 6C是图示根据本发明实施例的电子设备上的显示示意图,并且 图 6D和图 6E是分别图示根据本发明实施例的过边界状态和边界状态的示意 图;
图 7示出根据本发明实施例的控制装置的示范性结构框图;
图 8是示出根据本发明另一实施例的控制装置的示范性结构框图; 图 9是图示根据本发明实施例的显示装置的主要配置的框图; 图 10是图示根据本发明实施例的显示装置的更详细配置的框图; 图 11是示出根据本发明实施例的电子设备的示范性结构框图; 以及 图 12是图示根据本发明另一实施例的电子设备的主要配置的框图。 具体实施方式
在下文中, 将参考附图详细描述本发明的优选实施例。 注意, 在本说明 书和附图中, 具有基本上相同步骤和元素用相同的附图标记来表示, 且对这 些步骤和元素的重复解译将被省略。
在本发明的以下实施例中,电子设备指的是能够与其他设备通信的设备。 电子设备的具体形式包括但不限于移动电话、 个人计算机、 数码相机、 个人 数字助手、 便携式计算机、 平板式计算机、 游戏机等。 电子设备包括传感单 元。 传感单元可包括接近传感器。 例如, 传感单元可包括红外传感元件、 超 声波传感元件、 电容传感元件等接近传感元件中的一个或多个。 传感单元具 有第一检测距离和第二检测距离, 第一检测距离和第二检测距离是相对于电 子设备的距离, 且第二检测距离小于第一检测距离。 例如可将第一检测距离 设置为 10毫米, 而将第二检测距离设置为 0。 即, 第一检测距离为相对于电 子设备 10毫米的高度,而第二检测距离为电子设备中触摸感应区域所在的平 面, 当用户的手指触摸到触摸感应区域时即到达第二检测高度。
可替换地, 传感单元可包括第一传感单元和第二传感单元以分别对操作 体在接触到电子设备时的运动轨迹以及操作体在未接触到电子设备时的运动 轨迹进行检测。 也就是说, 第一传感单元可包括由电容式触摸传感器或者电 阻式触摸传感器之类各种已知传感器元件构成的触摸感应区域。 第二传感单 元可检测操作体在第一距离和第二距离之间的运动轨迹。 第二传感单元可包 括设置在电子设备的不同位置上的多个超声波传感元件、 红外传感元件或者 图像采集元件, 来确定操作体的位置。
在描述本发明实施例之前, 首先, 将参考图 4描述接近检测的原理。 图 4 ( A )是示出了通过红外传感元件检测操作体在第一检测距离和第二 检测距离之间的运动轨迹的一种示例情形的说明图。 如图 4 ( A )所示, 电子 设备 400的第一传感模块(未示出) 包括分别设置在电子设备的左右两侧上 (可替换地, 也可是上下两侧) 的红外线发射端 410和红线接收端 420。 红 外线发射端 410沿电子设备 400的左边缘以预定间隔向红线接收端 420发射 从而当操作体进入第一传感模块的检测范围时, 可根据被操作体遮挡的光线 来确定操作体的运动轨迹。
另一方面, 第二传感模块可检测操作体在第二检测距离处的运动轨迹。 第二传感模块可包括设置在电子设备上的压敏、 静电触摸板或电容触摸板之 类的触摸感应元件。优选地第二传感模块可与电子设备的显示区域重叠设置。 此外, 第一传感模块的检测区域在电子设备上的投影与和第二传感模块在电 子设备上的检测区域相同。
图 4 ( B )是示出了通过电容式触摸屏 400'来检测接近电子设备的操作体 的一种示例情形的说明图。 下面, 将参照图 4 ( B )对通过电容式触摸屏来检 测接近电子设备操作体的示例进行说明。如图 4 ( B )所示,电容式触摸屏 400' 的电场耦合(如图 4 ( B ) 中的虚线所示)范围覆盖了触摸屏上方与触摸屏之 间的距离为 L1的空间, 从而在触摸屏上方与触摸屏之间的距离为 L1的空间 内形成了的有效检测区域。 当操作体进入有效检测区域内时, 可产生足够大 小的可检测耦合电流 I。电容式触摸屏 400,中的控制器可以根据各极点检测到 的电流值来计算手指的精确位置, 以及手指距离屏幕的距离。 更进一步地, 控制器还能够检测出手指是否触摸到屏幕表面。
下面, 将参照图 1来描述根据本发明实施例的控制方法。 图 1是描述了 根据本发明实施例的控制方法 100的流程图。
触摸控制方法 100可用于包括上述传感单元的电子设备。 如图 1所示, 在步骤 S101中,检测操作体相对于电子设备的相对距离小于第一检测距离时 的三维运动轨迹。 操作体的三维运动轨迹是一端位于第一检测距离和第二检 测距离之间而另一端等于或小于第二检测距离的连续运动轨迹。
在有些情况下操作体可能从第一检测距离到达第二检测距离, 在第二检 测距离行进一段轨迹之后又从第二检测距离返回第一检测距离。 此时可根据 当前操作的要求确定三维运动轨迹的起点和终点, 即, 可根据当前操作的要 求确定三维运动轨迹应当是操作体从第一检测距离到达第二检测距离以及在 第二检测距离行进的轨迹, 还是操作体在第二检测距离行进以及从第二检测 距离返回第一检测距离的轨迹。 也就是说, 在步骤 S101中, 当操作体到达第 一检测距离时开始检测操作体的三维运动轨迹直到操作体到达第二检测距离 后又从第二检测距离开为止, 或者当操作体处于第二检测距离或第二检测距 离以下的位置时开始检测操作体的三维运动轨迹直到操作体从第二检测距离 返回第一检测距离为止。
例如, 在通过电子设备阅读电子读物可根据用户的输入进行翻页操作的 情况下,可以用户的手指触碰到电子设备的触摸感应区域中的位置为参考点, 当用户的手指在触碰到参考点之前在空中的位置在触摸感应区域上的投映在 触摸点左侧时(即, 当用户从左侧触碰到参考点时), 可对应于向前翻页的命 令; 而当用户的手指在触碰到参考点之前在空中的位置在触摸感应区域上的 投映在触摸点右侧时(即, 当用户从右侧触碰到参考点时), 可对应于向后翻 页的命令。 在此情况下, 可将第二检测距离设置为零, 以操作体到达第二检 测距离之前位于第一检测距离和第二检测距离之间的一端为三维运动轨迹的 起点, 并且以在第二检测距离上终止的位置或在第二检测距离上行进一段轨 迹之后离开第二检测距离的位置为三维运动轨迹的终点。
另一方面,在通过电子设备观看图像时根据用户的输入进行放大 /缩小操 作的情况下, 可以用户的手指触碰到电子设备的触摸感应区域中的位置为参 考点, 当用户的手指从参考点离开之后在空中的位置在触摸感应区域上的投 映在参考点上方时, 可对应于缩小当前显示的图像的命令; 而当用户的手指 从参考点离开之后在空中的位置在触摸感应区域上的投映在参考点下方时, 可对应于放大当前显示的图像的命令。 在此情况下, 可将第二检测距离设置 为零, 以操作体在第二检测距离上开始触摸的位置为三维运动轨迹的起点, 并且以到达第一检测距离的位置为三维运动轨迹的终点。
此外, 例如用户在触摸感应区域中进行手写输入的情况下, 手指触碰到触摸感应区域时进行笔划输入, 当手指抬起时最后一个笔划根据手指在显示区域中的投映位置延续, 并且手指与电子设备之间的距离越大, 在触摸感应区域中与手指对应的笔划的部分越细, 直到手指与电子设备之间的距离超出第一检测距离为止。 例如, 当输入写 "之" 字时, 在写最后一笔时用户的手指逐渐抬离触摸感应区域, 则当用户手指抬离触摸感应区域时最后一笔随着手指的高度逐渐变细。 可替换地, 还可设置一参考时间, 当用户手指抬离触摸感应区域时, 之前所书写的整个字符的粗细度随着手指距离电子设备的高度而变细, 当手指在低于第一检测距离的位置停留的时间超过所设置的参考时间时, 确定整个字符的粗细度为此时手指高度所对应的粗细度。 在此情况下, 可将第二检测距离设置为零, 以操作体在第二检测距离上的起始位置为三维运动轨迹的起点, 并且以操作体在第一检测距离和第二检测距离之间的终止位置为终点, 或者在操作体一直向上运动、 与电子设备之间的距离超过第一检测距离的情况下, 以操作体在第一检测距离处的位置为三维运动轨迹的终点。
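结合上述手写输入的例子, 下面给出一段示意性的 Python 代码草图(假设性示例, 并非上述实施例的一部分), 示意 "笔划随手指抬起高度变细" 以及 "手指停留超过参考时间后确定整个字符粗细" 的一种可能实现; 其中的检测距离、粗细、容差和时间数值均为假设。

FIRST_DETECT = 10.0    # 第一检测距离(毫米, 假设值)
MAX_WIDTH = 4.0        # 手指触摸时的笔划粗细(像素, 假设值)
MIN_WIDTH = 0.5        # 手指接近第一检测距离时的最小粗细(假设值)
REF_TIME = 0.8         # 参考时间(秒, 假设值)

def stroke_width(height):
    """手指距电子设备越高, 对应的笔划越细; 超出第一检测距离则不再绘制。"""
    if height >= FIRST_DETECT:
        return None
    ratio = height / FIRST_DETECT
    return MAX_WIDTH - (MAX_WIDTH - MIN_WIDTH) * ratio

def settle_width(samples):
    """samples: [(时刻, 高度)]。当手指在低于第一检测距离的某一高度附近
    停留超过参考时间时, 返回该高度对应的粗细作为整个字符最终的粗细。"""
    start_t, start_h = samples[0]
    for t, h in samples[1:]:
        if abs(h - start_h) > 0.5:       # 高度变化明显(容差 0.5 毫米, 假设值), 重新计时
            start_t, start_h = t, h
        elif t - start_t >= REF_TIME and h < FIRST_DETECT:
            return stroke_width(h)
    return None                           # 尚未停留足够长的时间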
可由设备提供商或用户预先设置第一检测距离, 可替换地, 还可根据用 户的操作来设置第一检测距离。 具体地, 当检测到操作体位于一固定位置的 时间超过预定时间时, 可将固定位置相对于电子设备的距离设置为第一检测 距离。
与第一检测距离类似, 可由设备提供商或用户预先设置第二检测距离, 可替换地, 还可根据用户的操作来设置第二检测距离。 具体地, 当检测到操 作体位于相对于电子设备的相对距离小于第一检测距离的一固定位置的时间 超过预定时间时, 将固定位置相对于电子设备的距离设置为第二检测距离。 如上所述, 可将第二检测距离设置为零。 也就是说, 当操作体处于第二检测 距离时可触碰到电子设备。
在步骤 S102中,当操作体到达接近传感单元的第二检测距离或从第二检 测距离离开时, 生成状态改变信号。 在步骤 S 103中, 根据在步骤 S102中生 成的状态改变信号划分三维轨迹, 以获得相对距离等于或小于第二检测距离 时的第一轨迹和相对距离高于第二检测距离时的第二轨迹。 在本实施例中, 第二轨迹可以是相对电子设备位于第二检测距离处的一个点, 或者是由等于 或低于第二检测距离的多个点组成的连续运动轨迹。
最后在步骤 S 104中,根据第一轨迹和第二轨迹执行相应的控制命令。优 选地, 可根据在步骤 S 103中划分的第二轨迹来确定控制命令组。 然后根据第 一轨迹在所确定的控制命令组中选择相应的控制命令, 并且执行所选择的控 制命令。
具体地, 当用户在触摸屏上触碰音量控制按钮时的第二轨迹可对应于音 量调节命令组。 音量调节命令组可包括增大音量、 减小音量、 静音和最大化 音量。 当用户的手指从触摸屏上显示的音量控制按钮的位置离开之后在空中 的位置在触摸感应区域上的投映在音量控制按钮上方时, 第一轨迹可对应于 最大化音量; 当用户的手指从触摸屏上显示的音量控制按钮的位置离开之后 在空中的位置在触摸感应区域上的投映在音量控制按钮下方时, 第一轨迹可 对应于静音; 当用户的手指从触摸屏上显示的音量控制按钮的位置离开之后 在空中的位置在触摸感应区域上的投映在音量控制按钮左侧时, 第一轨迹可 对应于减小音量; 而当用户的手指从触摸屏上显示的音量控制按钮的位置离 开之后在空中的位置在触摸感应区域上的投映在音量控制按钮右侧时, 第一 轨迹可对应于增大音量。
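结合上述音量调节的例子, 下面给出一段示意性的 Python 代码草图(假设性示例, 并非上述实施例的一部分): 它按照本文示例中的用法, 把高于第二检测距离的空中部分记作第一轨迹、等于或低于第二检测距离的触摸部分记作第二轨迹, 并据此确定命令组和具体命令; 其中的命令名称、屏幕坐标约定(y 轴向下)、矩形表示方式和方位判断方式均为假设。

SECOND_DETECT = 0.0    # 第二检测距离, 此处假设为零(即触摸到设备)

def split_trajectory(points):
    """points: [(x, y, z)], z 为操作体相对电子设备的距离。
    返回 (第一轨迹: z 高于第二检测距离的部分, 第二轨迹: z 等于或低于第二检测距离的部分)。"""
    first = [p for p in points if p[2] > SECOND_DETECT]
    second = [p for p in points if p[2] <= SECOND_DETECT]
    return first, second

def command_group(second_track, button_rect):
    """当第二轨迹(触摸点)落在音量控制按钮区域内时, 对应音量调节命令组。"""
    x, y, _ = second_track[-1]
    x0, y0, x1, y1 = button_rect
    return "volume" if x0 <= x <= x1 and y0 <= y <= y1 else None

def select_command(first_track, second_track):
    """根据第一轨迹(空中部分)在触摸感应区域上的投映相对触摸点的方位选择命令。"""
    bx, by, _ = second_track[-1]
    ax, ay, _ = first_track[-1]
    dx, dy = ax - bx, ay - by
    if abs(dx) >= abs(dy):
        return "volume_down" if dx < 0 else "volume_up"   # 左侧减小音量, 右侧增大音量
    return "volume_max" if dy < 0 else "mute"             # 上方最大化音量, 下方静音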
在本实施例的控制方法中, 通过检测操作体在接近或离开电子设备过程 中的三维轨迹, 并将操作体在接近或离开电子设备过程中的三维轨迹与操作 体在传感单元上或在距离电子设备预定高度处进行的操作结合来执行控制命 令, 能够实现更精准的触控操作, 从而带来更好的交互体验。
图 2a-图 2d是示出了手指接近或离开电子设备的示例情形的说明图。 下 面, 将参照图 2a-图 2d, 以手指为例来描述根据本发明实施例操作体接近或 离开电子设备的情形。 假设在图 2a-图 2d所示的实施例中第二检测距离设置 为零。
在图 2a和图 2b所示的实施例中, 三维运动轨迹具有不同的第一轨迹和相同的第二轨迹。 如图 2a所示, 手指首先在低于第一检测距离的 A1点所在的高度沿顺时针方向画弧形, 然后竖直下降, 触碰到电子设备 200a的 B1点后终止运动。 根据图 1中所示的方法, 在图 2a所示的示例中, 当手指触碰到 B1点时生成状态改变信号。 根据状态改变信号, 第一轨迹为在第一检测距离处沿顺时针方向画弧形然后竖直下降、 直到到达 B1点之前的轨迹(可根据不同的设计需要确定第一轨迹是否包括操作体到达或离开第二检测距离时的临界点), 而由于手指到达电子设备 200a后在电子设备 200a上没有移动, 因此第二轨迹为 B1点。 假设在电子设备 200a上触碰一点对应的操作命令组为旋转当前显示的图像, 根据图 2a中所示的第一轨迹可确定沿顺时针方向旋转当前显示的图像的方向。
另一方面, 在图 2b所示的示例中, 手指首先在低于第一检测距离的 A2点所在的高度沿逆时针方向画弧形, 然后竖直下降, 触碰到电子设备 200b的 B2点后终止运动。 根据图 1中所示的方法, 在图 2b所示的示例中, 当手指触碰到 B2点时生成状态改变信号。 根据状态改变信号, 第一轨迹为在第一检测距离处沿逆时针方向画弧形然后竖直下降、 直到到达 B2点之前的轨迹(可根据不同的设计需要确定第一轨迹是否包括操作体到达或离开第二检测距离时的临界点), 而由于手指到达电子设备 200b后在电子设备 200b上没有移动, 因此第二轨迹为 B2点。 假设在电子设备 200b上触碰一点对应的操作命令组为旋转当前显示的图像, 根据图 2b中所示的第一轨迹可确定沿逆时针方向旋转当前显示的图像的方向。
此外, 根据本发明的一个实施例, 三维运动轨迹是在相对于电子设备 200c的相对距离上单调减少或单调增加的运动轨迹。 也就是说, 三维运动轨迹是操作体从第一检测高度接近并到达等于或小于第二检测距离处的连续运动轨迹, 或者操作体从等于或小于第二检测距离处离开而到达第一检测高度的连续运动轨迹。 图 2c是示出了手指从第一检测高度接近并到达第二检测高度的连续运动轨迹。 如图 2c所示, 手指沿箭头所示的方向从位于空中的 A3点移动到电子设备 200c上的 B3点。 根据图 1中所示的方法, 在图 2c所示的示例中, 当手指触碰到 B3点时生成状态改变信号。 根据状态改变信号, 第一轨迹为从 A3点到 B3点的向量, 而由于手指到达电子设备 200c后在电子设备 200c上没有移动, 因此第二轨迹为 B3点。 假设在电子设备 200c上触碰一点对应的操作命令组为移动当前显示的图像, 根据从 A3点到 B3点的向量可确定移动当前显示的图像的方向。
图 2d是示出了手指从第二检测距离处离开而到达第一检测高度的连续运动轨迹。 如图 2d所示, 手指沿箭头所示的方向从位于电子设备 200d上的 B4点移动到空中的 A4点。 根据图 1中所示的方法, 在图 2d所示的示例中, 当手指从 B4点离开时生成状态改变信号。 根据状态改变信号, 第一轨迹为从 B4点到 A4点的向量, 而由于手指在电子设备 200d上没有移动, 因此第二轨迹为 B4点。 假设手指从电子设备 200d上的一点离开所对应的操作命令组为旋转当前显示的图像, 根据从 B4点到 A4点的向量可确定旋转当前显示的图像的方向。
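下面的 Python 代码草图(假设性示例, 并非上述实施例的一部分)示意了如何由第一轨迹两端点构成的向量确定移动方向, 以及如何近似判断空中弧线是顺时针还是逆时针; 其中的屏幕坐标约定(y 轴向下)与正负号约定均为假设。

import math

def direction_from_vector(a, b):
    """a、b: 轨迹两端点在触摸感应区域上的投映坐标 (x, y)。
    返回向量 (dx, dy) 及其角度, 供移动或旋转当前显示的图像使用。"""
    dx, dy = b[0] - a[0], b[1] - a[1]
    angle = math.degrees(math.atan2(dy, dx))
    return (dx, dy), angle

def is_clockwise(arc_points):
    """arc_points: 空中弧线投映的坐标序列 [(x, y)]。
    通过相邻位移向量叉积之和的符号近似判断旋转方向;
    在 y 轴向下的屏幕坐标系中, 叉积和为正可视为顺时针(假设的约定)。"""
    total = 0.0
    for p0, p1, p2 in zip(arc_points, arc_points[1:], arc_points[2:]):
        v1x, v1y = p1[0] - p0[0], p1[1] - p0[1]
        v2x, v2y = p2[0] - p1[0], p2[1] - p1[1]
        total += v1x * v2y - v1y * v2x
    return total > 0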
虽然在以上实施例中以第二检测高度为零为例进行了描述, 但是在本发明的可替换实施例中, 第二检测高度也可以大于零, 即第二检测高度与电子设备之间存在距离, 此时操作体不必触碰到电子设备。
下面, 将结合图 3来描述根据本发明实施例的控制方法在移动显示对象 时的示例应用。 图 3 示出了根据本发明实施例的用于移动显示对象的方法 300。
所述方法 300应用于电子设备中。 所述电子设备诸如便携式移动终端、 个人计算机等。 所述电子设备包括显示单元, 并且其显示区域中包括用于显示显示对象的对象显示区域。 所述显示对象例如包括网页、 图片、 列表或其他各种显示控件(如, 滑块)。 需要指出的是, 所述对象显示区域的面积可以小于所述显示区域的面积, 以便在所述显示区域中除所述对象显示区域以外的其他区域显示背景信息。 当然, 所述对象显示区域的面积也可以等于所述显示区域的面积。 例如, 在图 5和图 6所示的示意图中, 以实线绘出的矩形区域 S1 为电子设备的显示区域, 并且该显示区域中以虚线绘出的矩形区域 S2为该显示区域中的对象显示区域, 其中分别显示了一滑块对象(图 5 )和一列表对象(图 6 )。 需要指出的是, 图 6A中的列表对象包括 "联系人 1"、 "联系人 2" ... "联系人 10" 总计十个项目, 然而由于其面积大于对象显示区域的面积, 因此在对象显示区域中仅显示 "联系人 4"、 "联系人 5" 和 "联系人 6" 三个项目。
此外, 所述电子设备可包括接触传感器和接近传感器两者。 所述接触传 感器用于感测操作体与所述电子设备的接触, 并且所述接近传感器用于感测 操作体与所述电子设备的接近。 替代地, 所述电子设备也可仅包括接近传感 器。 在此情况下, 所述接近传感器设计为既可以用于感测操作体与所述电子 设备的接近, 也可以用于感测操作体与所述电子设备的接触。 以下, 在不需 要特别区分两者的情况下, 将统称为 "传感器"。 此外, 所述显示单元和由所 述传感器构成的感测单元可以上下层叠设置, 也可不必层叠设置, 只要所述 显示单元和所述感测单元之间存在映射关系即可。
在所述方法开始时, 所述显示对象与所述对象显示区域之间处于第一相 对位置关系。
具体地, 当所述显示对象的面积小于所述对象显示区域的面积时, 所述显示对象与所述对象显示区域之间处于第一相对位置关系指的是: 所述显示对象显示在所述对象显示区域的第一位置。 例如, 图 5A 中以带阴影的矩形示出了一滑块对象, 设以其左上顶点所处的位置为该显示对象的位置。 则, 在图 5A中, 显示对象位于对象显示区域的第一位置 P1点处。 当然, 本领域技术人员能够理解, 除左上顶点外, 也可以将该显示对象中的其他任一点设为该显示对象的位置。
另一方面,当所述显示对象的面积大于等于所述对象显示区域的面积时, 所述显示对象与所述对象显示区域之间处于第一相对位置关系指的是: 所述 显示对象的第一部分显示在所述对象显示区域。 例如, 如上所述, 在图 6A 中, 所述对象显示区域中仅示出了包括十个项目的列表对象的第一部分(三 个项目 "联系人 4"、 "联系人 5" 和 "联系人 6" )。
下面, 将参考图 3描述根据本发明实施例的用于移动显示对象的方法。 如图 3所示, 在步骤 S301 , 所述方法感测一操作体相对于所述电子设备 的第一操作。 具体地, 所述操作体例如用户用来操作的手指的指尖、 触笔的 笔尖等。 所述方法通过所述电子设备包括的传感器, 感测由所述操作体在所 述电子设备上形成的多个轨迹点而组成的第一系列轨迹点, 从而感测所述操 作体对电子设备进行的第一操作。 换句话说, 所述操作体以滑动操作的方式 进行所述第一操作。
另一方面, 由于所述传感器能够感测操作体与电子设备之间的距离, 因 此, 在步骤 S302, 所述方法根据传感器的感测结果, 获得所述操作体与所述 电子设备之间的第一距离。
在步骤 S303 , 所述方法判断所述第一距离是否小于等于第一检测距离。 所述第一检测距离是所述方法将所述第一操作识别为接触操作而设置的阈 值。 其值由本领域技术人员根据需要适当地设置。 例如, 当将传感器和显示 器从上至下层叠布置时, 可以将该第一检测距离设置为封装传感器的薄膜的 厚度。 当将保护体(比如玻璃, 用于操作体接触)、 传感器和显示器从上至下 层叠布置时, 可以将该第一检测距离设置为所述保护体的厚度。
当判断所述第一距离大于第一检测距离时, 所述方法结束。 另一方面, 当判断所述第一距离小于等于第一检测距离时, 所述方法进行到步骤 S304。
在步骤 S304, 所述方法根据所述第一操作, 获得所述操作体的第一操作 参数和第二操作参数。 具体地, 如上所述, 所述方法通过所述传感器能够得 到所述第一操作的第一系列轨迹点。 通过所述第一系列轨迹点, 所述方法可 以获得所述操作体的第一操作参数和第二操作参数。
更具体地, 所述方法可以通过所述第一系列轨迹点中的第一轨迹点和最 后轨迹点的位置, 得到所述第一操作的移动方向, 作为所述第一操作参数。 当然, 在所述电子设备支持较高运算能力的情况下, 为提高精度, 所述方法 也可通过所述第一系列轨迹点中的任何两个(相邻的)轨迹点, 得到所述第 一操作的实时移动方向, 作为所述第一操作参数。
此外, 所述方法还可获得与所述第一操作参数不同的第二操作参数。 例 如, 所述方法可以通过所述第一系列轨迹点中的第一轨迹点和最后轨迹点的 位置, 得到所述第一操作的移动距离, 作为所述第二操作参数。 替代地, 所 述方法可通过所述电子设备中的系统时钟的计时, 得到所述第一操作的操作 时间, 作为所述第二操作参数。 替代地, 所述方法可以通过所述第一系列轨 迹点中的第一轨迹点和最后轨迹点的位置、 以及所述系统时钟的计时, 得到 所述第一操作的速度, 作为所述第二操作参数。 当然, 类似地, 所述方法可 以通过所述第一系列轨迹点中的任何两个(相邻的)轨迹点、 以及所述电子 设备的系统时钟的计时, 得到所述第一操作的实时速度, 作为所述第二操作 参数。
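下面给出一段示意性的 Python 代码草图(假设性示例, 并非上述实施例的一部分), 说明如何由一系列轨迹点及系统时钟的计时得到移动方向、移动距离、操作时间和速度等操作参数; 轨迹点的数据格式为假设。

import math

def operation_params(track):
    """track: [(x, y, t)], t 为采样时刻(秒), 至少包含两个轨迹点。"""
    (x0, y0, t0), (x1, y1, t1) = track[0], track[-1]
    dx, dy = x1 - x0, y1 - y0
    distance = math.hypot(dx, dy)                 # 移动距离
    duration = t1 - t0                            # 操作时间
    speed = distance / duration if duration > 0 else 0.0
    direction = math.degrees(math.atan2(dy, dx))  # 移动方向(角度)
    return {"direction": direction, "distance": distance,
            "duration": duration, "speed": speed}

若需要实时的方向或速度, 可改为对任意两个相邻轨迹点调用同样的计算。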
在步骤 S305 , 所述方法基于所述第一操作参数和所述第二操作参数, 将 所述显示对象与所述对象显示区域之间从第一相对位置关系改变为第二相对 位置关系。
具体地, 所述方法首先将所述第一操作参数和所述第二操作参数分别转 换为所述显示对象的第一移动参数和第二移动参数。
更具体地, 例如, 所述方法将所述第一操作的移动方向转换为所述显示对象的移动方向, 作为所述第一移动参数。 例如, 所述方法可以将与所述第一操作的移动方向一致或相反的方向设置为所述显示对象的移动方向。 在以下的描述中, 假设所述显示对象的移动方向与所述第一操作的移动方向相同。
又例如, 所述方法将所述第一操作的移动距离转换为所述显示对象的第 一移动距离, 作为所述第二移动参数。 例如, 所述方法可以通过将所述第一 操作的移动距离与一系数相加而转换得到所述显示对象的第一移动距离。 所 述系数由本领域技术人员根据需要适当地设置。 例如, 所述系数可以设置为 0。 当然, 以上转换方式仅为示例。 本领域技术人员能够理解, 所述方法完全 可以通过将所述第一操作的移动距离经其他方式转换(如乘以一系数, 或根 据分段函数等等) 而得到所述显示对象的第一移动距离作为所述第二移动参 数。
替代地, 所述方法将所述第一操作的操作时间转换为所述显示对象的第 一移动时间, 作为所述第二移动参数。 为符合用户的操作习惯, 优选地, 所 述方法将所述第一操作的操作时间设置为所述显示对象的第一移动时间。 当 然, 本领域技术人员也可将所述第一操作的操作时间经适当算术变换后得到 所述显示对象的第一移动时间, 在此不做具体限定。
替代地, 所述方法将所述第一操作的速度转换为所述显示对象的第一移动速度, 作为所述第二移动参数。 例如, 所述方法可预设一分段函数: 当所述第一操作的速度在 [0, v1) 之间时, 所述方法将所述显示对象的第一移动速度设置为 V1; 当所述第一操作的速度在 [v1, v2) 之间时, 所述方法将所述显示对象的第一移动速度设置为 V2; …当所述第一操作的速度在 [v(n-1), vn) 之间时, 所述方法将所述显示对象的第一移动速度设置为 Vn。 所述 v1, v2, ..., v(n-1), vn 和 V1, V2, ..., Vn 的值可由本领域技术人员根据需要适当地设置。 当然, 与上述类似, 本领域技术人员能够理解, 所述方法完全可通过将所述第一操作的速度经其他方式转换(如乘以一系数, 或与一系数相加等等) 而得到所述显示对象的第一移动速度作为所述第二移动参数。
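下面的 Python 代码草图(假设性示例, 并非上述实施例的一部分)示意了上述分段函数的一种写法; 其中的区间端点与输出速度均为假设值。

V_BOUNDS = [50.0, 150.0, 400.0]         # v1、v2、v3(像素/秒, 假设值)
V_OUT = [30.0, 100.0, 300.0, 600.0]     # V1...V4(像素/秒, 假设值)

def first_move_speed(op_speed):
    """第一操作的速度落在 [0, v1) 时取 V1, 落在 [v1, v2) 时取 V2, 以此类推。"""
    for bound, out in zip(V_BOUNDS, V_OUT):
        if op_speed < bound:
            return out
    return V_OUT[-1]                     # 超过最后一个端点时取 Vn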
此后, 所述方法基于所述第一移动参数和所述第二移动参数, 移动所述 显示对象, 使得所述显示对象与所述对象显示区域之间从第一相对位置关系 改变为第二相对位置关系。
更具体地, 例如, 所述方法可以比较所述显示对象的面积与所述对象显 示区域的面积的大小。 当所述显示对象的面积小于所述对象显示区域的面积 时, 所述方法基于所述显示对象的移动方向, 并且还基于所述显示对象的第 一移动距离、 第一移动时间和第一移动速度中的至少一个, 将所述显示对象 从所述第一位置移动到所述对象显示区域的第二位置, 其中, 所述第二位置 与所述第一位置不同。 例如, 如图 5B所示, 所述方法基于获得的第一移动参 数(向右)和第二移动参数, 将所述显示对象从图 5A所示的第一位置(P1 ) 处移动到以粗实线示出的位置(P2 )处。
另一方面, 当所述显示对象的面积大于等于所述对象显示区域的面积时, 所述方法基于所述显示对象的移动方向, 并且还基于所述显示对象的第一移 动距离、 第一移动时间和第一移动速度中的至少一个, 移动所述显示对象, 以将所述显示对象的第二部分显示在所述对象显示区域。 需要指出的是, 所 述第二部分与所述第一部分至多部分重叠。 换句话说, 在第一移动距离较长、 第一移动时间较长或第一移动速度较快的情况下, 所述第二部分与所述第一 部分可以不重叠。 反之, 在第一移动距离较短、 第一移动时间较短或第一移 动速度较慢的情况下, 所述第二部分与所述第一部分部分重叠。 例如, 如图 6B所示, 所述方法基于获得的第一移动参数(向上)和第二移动参数, 移动 所述显示对象, 以将所述显示对象的第二部分("联系人 5"、 "联系人 6" 和 "联系人 7" 三个项目 )显示在所述对象显示区域。 其中, "联系人 5"和 "联 系人 6"这两个项目既包括在图 6A所示的显示对象的第一部分中,也包括在 图 6B 所示的显示对象的第二部分中, 即, 所述第一部分和所述第二部分之 间部分重叠。
此后, 在步骤 S306, 所述方法感测所述操作体相对于所述电子设备的第 二操作。 具体地, 所述方法通过传感器(具体地, 接近传感器), 感测所述操 作体在所述电子设备上投影形成的第二系列轨迹点, 从而感测所述操作体对 电子设备进行的第二操作。 换句话说, 在步骤 S306的操作中, 所述操作体并 不与所述电子设备接触, 而仅与所述电子设备接近。
由于所述接近传感器能够感测操作体与电子设备之间的距离, 因此, 在 步骤 S307 , 所述方法根据接近传感器的感测结果, 获得所述操作体与所述电 子设备之间的第二距离。
此外, 优选地, 在步骤 S307的处理中, 所述方法以预定定时获得所述操 作体与所述电子设备之间的第二距离。 所述预定定时由本领域技术人员根据 需要适当地确定。 例如, 所述预定定时可设置为 3ms、 5ms等。 由此, 所述 方法实时地获得在所述第二操作中所述操作体与所述电子设备之间的实时距 离。
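下面的 Python 代码草图(假设性示例, 并非上述实施例的一部分)示意了以预定定时轮询接近传感器、实时获得第二距离的一种方式; 其中 read_proximity 与 handle_distance 为假设的回调函数, 定时取 5 毫秒仅为示例。

import time

POLL_INTERVAL = 0.005    # 预定定时(秒), 即约 5 毫秒(假设值)

def poll_second_distance(read_proximity, handle_distance, duration=1.0):
    """在 duration 秒内按预定定时读取操作体与设备之间的第二距离,
    并把每次读到的实时距离交给 handle_distance 处理。"""
    end = time.monotonic() + duration
    while time.monotonic() < end:
        distance = read_proximity()      # 读取当前的第二距离
        handle_distance(distance)
        time.sleep(POLL_INTERVAL)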
在步骤 S308 ,所述方法判断所述第二距离是否大于第一检测距离并小于 等于第二检测距离。 所述第二检测距离是所述方法将所述第二操作识别为接 近操作而设置的阈值。 其值由本领域技术人员根据电子设备的精度适当地设 置, 在此不做具体限定。
当判断所述第二距离大于第二检测距离时, 所述方法结束。 同样地, 当 判断所述第二距离小于等于第一检测距离时, 所述方法结束。 当然, 在此情 况下, 所述方法也可根据现有技术针对作为接触操作的第二操作继续进行相 应的处理, 在此不再描述。 此外, 在以预定定时不断获得所述第二距离的情 况下, 在判断所述第二距离小于等于第一检测距离、 或大于第二检测距离时 的时刻, 所述方法可以保持在该时刻的所述显示对象与所述对象显示区域之 间的相对位置关系不变。
另一方面, 当判断所述第二距离大于第一检测距离并小于等于第二检测距离时, 所述方法进行到步骤 S309。
在步骤 S309, 所述方法根据所述第二操作, 获得所述操作体的不同于所 述第一操作参数的第三操作参数。 具体地, 如上所述, 所述方法通过所述接近传感器能够得到所述第二操 作在所述电子设备上投影形成的第二系列轨迹点。通过所述第二系列轨迹点, 所述方法可以获得所述操作体的第三操作参数。
更具体地, 例如, 所述方法可以通过所述第二系列轨迹点中的第一轨迹 点和最后轨迹点的位置, 得到所述第二操作的移动距离, 作为所述第三操作 参数。 替代地, 所述方法可通过所述电子设备中的系统时钟的计时, 得到所 述第二操作的操作时间, 作为所述第三操作参数。 当然, 所述方法还可通过 上述得到的第二操作的移动距离和第二操作的操作时间, 进一步得到所述第 二操作的速度, 作为所述第三操作参数。
此后, 在步骤 S310, 所述方法基于所述第一操作参数和所述第三操作参 数, 将所述显示对象与所述对象显示区域之间从第二相对位置关系改变为第 三相对位置关系。
具体地, 所述方法首先将所述第三操作参数转换为所述显示对象的第三 移动参数。
更具体地, 例如, 所述方法将所述第二操作的移动距离转换为所述显示 对象的第二移动距离, 作为所述第三移动参数。 例如, 所述方法可以通过将 所述第二操作的移动距离与一系数相加而转换得到所述显示对象的移动距 离。 所述系数由本领域技术人员根据需要适当地设置。 优选地, 所述系数可 以设置得大于 0。 由此, 用户可以通过相对较短移动距离的操作而将所述显 示对象移动较长距离。 当然, 以上转换方式仅为示例。 本领域技术人员能够 理解,所述方法完全可以通过将所述第二操作的移动距离经其他方式转换(如 乘以一系数, 或根据分段函数等等) 而得到所述显示对象的第二移动距离作 为所述第三移动参数。
替代地, 所述方法将所述第二操作的操作时间转换为所述显示对象的第 二移动时间, 作为所述第三移动参数。 为符合用户的操作习惯, 优选地, 所 述方法将所述第二操作的操作时间设置为所述显示对象的第二移动时间。 当 然, 本领域技术人员也可将所述第二操作的操作时间经适当算术变换后得到 所述显示对象的第二移动时间, 在此不做具体限定。
替代地, 所述方法将所述第二操作的速度转换为所述显示对象的第二移 动速度, 作为所述第三移动参数。 与上述类似, 本领域技术人员能够理解, 所述方法完全可通过将所述第二操作的速度经各种方式转换(如乘以一系数、 与一系数相加、 或通过分段函数等等) 而得到所述显示对象的第二移动速度 作为所述第三移动参数。
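下面的 Python 代码草图(假设性示例, 并非上述实施例的一部分)示意了把第二操作的移动距离和操作时间换算为第三移动参数的一种方式, 其中采用 "移动距离乘以大于 1 的系数、操作时间原样沿用" 的换算; 系数取值为假设。

HOVER_GAIN = 3.0    # 大于 1 的距离换算系数(假设值)

def third_move_params(op_distance, op_duration):
    """op_distance、op_duration: 第二操作(悬空滑动)的移动距离与操作时间。
    返回 (第二移动距离, 第二移动时间, 第二移动速度)。"""
    move_distance = op_distance * HOVER_GAIN    # 较短的悬空滑动即可移动较长距离
    move_time = op_duration                     # 操作时间直接作为移动时间
    move_speed = move_distance / move_time if move_time > 0 else 0.0
    return move_distance, move_time, move_speed

采用大于 1 的系数即对应于上文所述通过较短的悬空滑动距离将显示对象移动较长距离的效果。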
此后, 所述方法基于所述第一移动参数和所述第三移动参数, 移动所述 显示对象, 使得所述显示对象与所述对象显示区域之间从第二相对位置关系 改变为第三相对位置关系。
更具体地, 例如, 所述方法可以参考在步骤 S305获得的显示对象的面积 与对象显示区域的面积之间的关系。 当所述显示对象的面积小于所述对象显 示区域的面积时, 所述方法基于所述显示对象的移动方向, 并且还基于所述 显示对象的第二移动距离、 第二移动时间和第二移动速度中的至少一个, 将 所述显示对象从所述第二位置移动到所述对象显示区域的第三位置, 其中, 所述第三位置与所述第二位置不同。 例如, 如图 5C所示, 所述方法基于获得 的第一移动参数 (向右)和第三移动参数, 将所述显示对象从图 5B所示的第 二位置(P2 )处移动到以粗实线示出的位置(P3 )处。
另一方面, 当所述显示对象的面积大于等于所述对象显示区域的面积时, 所述方法基于所述显示对象的移动方向, 并且还基于所述显示对象的第二移 动距离、 第二移动时间和第二移动速度中的至少一个, 移动所述显示对象, 以将所述显示对象的第三部分显示在所述对象显示区域。 需要指出的是, 所 述第三部分与所述第二部分至多部分重叠。 换句话说, 在移动距离较长、 移 动时间较长或移动速度较快的情况下, 所述第三部分与所述第二部分可以不 重叠。 反之, 在移动距离较短、 移动时间较短或移动速度较慢的情况下, 所 述第三部分与所述第二部分部分重叠。 例如, 如图 6C所示, 所述方法基于获 得的第一移动参数(向上)和第三移动参数, 移动所述显示对象, 以将所述 显示对象的第三部分("联系人 7"、 "联系人 8" 和 "联系人 9" 三个项目 ) 显示在所述对象显示区域。 其中, 项目 "联系人 7"既包括在图 6B 所示的显 示对象的第二部分中, 也包括在图 6C 所示的显示对象的第三部分中, 即, 所述第二部分和所述第三部分之间部分重叠。
在上述实施例中, 所述方法通过所述接近传感器所感测的第二操作的第 二系列轨迹点包括多个轨迹点。 换句话说, 所述操作体以在所述电子设备上 方滑行的方式进行第二操作。 替代地, 所述方法所感测到的第二系列轨迹点 也可仅包括单个轨迹点。 换句话说, 所述操作体可以悬空停留在所述电子设 备的上方。 在此情况下, 同样地, 所述方法基于如上所述获得的所述显示对 象的移动方向、 以及所述显示对象的第二移动时间, 以所述方法预定的单位 移动速度移动所述显示对象, 使得所述显示对象与所述对象显示区域之间从 第二相对位置关系改变为第三相对位置关系。
优选地, 在此情况下, 所述方法还可获得所述第二距离, 并基于所述第二距离控制所述显示对象的第二移动速度。 例如, 所述方法可预设一分段函数: 当所述第二距离在 (d1, d2] 之间时, 所述方法将所述显示对象的第二移动速度设置为 V'1; 当所述第二距离在 (d2, d3] 之间时, 所述方法将所述显示对象的第二移动速度设置为 V'2; …当所述第二距离在 (d(n-1), dn] 之间时, 所述方法将所述显示对象的第二移动速度设置为 V'n。 所述 d1, d2, ..., d(n-1), dn 和 V'1, V'2, ..., V'n 的值可由本领域技术人员根据需要适当地设置。 需要指出的是, 在上述方法中, d1 对应于第一检测距离, 并且 dn 对应于第二检测距离。 当然, 与上述类似, 本领域技术人员能够理解, 所述方法完全可通过将所述第二距离经其他方式转换(如乘以一系数等等)而得到所述显示对象的第二移动速度。 由此, 用户可通过调整操作体与电子设备之间的距离来控制显示对象的移动速度, 并且可以在调整的同时直观地识别该距离与显示对象的移动速度之间的关系, 从而能够精确并快速地将所期望的内容显示在对象显示区域上。
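下面的 Python 代码草图(假设性示例, 并非上述实施例的一部分)示意了按第二距离分段设定第二移动速度的一种写法; 其中的区间端点 d1...d4、速度 V'1...V'3 以及 "距离越近速度越快" 的取向均为假设。

D_BOUNDS = [2.0, 10.0, 25.0, 40.0]    # d1...d4(毫米, 假设值; d1 对应第一检测距离, d4 对应第二检测距离)
V_OUT = [600.0, 300.0, 100.0]         # V'1...V'3(像素/秒, 假设值)

def hover_scroll_speed(second_distance):
    """第二距离落在 (d_i, d_{i+1}] 时返回 V'_i; 超出检测范围时返回 None。"""
    if second_distance <= D_BOUNDS[0] or second_distance > D_BOUNDS[-1]:
        return None
    for low, high, out in zip(D_BOUNDS, D_BOUNDS[1:], V_OUT):
        if low < second_distance <= high:
            return out
    return None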
此外, 在步骤 S310的操作中, 在如上所述确定所述显示对象的移动距离 和移动方向之后, 优选地, 所述方法可以基于所述移动距离, 判断移动后的 显示对象是否会处于过边界状态。 当判断移动后的显示对象会处于过边界状 态时, 移动所述显示对象, 直到所述显示对象处于边界状态, 作为所述显示 对象与所述对象显示区域之间的第三相对位置关系。
其中, 当所述显示对象的面积小于所述对象显示区域的面积时, 所述过 边界状态为: 所述显示对象中与所述移动方向对应的第一对象边界超出所述 对象显示区域; 所述边界状态为: 所述第一对象边界与所述对象显示区域中 与所述移动方向对应的第一显示边界重合。 例如, 如图 5D 示意性所示, 在 移动方向为向右的情况下, 所述过边界状态为所述滑块对象的右边界(图中 以粗实线示出)超出了对象显示区域 S2的右边界。 在此情况下, 所述方法移 动所述显示对象, 直到如图 5Ε所示的边界状态。 此时, 所述滑块对象的右边 界与所述对象显示区域 S2的右边界重合。
当所述显示对象的面积大于等于所述对象显示区域的面积时, 所述过边 界状态为: 所述显示对象中与所述移动方向反向对应的第二对象边界位于所 述对象显示区域中; 所述边界状态为: 所述第二对象边界与所述对象显示区 域中与所述移动方向反向对应的第二显示边界重合。 例如, 如图 6D 示意性 所示, 在移动方向为向上的情况下, 所述过边界状态为所述列表对象的下边 界(图中以粗实线示出)位于所述对象显示区域中。 在此情况下, 所述方法 移动所述显示对象, 直到如图 6E所示的边界状态。 此时, 所述列表对象的下 边界与所述对象显示区域 S2的下边界重合。
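下面的 Python 代码草图(假设性示例, 并非上述实施例的一部分)示意了上述过边界判断与 "移动到边界状态为止" 的一种实现方式, 分别对应显示对象面积小于对象显示区域(以向右移动为例)和大于等于对象显示区域(以向上滚动为例)两种情况; 坐标约定(y 轴向下)与矩形表示方式均为假设。

def clamp_move_right(obj_left, obj_width, area_left, area_width, dx):
    """显示对象面积小于对象显示区域、向右移动 dx 时:
    若会出现过边界状态, 则只移动到对象右边界与区域右边界重合为止。"""
    obj_right = obj_left + obj_width
    area_right = area_left + area_width
    if obj_right + dx > area_right:      # 过边界状态
        return area_right - obj_right    # 移动到边界状态为止
    return dx

def clamp_scroll_up(obj_top, obj_height, area_top, area_height, dy_up):
    """显示对象面积大于等于对象显示区域、向上滚动 dy_up 时:
    对象下边界不得进入区域内部, 至多与区域下边界重合。"""
    obj_bottom = obj_top + obj_height
    area_bottom = area_top + area_height
    if obj_bottom - dy_up < area_bottom:     # 过边界状态
        return obj_bottom - area_bottom      # 移动到下边界重合为止
    return dy_up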
以上,描述了根据本发明实施例的方法。在根据本发明实施例的方法中, 通过在第一操作期间执行接触感测、 并在第二操作期间执行接近感测, 并根 据接触感测和接近感测的结果执行相应处理, 使得用户能够以简单的、 符合 操作习惯的手势(具体地, 先接触滑动、 后悬空停留的手势, 或先接触滑动、 后悬空滑动的手势), 来方便地控制所述电子设备执行相应的操作, 从而改进 了用户体验。
例如, 如上所述, 在一些电子设备的解锁操作中, 用户需要将一显示对 象(如滑块)从触摸显示屏上的一位置移动到另一位置。 在根据本发明实施 例的方法中, 通过在步骤 S310的操作中的适当设置, 例如, 将作为第三操作 参数的操作体在第二操作中的移动距离乘以大于 1 的系数, 转换得到所述显 示对象的第二移动距离, 使得用户能够通过较短的悬空滑动距离来将显示对 象移动较长距离, 从而整体上缩短了用户操作所需的操作体的移动距离, 并 缩短了用户的操作时间, 从而更加符合用户的操作习惯, 改进了用户体验。
又例如, 在电子设备中常见的显示对象滚动操作中, 通过根据本发明实 施例的方法 (尤其是步骤 S306-S310的操作 ), 为了看到触摸显示屏上当前未 显示的内容, 用户做出较短距离的接触滑动操作以使得显示对象在一方向上 开始滚动, 此后, 仅需做出悬空停留操作, 所述显示对象便可在该方向上继 续滚动, 直到所期望的内容显示在对象显示区域中, 从而使得用户能够通过 较短距离的接触滑动操作以及此后的方便的悬空停留操作来将显示对象移动 较长距离。 此外, 优选地, 用户还可通过调整操作体与电子设备之间的距离 来控制显示对象的移动速度, 并且可以在调整的同时直观地识别该距离与显 示对象的移动速度之间的关系, 从而能够精确并快速地将所期望的内容显示 在对象显示区域上, 改进了用户体验。
接下来, 将参考图 7来说明根据本发明实施例的控制装置。 图 7是示出根据本发明实施例的控制装置 700的示范性结构框图。 控制装置 700可应用于电子设备。 如图 7中所示, 本实施例的控制装置 700包括传感单元 710、 状态检测单元 720、 轨迹划分单元 730和执行单元 740。 控制装置 700的各个单元执行上述图 1中的控制方法的各个步骤/功能, 因此, 为了描述简洁, 不再具体描述。
例如, 传感单元 710具有第一检测距离和第二检测距离。 传感单元 710 可检测操作体相对于电子设备的相对距离小于第一检测距离时的三维运动轨 迹, 其中第一检测距离和第二检测距离是相对于电子设备的距离, 且第二检 测距离小于第一检测距离, 三维运动轨迹是一端位于第一检测距离和第二检 测距离之间而另一端等于或小于第二检测距离的连续运动轨迹。
如上所述, 在有些情况下操作体可能从第一检测距离到达第二检测距离, 在第二检测距离行进一段轨迹之后又从第二检测距离返回第一检测距离。 此时可根据当前操作的要求确定三维运动轨迹的起点和终点, 即, 可根据当前操作的要求确定三维运动轨迹应当是操作体从第一检测距离到达第二检测距离以及在第二检测距离行进的轨迹, 还是操作体在第二检测距离行进以及从第二检测距离返回第一检测距离的轨迹。 也就是说, 当操作体到达第一检测距离时传感单元 710开始检测操作体的三维运动轨迹, 直到操作体到达第二检测距离后又从第二检测距离离开为止; 或者当操作体处于第二检测距离或第二检测距离以下的位置时传感单元 710开始检测操作体的三维运动轨迹, 直到操作体从第二检测距离返回第一检测距离为止。
可由设备提供商或用户预先设置传感单元 710的第一检测距离, 可替换地, 传感单元 710还可根据用户的操作来设置第一检测距离。 具体地, 当检测到操作体位于一固定位置的时间超过预定时间时, 传感单元 710可将固定位置相对于电子设备的距离设置为第一检测距离。
与第一检测距离类似, 可由设备提供商或用户预先设置传感单元 710的 第二检测距离, 可替换地, 传感单元 710还可根据用户的操作来设置第二检 测距离。 具体地, 当检测到操作体位于相对于电子设备的相对距离小于第一 检测距离的一固定位置的时间超过预定时间时, 传感单元 710将固定位置相 对于电子设备的距离设置为第二检测距离。 如上所述, 可将传感单元的第二 检测距离设置为零。 也就是说, 当操作体处于第二检测距离时可触碰到电子 设备。
根据本发明的一个实施例, 三维运动轨迹是在相对于电子设备的相对距 离上单调减少或单调增加的运动轨迹。 也就是说, 三维运动轨迹是操作体从 第一检测高度接近并到达等于或小于第二检测距离处的连续运动轨迹, 或者 操作体从等于或小于第二检测距离处离开而到达第一检测高度的连续运动轨 迹。
当操作体到达接近传感单元的第二检测距离或从第二检测距离离开时, 状态检测单元 720可生成状态改变信号。 轨迹划分单元 730可根据状态检测 单元 720生成的状态改变信号划分三维轨迹, 以获得相对距离等于或小于第 二检测距离时的第一轨迹和相对距离高于第二检测距离时的第二轨迹。 在本 实施例中, 第二轨迹可以是相对电子设备位于第二检测距离处的一个点, 或 者是由等于或低于第二检测距离的多个点组成的连续运动轨迹。执行单元 740 可根据所述第一轨迹和所述第二轨迹执行相应的控制命令。
在本实施例的控制装置中, 通过检测操作体在接近或离开电子设备过程 中的三维轨迹, 并将操作体在接近或离开电子设备过程中的三维轨迹与操作 体在传感单元上或在距离电子设备预定高度处进行的操作结合来执行控制命 令, 能够实现更精准的触控操作, 从而带来更好的交互体验。
下面, 参照图 8说明根据本发明另一实施例的控制装置。 图 8是示出根 据本发明另一实施例的控制装置 800的示范性结构框图。 控制装置 800可应 用于电子设备。 如图 8中所示, 与控制装置 700类似, 本实施例的控制装置 800包括传感单元 810、状态检测单元 820和轨迹划分单元 830。传感单元 810 具有第一检测距离和第二检测距离。 传感单元 810可检测操作体相对于电子 设备的相对距离小于第一检测距离时的三维运动轨迹, 其中第一检测距离和 第二检测距离是相对于电子设备的距离,且第二检测距离小于第一检测距离, 三维运动轨迹是一端位于第一检测距离和第二检测距离之间而另一端等于或 小于第二检测距离的连续运动轨迹。
当操作体到达接近传感单元的第二检测距离或从第二检测距离离开时, 状态检测单元 820可生成状态改变信号。 轨迹划分单元 830可根据状态检测 单元 820生成的状态改变信号划分三维轨迹, 以获得相对距离等于或小于第 二检测距离时的第一轨迹和相对距离高于第二检测距离时的第二轨迹。
控制装置 800还包括执行单元 840。 如图 8所示, 执行单元 840可包括 命令组确定模块 841、 命令选择模块 842和命令执行模块 843。 具体地, 命令 组确定模块 841可根据第二轨迹确定控制命令组。 命令选择模块 842可根据 第一轨迹在命令组确定模块所确定的控制命令组中选择相应的控制命令。 命 令执行模块 843可执行命令选择模块所选择的控制命令。
具体地, 当使用控制装置 800播放音乐时, 命令组确定模块 841可根据用户在触摸屏上触碰音量控制按钮确定第二轨迹可对应于音量调节命令组。 音量调节命令组可包括增大音量、 减小音量、 静音和最大化音量。 当用户的手指从触摸屏上显示的音量控制按钮的位置离开之后在空中的位置在触摸感应区域上的投映在音量控制按钮上方时, 命令选择模块 842可确定第一轨迹可对应于最大化音量; 当用户的手指从触摸屏上显示的音量控制按钮的位置离开之后在空中的位置在触摸感应区域上的投映在音量控制按钮下方时, 命令选择模块 842可确定第一轨迹可对应于静音; 当用户的手指从触摸屏上显示的音量控制按钮的位置离开之后在空中的位置在触摸感应区域上的投映在音量控制按钮左侧时, 命令选择模块 842可确定第一轨迹可对应于减小音量; 而当用户的手指从触摸屏上显示的音量控制按钮的位置离开之后在空中的位置在触摸感应区域上的投映在音量控制按钮右侧时, 命令选择模块 842可确定第一轨迹可对应于增大音量。 命令执行模块 843可执行命令选择模块 842所选择的音量调节命令。
在根据本发明实施例中, 传感单元可包括红外传感元件、 超声波传感元 件、 电容传感元件中的一个或多个。 例如, 可通过电容传感元件来检测操作 体相对于电子设备的相对距离小于第一检测距离时的三维运动轨迹。具体地, 可根据上述实施例中的方法确定第一检测距离对应的第一响应电容值和第二 检测距离对应的第二响应电容值。 传感单元可根据操作体所产生响应电容值 来进行相应的操作。
可替换地, 在将第二检测距离设置为零的情况下, 三维运动轨迹是一端 位于第一检测距离和第二检测距离之间而另一端等于第二检测距离的连续运 动轨迹。 传感单元可包括第一传感模块和第二传感模块以分别对操作体在第 一检测距离和第二检测距离之间的运动轨迹以及在第二检测距离处的运动轨 迹进行检测。 也就是说, 第一传感模块可检测操作体在第一检测距离和第二 检测距离之间的运动轨迹。 第一传感模块可包括设置在电子设备的不同位置 上的多个超声波传感元件、 红外传感元件或者成像装置, 来确定操作体的位 置。
下面,将结合图 9和图 10来描述根据本发明实施例的控制装置在移动显 示对象时的示例应用。图 9和图 10分别示出了根据本发明实施例的实施用于 移动显示对象的方法的显示装置 900和 1000。
所述显示装置诸如便携式移动终端、 个人计算机等。 所述显示装置包括 显示单元, 并且其显示区域中包括用于显示显示对象的对象显示区域。 所述 显示对象例如包括网页、 图片、 列表或其他各种显示控件(如, 滑块)。
此外, 所述显示装置可包括接触传感器和接近传感器两者。 所述接触传 感器用于感测操作体与所述显示装置的接触, 并且所述接近传感器用于感测 操作体与所述显示装置的接近。 替代地, 所述显示装置也可仅包括接近传感 器。 在此情况下, 所述接近传感器设计为既可以用于感测操作体与所述显示 装置的接近, 也可以用于感测操作体与所述显示装置的接触。 以下, 在不需 要特别区分两者的情况下, 将统称为 "传感器"。
此外, 所述显示对象与所述对象显示区域之间处于第一相对位置关系。 如上所述, 当所述显示对象的面积小于所述对象显示区域的面积时, 所述显 示对象与所述对象显示区域之间处于第一相对位置关系指的是: 所述显示对 象显示在所述对象显示区域的第一位置。 当所述显示对象的面积大于等于所 述对象显示区域的面积时, 所述显示对象与所述对象显示区域之间处于第一 相对位置关系指的是: 所述显示对象的第一部分显示在所述对象显示区域。
如图 9和图 10所示, 显示装置 900包括: 第一感测单元 901、 第一获得 单元 902、 第一判断单元 903、 第二获得单元 904、 第一改变单元 905、 第二 感测单元 906、 第三获得单元 907、 第二判断单元 908、 第四获得单元 909和 第二改变单元 910。
具体地, 第一感测单元 901例如为上述接触传感器, 其感测一操作体相 对于所述显示装置的第一操作。所述操作体例如用户用来操作的手指的指尖、 触笔的笔尖等。 所述第一感测单元 901感测由所述操作体在所述显示装置上 形成的多个轨迹点而组成的第一系列轨迹点 , 从而感测所述操作体对显示装 置进行的第一操作。
第一获得单元 902根据第一感测单元 901的感测结果, 获得所述操作体 与所述显示装置之间的第一距离。
第一判断单元 903判断所述第一距离是否小于等于第一检测距离。 所述 第一检测距离是所述显示装置将所述第一操作识别为接触操作而设置的阈 值。 其值由本领域技术人员根据需要适当地设置。 当判断所述第一距离小于等于第一检测距离时, 第二获得单元 904根据 所述第一操作, 获得所述操作体的第一操作参数和第二操作参数, 其中, 所 述第一操作参数与所述第二操作参数不同。 具体地, 如上所述, 所述第一感 测单元 901 能够得到所述第一操作的第一系列轨迹点。 通过所述第一系列轨 迹点, 所述第二获得单元 904可以获得所述操作体的第一操作参数和第二操 作参数。
更具体地, 所述第二获得单元 904可以通过所述第一系列轨迹点中的第 一轨迹点和最后轨迹点的位置, 得到所述第一操作的移动方向, 作为所述第 一操作参数。 当然, 在所述显示装置支持较高运算能力的情况下, 为提高精 度, 所述第二获得单元 904也可通过所述第一系列轨迹点中的任何两个(相 邻的)轨迹点, 得到所述第一操作的实时移动方向, 作为所述第一操作参数。
此外, 所述第二获得单元 904还可获得与所述第一操作参数不同的第二 操作参数。 例如, 所述第二获得单元 904可以通过所述第一系列轨迹点中的 第一轨迹点和最后轨迹点的位置, 得到所述第一操作的移动距离, 作为所述 第二操作参数。 替代地, 所述第二获得单元 904可通过所述显示装置中的系 统时钟的计时, 得到所述第一操作的操作时间, 作为所述第二操作参数。 替 代地, 所述第二获得单元 904可以通过所述第一系列轨迹点中的第一轨迹点 和最后轨迹点的位置、 以及所述系统时钟的计时, 得到所述第一操作的速度, 作为所述第二操作参数。 当然, 类似地, 所述第二获得单元 904可以通过所 述第一系列轨迹点中的任何两个(相邻的)轨迹点、 以及所述显示装置的系 统时钟的计时, 得到所述第一操作的实时速度, 作为所述第二操作参数。
第一改变单元 905基于所述第一操作参数和所述第二操作参数, 将所述 显示对象与所述对象显示区域之间从第一相对位置关系改变为第二相对位置 关系。
具体地, 所述第一改变单元 905可以包括如图 10所示的第一转换单元
1001 , 将所述第一操作参数和所述第二操作参数分别转换为所述显示对象的 第一移动参数和第二移动参数; 以及第一移动单元 1002, 基于所述第一移动 参数和所述第二移动参数, 移动所述显示对象, 以将所述显示对象与所述对 象显示区域之间从第一相对位置关系改变为第二相对位置关系。
更具体地, 例如, 所述第一转换单元 1001将所述第一操作的移动方向转换为所述显示对象的移动方向, 作为所述第一移动参数。 例如, 所述第一转换单元 1001 可以将与所述第一操作的移动方向一致或相反的方向设置为所述显示对象的移动方向。 在以下的描述中, 假设所述显示对象的移动方向与所述第一操作的移动方向相同。
又例如,所述第一转换单元 1001将所述第一操作的移动距离转换为所述 显示对象的第一移动距离, 作为所述第二移动参数。 例如, 所述第一转换单 元 1001 可以通过将所述第一操作的移动距离与一系数相加而转换得到所述 显示对象的第一移动距离。所述系数由本领域技术人员根据需要适当地设置。 例如, 所述系数可以设置为 0。 当然, 以上转换方式仅为示例。 本领域技术 人员能够理解,所述第一转换单元 1001完全可以通过将所述第一操作的移动 距离经其他方式转换(如乘以一系数, 或根据分段函数等等) 而得到所述显 示对象的第一移动距离作为所述第二移动参数。
替代地,所述第一转换单元 1001将所述第一操作的操作时间转换为所述 显示对象的第一移动时间, 作为所述第二移动参数。 为符合用户的操作习惯, 优选地,所述第一转换单元 1001将所述第一操作的操作时间设置为所述显示 对象的第一移动时间。 当然, 本领域技术人员也可将所述第一操作的操作时 间经适当算术变换后得到所述显示对象的第一移动时间,在此不做具体限定。
替代地, 所述第一转换单元 1001将所述第一操作的速度转换为所述显示对象的第一移动速度, 作为所述第二移动参数。 例如, 所述第一转换单元 1001 可预设一分段函数: 当所述第一操作的速度在 [0, v1) 之间时, 所述第一转换单元 1001将所述显示对象的第一移动速度设置为 V1; 当所述第一操作的速度在 [v1, v2) 之间时, 所述第一转换单元 1001将所述显示对象的第一移动速度设置为 V2; …当所述第一操作的速度在 [v(n-1), vn) 之间时, 所述第一转换单元 1001将所述显示对象的第一移动速度设置为 Vn。 所述 v1, v2, ..., v(n-1), vn 和 V1, V2, ..., Vn 的值可由本领域技术人员根据需要适当地设置。 当然, 与上述类似, 本领域技术人员能够理解, 所述第一转换单元 1001完全可通过将所述第一操作的速度经其他方式转换(如乘以一系数, 或与一系数相加等等) 而得到所述显示对象的第一移动速度作为所述第二移动参数。
此后,所述第一移动单元 1002基于所述第一移动参数和所述第二移动参 数, 移动所述显示对象, 使得所述显示对象与所述对象显示区域之间从第一 相对位置关系改变为第二相对位置关系。
更具体地, 例如, 所述第一移动单元 1002可以比较所述显示对象的面积 与所述对象显示区域的面积的大小。 当所述显示对象的面积小于所述对象显 示区域的面积时, 所述第一移动单元 1002基于所述显示对象的移动方向, 并 且还基于所述显示对象的第一移动距离、 第一移动时间和第一移动速度中的 至少一个, 将所述显示对象从所述第一位置移动到所述对象显示区域的第二 位置, 其中, 所述第二位置与所述第一位置不同。
另一方面, 当所述显示对象的面积大于等于所述对象显示区域的面积时, 所述第一移动单元 1002基于所述显示对象的移动方向,并且还基于所述显示 对象的第一移动距离、 第一移动时间和第一移动速度中的至少一个, 移动所 述显示对象, 以将所述显示对象的第二部分显示在所述对象显示区域。 需要 指出的是, 所述第二部分与所述第一部分至多部分重叠。
第二感测单元 906例如是所述接近传感器, 其感测所述操作体相对于所 述显示装置的第二操作。 具体地, 所述第二感测单元 906感测所述操作体在 所述显示装置上投影形成的第二系列轨迹点, 从而感测所述操作体对显示装 置进行的第二操作。 此外, 所述第二感测单元 906可以与所述第一感测单元 901分离设置为两个单元, 也可合并为一个单元。
第三获得单元 907根据所述第二操作, 获得所述操作体与所述显示装置 之间的第二距离。 此外, 优选地, 所述第三获得单元 907以预定定时获得所 述操作体与所述显示装置之间的第二距离。 所述预定定时由本领域技术人员 根据需要适当地确定。 例如, 所述预定定时可设置为 3ms、 5ms等。 由此, 所述第三获得单元 907实时地获得在所述第二操作中所述操作体与所述显示 装置之间的实时距离。
第二判断单元 908判断所述第二距离是否大于第一检测距离并小于等于 第二检测距离。 所述第二检测距离是将所述第二操作识别为接近操作而设置 的阈值。 其值由本领域技术人员根据显示装置的精度适当地设置, 在此不做 具体限定。
此外, 优选地, 所述显示装置还可包括如图 10所示的保持单元 1003。 当第三获得单元 907 以预定定时不断获得所述第二距离时, 在第二判断单元 908 判断所述第二距离小于等于第一检测距离、 或大于第二检测距离时的时 刻,所述保持单元 1003可以保持在该时刻的所述显示对象与所述对象显示区 域之间的相对位置关系不变。
当判断所述第二距离大于第一检测距离并小于等于第二检测距离时, 第 四获得单元 909根据所述第二操作, 获得所述操作体的不同于所述第一操作 参数的第三操作参数。
具体地, 如上所述, 所述第二感测单元 906能够得到所述第二操作在所 述显示装置上投影形成的第二系列轨迹点。 通过所述第二系列轨迹点, 所述 第四获得单元 909可以获得所述操作体的第三操作参数。
更具体地, 例如, 所述第四获得单元 909可以通过所述第二系列轨迹点 中的第一轨迹点和最后轨迹点的位置, 得到所述第二操作的移动距离, 作为 所述第三操作参数。 替代地, 所述第四获得单元 909可通过所述显示装置中 的系统时钟的计时, 得到所述第二操作的操作时间, 作为所述第三操作参数。 当然, 所述第四获得单元 909还可通过上述得到的第二操作的移动距离和第 二操作的操作时间, 进一步得到所述第二操作的速度, 作为所述第三操作参 数。
第二改变单元 910基于所述第一操作参数和所述第三操作参数, 将所述 显示对象与所述对象显示区域之间从第二相对位置关系改变为第三相对位置 关系。
具体地, 所述第二改变单元 910 可包括如图 10 所示的第二转换单元 1004, 将所述第三操作参数转换为所述显示对象的第三移动参数; 以及第二 移动单元 1005 , 基于所述第一移动参数和所述第三移动参数, 移动所述显示 对象, 以将所述显示对象与所述对象显示区域之间从第二相对位置关系改变 为第三相对位置关系。
所述第二转换单元 1004 首先将所述第三操作参数转换为所述显示对象 的第三移动参数。
更具体地, 例如, 所述第二转换单元 1004将所述第二操作的移动距离转 换为所述显示对象的第二移动距离, 作为所述第三移动参数。 例如, 所述第 二转换单元 1004 可以通过将所述第二操作的移动距离与一系数相加而转换 得到所述显示对象的移动距离。 所述系数由本领域技术人员根据需要适当地 设置。 优选地, 所述系数可以设置得大于 0。 由此, 用户可以通过相对较短 移动距离的操作而将所述显示对象移动较长距离。 当然, 以上转换方式仅为 示例。 本领域技术人员能够理解, 所述第二转换单元 1004完全可以通过将所 述第二操作的移动距离经其他方式转换(如乘以一系数, 或根据分段函数等 等) 而得到所述显示对象的第二移动距离作为所述第三移动参数。 替代地,所述第二转换单元 1004将所述第二操作的操作时间转换为所述 显示对象的第二移动时间, 作为所述第三移动参数。 为符合用户的操作习惯, 优选地,所述第二转换单元 1004将所述第二操作的操作时间设置为所述显示 对象的第二移动时间。 当然, 本领域技术人员也可将所述第二操作的操作时 间经适当算术变换后得到所述显示对象的第二移动时间,在此不做具体限定。
替代地,所述第二转换单元 1004将所述第二操作的速度转换为所述显示 对象的第二移动速度, 作为所述第三移动参数。 与上述类似, 本领域技术人 员能够理解,所述第二转换单元 1004完全可通过将所述第二操作的速度经各 种方式转换(如乘以一系数、 与一系数相加、 或通过分段函数等等) 而得到 所述显示对象的第二移动速度作为所述第三移动参数。
此后,所述第二移动单元 1005基于所述第一移动参数和所述第三移动参 数, 移动所述显示对象, 使得所述显示对象与所述对象显示区域之间从第二 相对位置关系改变为第三相对位置关系。
更具体地, 例如, 所述第二移动单元 1005 可以参考所述第一移动单元 1002获得的显示对象的面积与对象显示区域的面积之间的关系。 当所述显示 对象的面积小于所述对象显示区域的面积时,所述第二移动单元 1005基于所 述显示对象的移动方向, 并且还基于所述显示对象的第二移动距离、 第二移 动时间和第二移动速度中的至少一个, 将所述显示对象从所述第二位置移动 到所述对象显示区域的第三位置, 其中, 所述第三位置与所述第二位置不同。 另一方面, 当所述显示对象的面积大于等于所述对象显示区域的面积时, 所 述第二移动单元 1005基于所述显示对象的移动方向,并且还基于所述显示对 象的第二移动距离、 第二移动时间和第二移动速度中的至少一个, 移动所述 显示对象, 以将所述显示对象的第三部分显示在所述对象显示区域。 需要指 出的是, 所述第三部分与所述第二部分至多部分重叠。
在上述实施例中, 所述第二感测单元 906感测的第二系列轨迹点包括多 个轨迹点。 换句话说, 所述操作体以在所述显示装置上方滑行的方式进行第 二操作。 替代地, 所述第二感测单元 906所感测到的第二系列轨迹点也可仅 包括单个轨迹点。 换句话说, 所述操作体可以悬空停留在所述显示装置的上 方。 在此情况下, 同样地, 所述第二移动单元 1005基于如上所述获得的所述 显示对象的移动方向、 以及所述显示对象的第二移动时间, 移动所述显示对 象, 使得所述显示对象与所述对象显示区域之间从第二相对位置关系改变为 第三相对位置关系。
优选地, 在此情况下, 所述第四获得单元 909还可获得所述第二距离, 并且所述第二改变单元 910基于所述第二距离控制所述显示对象的第二移动速度。 例如, 所述第二改变单元 910可预设一分段函数: 当所述第二距离在 (d1, d2] 之间时, 所述第二改变单元 910将所述显示对象的第二移动速度设置为 V'1; 当所述第二距离在 (d2, d3] 之间时, 所述第二改变单元 910将所述显示对象的第二移动速度设置为 V'2; …当所述第二距离在 (d(n-1), dn] 之间时, 所述第二改变单元 910将所述显示对象的第二移动速度设置为 V'n。 所述 d1, d2, ..., d(n-1), dn 和 V'1, V'2, ..., V'n 的值可由本领域技术人员根据需要适当地设置。 需要指出的是, d1 对应于第一检测距离, 并且 dn 对应于第二检测距离。 当然, 与上述类似, 本领域技术人员能够理解, 所述第二改变单元 910完全可通过将所述第二距离经其他方式转换(如乘以一系数等等) 而得到所述显示对象的第二移动速度。 由此, 用户可通过调整操作体与显示装置之间的距离来控制显示对象的移动速度, 并且可以在调整的同时直观地识别该距离与显示对象的移动速度之间的关系, 从而能够精确并快速地将所期望的内容显示在对象显示区域上。
此外, 优选地, 所述第二改变单元 910可以包括确定单元(未示出), 基 于所述第一操作参数和所述第三操作参数, 确定所述显示对象的移动距离和 移动方向; 第三判断单元(未示出), 基于所述移动距离, 判断移动后的显示 对象是否会处于过边界状态; 以及第三移动单元(未示出), 当第三判断单元 判断移动后的显示对象会处于过边界状态时, 移动所述显示对象, 直到所述 显示对象处于边界状态, 作为所述显示对象与所述对象显示区域之间的第三 相对位置关系。
以上, 描述了根据本发明实施例的显示装置。 在根据本发明实施例的显 示装置中, 通过第一感测单元执行接触感测, 并通过第二感测单元执行接近 感测, 从而根据接触感测和接近感测的结果执行相应处理, 使得用户能够以 简单的、 符合操作习惯的手势(具体地, 先接触滑动、 后悬空停留的手势, 或先接触滑动、后悬空滑动的手势), 来方便地控制所述显示装置执行相应的 操作, 从而改进了用户体验。
下面, 参照图 11说明根据本发明实施例的电子设备。 图 11是示出根据本发明实施例的电子设备 1100的示范性结构框图。 如图 11 中所示, 本实施例的电子设备 1100包括传感单元 1110和处理单元 1120。
具体地, 传感单元 1110具有第一检测距离和第二检测距离。 如上所述, 传感单元可包括红外传感元件、 超声波传感元件、 电容传感元件等接近传感 元件中的一个或多个。传感单元 1110可检测操作体相对于电子设备的相对距 离小于第一检测距离时的三维运动轨迹。 第一检测距离和第二检测距离是相 对于电子设备的距离, 且第二检测距离小于第一检测距离。 三维运动轨迹是 一端位于第一检测距离和第二检测距离之间而另一端等于或小于第二检测距 离的连续运动轨迹。
如上所述, 在有些情况下操作体可能从第一检测距离到达第二检测距离, 在第二检测距离行进一段轨迹之后又从第二检测距离返回第一检测距离。 此时可根据当前操作的要求确定三维运动轨迹的起点和终点, 即, 可根据当前操作的要求确定三维运动轨迹应当是操作体从第一检测距离到达第二检测距离以及在第二检测距离行进的轨迹, 还是操作体在第二检测距离行进以及从第二检测距离返回第一检测距离的轨迹。 也就是说, 当操作体到达第一检测距离时传感单元 1110开始检测操作体的三维运动轨迹, 直到操作体到达第二检测距离后又从第二检测距离离开为止; 或者当操作体处于第二检测距离或第二检测距离以下的位置时传感单元 1110开始检测操作体的三维运动轨迹, 直到操作体从第二检测距离返回第一检测距离为止。 在本实施例中, 第二轨迹可以是相对电子设备位于第二检测距离处的一个点, 或者是由等于或低于第二检测距离的多个点组成的连续运动轨迹。
可由设备提供商或用户预先设置传感单元 1110的第一检测距离,可替换 地, 传感单元 1110还可根据用户的操作来设置第一检测距离。 具体地, 当检 测到操作体位于一固定位置的时间超过预定时间时,传感单元 1110可将固定 位置相对于电子设备的距离设置为第一检测距离。
与第一检测距离类似,可由设备提供商或用户预先设置传感单元 1110的 第二检测距离, 可替换地, 传感单元 1110还可根据用户的操作来设置第二检 测距离。 具体地, 当检测到操作体位于相对于电子设备的相对距离小于第一 检测距离的一固定位置的时间超过预定时间时,传感单元 1110将固定位置相 对于电子设备的距离设置为第二检测距离。 如上所述, 可将传感单元的第二 检测距离设置为零。 也就是说, 当操作体处于第二检测距离时可触碰到电子 设备。 根据本发明的一个实施例, 三维运动轨迹是在相对于电子设备的相对距 离上单调减少或单调增加的运动轨迹。 也就是说, 三维运动轨迹是操作体从 第一检测高度接近并到达等于或小于第二检测距离处的连续运动轨迹, 或者 操作体从等于或小于第二检测距离处离开而到达第一检测高度的连续运动轨 迹。
处理单元 1120可包括状态检测模块 1121、 轨迹划分模块 1122和执行模 块 1123。 当操作体到达接近传感单元的第二检测距离或从第二检测距离离开 时, 状态检测模块 1121可生成状态改变信号。 轨迹划分模块 1122可根据状 态检测模块 1121生成的状态改变信号划分三维轨迹,以获得相对距离等于或 小于第二检测距离时的第一轨迹和相对距离高于第二检测距离时的第二轨 迹。 执行模块 1123可根据第一轨迹和第二轨迹执行相应的控制命令。
在本实施例的电子设备中, 通过检测操作体在接近或离开电子设备过程 中的三维轨迹, 并将操作体在接近或离开电子设备过程中的三维轨迹与操作 体在传感单元上或在距离电子设备预定高度处进行的操作结合来执行控制命 令, 能够实现更精准的触控操作, 从而带来更好的交互体验。
下面将参照图 12描述根据本发明实施例的电子设备的另一示例。
如图 12所示, 所述电子设备 1200包括: 第一传感器 1201、 第二传感器 1202、处理器 1203和显示器 1204。 所述第一传感器 1201、 第二传感器 1202、 处理器 1203和显示器 1204之间可通信地耦合。
显示器 1204的显示区域中包括对象显示区域,所述对象显示区域用于显 示显示对象,所述显示对象与所述对象显示区域之间处于第一相对位置关系。
第一传感器 1201例如为接触传感器,其感测一操作体相对于所述电子设 备的第一操作。 第二传感器 1202例如为接近传感器, 其感测所述操作体相对 于所述电子设备的第二操作。 所述第一传感器 1201和第二传感器 1202可以 分离设置为两个单元, 也可以合并为一个单元。
处理器 1203例如为中央处理单元或微处理器,其被配置为:根据所述第 一操作, 获得所述操作体与所述电子设备之间的第一距离; 判断所述第一距 离是否小于第一检测距离; 当判断所述第一距离小于第一检测距离时, 根据 所述第一操作, 获得所述操作体的第一操作参数和第二操作参数, 其中, 所 述第一操作参数与所述第二操作参数不同; 基于所述第一操作参数和所述第 二操作参数, 将所述显示对象与所述对象显示区域之间从第一相对位置关系 改变为第二相对位置关系; 根据所述第二操作, 获得所述操作体与所述电子 设备之间的第二距离; 判断所述第二距离是否大于第一检测距离并小于第二 检测距离; 当判断所述第二距离大于第一检测距离并小于第二检测距离时, 根据所述第二操作, 获得所述操作体的不同于所述第一操作参数的第三操作 参数; 以及基于所述第一操作参数和所述第三操作参数, 将所述显示对象与 所述对象显示区域之间从第二相对位置关系改变为第三相对位置关系。
所述处理器 1203的具体处理已经参照图 3在根据本发明实施例的控制方 法中详细描述, 在此不再重复。
以上, 描述了根据本发明实施例的电子设备。 在根据本发明实施例的电 子设备中, 通过第一传感器执行接触感测, 并通过第二传感器执行接近感测, 从而根据接触感测和接近感测的结果执行相应处理,使得用户能够以简单的、 符合操作习惯的手势(具体地, 先接触滑动、 后悬空停留的手势, 或先接触 滑动、 后悬空滑动的手势), 来方便地控制所述电子设备执行相应的操作, 从 而改进了用户体验。
以上,参照图 1到图 12描述了根据本发明实施例的控制方法及其相应控 制装置、 用于移动显示对象的方法及其相应显示装置、 及电子设备。
需要说明的是, 在本说明书中, 术语 "包括"、 "包含" 或者其任何其他 变体意在涵盖非排他性的包含, 从而使得包括一系列要素的过程、 方法、 物 品或者设备不仅包括那些要素, 而且还包括没有明确列出的其他要素, 或者 是还包括为这种过程、 方法、 物品或者设备所固有的要素。 在没有更多限制 的情况下, 由语句 "包括一个 ... ... " 限定的要素, 并不排除在包括所述要素 的过程、 方法、 物品或者设备中还存在另外的相同要素。
最后, 还需要说明的是, 上述一系列处理不仅包括以这里所述的顺序按 时间序列执行的处理, 而且包括并行或分别地、 而不是按时间顺序执行的处 理。
通过以上的实施方式的描述, 本领域的技术人员可以清楚地了解到本发 明可借助软件加必需的硬件平台的方式来实现, 当然也可以全部通过硬件来 实施。 基于这样的理解, 本发明的技术方案对背景技术做出贡献的全部或者 部分可以以软件产品的形式体现出来, 该计算机软件产品可以存储在存储介 质中, 如 ROM/RAM、 磁碟、 光盘等, 包括若干指令用以使得一台计算机设 备(可以是个人计算机, 服务器, 或者网络设备等)执行本发明各个实施例 或者实施例的某些部分所述的方法。
在本发明实施例中, 单元 /模块可以用软件实现, 以便由各种类型的处理 器执行。 举例来说, 一个标识的可执行代码模块可以包括计算机指令的一个 或多个物理或者逻辑块, 举例来说, 其可以被构建为对象、 过程或函数。 尽 管如此, 所标识模块的可执行代码无需物理地位于一起, 而是可以包括存储 在不同位里上的不同的指令, 当这些指令逻辑上结合在一起时, 其构成单元 / 模块并且实现该单元 /模块的规定目的。
在单元 /模块可以利用软件实现时, 考虑到现有硬件工艺的水平, 所以可 以以软件实现的单元 /模块, 在不考虑成本的情况下, 本领域技术人员都可以 搭建对应的硬件电路来实现对应的功能, 所述硬件电路包括常规的超大规模 集成(VLSI ) 电路或者门阵列以及诸如逻辑芯片、 晶体管之类的现有半导体 或者是其它分立的元件。 模块还可以用可编程硬件设备, 诸如现场可编程门 阵列、 可编程阵列逻辑、 可编程逻辑设备等实现。
本领域普通技术人员可以意识到, 结合本文中所公开的实施例描述的各 示例的单元及算法步骤, 能够以电子硬件、 计算机软件或者二者的结合来实 现, 为了清楚地说明硬件和软件的可互换性, 在上述说明中已经按照功能一 般性地描述了各示例的组成及步骤。 这些功能究竟以硬件还是软件方式来执 行, 取决于技术方案的特定应用和设计约束条件。 专业技术人员可以对每个 特定的应用来使用不同方法来实现所描述的功能, 但是这种实现不应认为超 出本发明的范围。
尽管已示出和描述了本发明的一些实施例, 但本领域技术人员应理解, 在不脱离本发明的原理和精神的情况下, 可对这些实施例进行各种修改, 这 样的修改应落入本发明的范围内。

Claims

权 利 要 求 书
1. 一种控制方法, 应用于电子设备, 所述电子设备包括传感单元, 其中 所述传感单元具有第一检测距离和第二检测距离, 所述第一检测距离和所述 第二检测距离是相对于所述电子设备的距离, 且所述第二检测距离小于所述 第一检测距离, 所述方法包括:
检测操作体相对于所述电子设备的相对距离小于所述第一检测距离时的 三维运动轨迹, 其中所述三维运动轨迹是一端位于所述第一检测距离和第二 检测距离之间而另一端等于或小于所述第二检测距离的连续运动轨迹;
当所述操作体到达所述接近传感单元的第二检测距离或从所述第二检测 距离离开时, 生成状态改变信号;
根据所述状态改变信号划分所述三维轨迹, 以获得所述相对距离等于或 小于所述第二检测距离时的第一轨迹和所述相对距离高于所述第二检测距离 时的第二轨迹; 以及
根据所述第一轨迹和所述第二轨迹执行相应的控制命令。
2. 如权利要求 1所述的方法, 其中
当检测到所述操作体位于一固定位置的时间超过预定时间时, 将所述固 定位置相对于所述电子设备的距离设置为所述第一检测距离。
3. 如权利要求 1所述的方法, 其中
当检测到所述操作体位于相对于所述电子设备的所述相对距离小于所述 第一检测距离的一固定位置的时间超过预定时间时, 将所述固定位置相对于 所述电子设备的距离设置为所述第二检测距离。
4. 如权利要求 1所述的方法, 其中
所述三维运动轨迹是在所述相对距离单调减少或单调增加的运动轨迹。
5. 如权利要求 1所述的方法, 其中所述根据所述第一轨迹和所述第二轨 迹执行相应的控制命令包括:
根据所述第二轨迹确定控制命令组;
根据所述第一轨迹在所确定的所述控制命令组中选择相应的控制命令; 以及
执行所选择的控制命令。
6. 如权利要求 1所述的方法, 其中检测操作体相对于所述电子设备的相 对距离小于所述第一检测距离时的三维运动轨迹包括:
感测操作体相对于所述电子设备的第一操作,
根据所述第一操作, 获得所述操作体与所述电子设备之间的第一距 离,
判断所述第一距离是否小于等于第一检测距离,
当判断所述第一距离小于等于第一检测距离时,根据所述第一操作, 获得所述操作体的第一操作参数和第二操作参数, 其中, 所述第一操作 参数与所述第二操作参数不同,
感测所述操作体相对于所述电子设备的第二操作,
根据所述第二操作, 获得所述操作体与所述电子设备之间的第二距 离,
判断所述第二距离是否大于第一检测距离并小于等于第二检测距 离, 以及
当判断所述第二距离大于第一检测距离并小于等于第二检测距离 时, 根据所述第二操作, 获得所述操作体的不同于所述第一操作参数的 第三操作参数,
根据所述第一轨迹和所述第二轨迹执行相应的控制命令包括:
基于所述第一操作参数和所述第二操作参数, 将所述显示对象与所 述对象显示区域之间从第一相对位置关系改变为第二相对位置关系, 以 及
基于所述第一操作参数和所述第三操作参数, 将所述显示对象与所 述对象显示区域之间从第二相对位置关系改变为第三相对位置关系。
7. 一种控制装置, 应用于电子设备, 所述装置包括:
传感单元, 具有第一检测距离和第二检测距离, 配置来检测操作体相对 于所述电子设备的相对距离小于所述第一检测距离时的三维运动轨迹, 其中 所述第一检测距离和所述第二检测距离是相对于所述电子设备的距离, 且所 述第二检测距离小于所述第一检测距离, 所述三维运动轨迹是一端位于所述 第一检测距离和第二检测距离之间而另一端等于或小于所述第二检测距离的 连续运动轨迹;
状态检测单元, 配置来当所述操作体到达所述接近传感单元的第二检测 距离或从所述第二检测距离离开时, 生成状态改变信号; 轨迹划分单元, 配置来根据所述状态改变信号划分所述三维轨迹, 以获 得所述相对距离等于或小于所述第二检测距离时的第一轨迹和所述相对距离 高于所述第二检测距离时的第二轨迹; 以及
执行单元, 配置来根据所述第一轨迹和所述第二轨迹执行相应的控制命 令。
8. 如权利要求 7所述的装置, 其中
所述传感单元还配置来当检测到所述操作体位于一固定位置的时间超过 预定时间时, 将所述固定位置相对于所述电子设备的距离设置为所述第一检 测距离。
9. 如权利要求 7所述的装置, 其中
所述传感单元还配置来当检测到所述操作体位于相对于所述电子设备的 所述相对距离小于所述第一检测距离的一固定位置的时间超过预定时间时, 将所述固定位置相对于所述电子设备的距离设置为所述第二检测距离。
10. 如权利要求 7所述的装置, 其中所述执行单元包括:
命令组确定模块, 配置来根据所述第二轨迹确定控制命令组;
命令选择模块, 配置来根据所述第一轨迹在所述命令组确定模块所确定 的控制命令组中选择相应的控制命令; 以及
命令执行模块, 配置来执行所选择的控制命令。
11. 如权利要求 7所述的装置, 其中
所述第二检测距离为零;
所述三维运动轨迹是一端位于所述第一检测距离和第二检测距离之间而 另一端等于所述第二检测距离的连续运动轨迹;
所述传感单元包括:
第一传感模块, 配置来检测所述操作体在所述第一检测距离和第二 检测距离之间的运动轨迹; 以及
第二传感模块, 配置来检测所述操作体在第二检测距离处的运动轨 迹。
12. 如权利要求 7所述的装置, 其中
所述传感单元包括:
第一感测单元, 感测一操作体相对于所述显示装置的第一操作, 第一获得单元, 根据所述第一操作, 获得所述操作体与所述显示装 置之间的第一距离,
第一判断单元, 判断所述第一距离是否小于等于第一检测距离, 第二获得单元, 当判断所述第一距离小于等于第一检测距离时, 根 据所述第一操作, 获得所述操作体的第一操作参数和第二操作参数, 其 中, 所述第一操作参数与所述第二操作参数不同,
第二感测单元, 感测所述操作体相对于所述显示装置的第二操作, 第三获得单元, 根据所述第二操作, 获得所述操作体与所述显示装 置之间的第二距离,
第二判断单元, 判断所述第二距离是否大于第一检测距离并小于等 于第二检测距离, 以及
第四获得单元, 当判断所述第二距离大于第一检测距离并小于等于 第二检测距离时, 根据所述第二操作, 获得所述操作体的不同于所述第 一操作参数的第三操作参数,
所述执行单元包括:
第一改变单元, 基于所述第一操作参数和所述第二操作参数, 将所 述显示对象与所述对象显示区域之间从第一相对位置关系改变为第二相 对位置关系, 以及
第二改变单元, 基于所述第一操作参数和所述第三操作参数, 将所 述显示对象与所述对象显示区域之间从第二相对位置关系改变为第三相 对位置关系。
13. 一种电子设备, 包括:
传感单元, 具有第一检测距离和第二检测距离, 配置来检测操作体相对 于所述电子设备的相对距离小于所述第一检测距离时的三维运动轨迹, 其中 所述第一检测距离和所述第二检测距离是相对于所述电子设备的距离, 且所 述第二检测距离小于所述第一检测距离, 所述三维运动轨迹是一端位于所述 第一检测距离和第二检测距离之间而另一端等于或小于所述第二检测距离的 连续运动轨迹;
处理单元, 包括:
状态检测模块, 配置来当所述操作体到达所述接近传感单元的第二 检测距离或从所述第二检测距离离开时, 生成状态改变信号;
轨迹划分模块, 配置来根据所述状态改变信号划分所述三维轨迹, 以获得所述相对距离等于或小于所述第二检测距离时的第一轨迹和所述相对 距离高于所述第二检测距离时的第二轨迹; 以及
执行模块, 配置来根据所述第一轨迹和所述第二轨迹执行相应的控 制命令。
14. 一种用于移动显示对象的方法, 应用于电子设备中, 所述电子设备 包括显示单元, 所述显示对象与所述显示单元的显示区域中用于显示所述显 示对象的对象显示区域之间处于第一相对位置关系, 所述方法包括:
感测一操作体相对于所述电子设备的第一操作;
根据所述第一操作, 获得所述操作体与所述电子设备之间的第一距离; 判断所述第一距离是否小于等于第一检测距离;
当判断所述第一距离小于等于第一检测距离时, 根据所述第一操作, 获 得所述操作体的第一操作参数和第二操作参数, 其中, 所述第一操作参数与 所述第二操作参数不同;
基于所述第一操作参数和所述第二操作参数, 将所述显示对象与所述对 象显示区域之间从第一相对位置关系改变为第二相对位置关系;
感测所述操作体相对于所述电子设备的第二操作;
根据所述第二操作, 获得所述操作体与所述电子设备之间的第二距离; 判断所述第二距离是否大于第一检测距离并小于等于第二检测距离; 当判断所述第二距离大于第一检测距离并小于等于第二检测距离时, 根 据所述第二操作, 获得所述操作体的不同于所述第一操作参数的第三操作参 数; 以及
基于所述第一操作参数和所述第三操作参数, 将所述显示对象与所述对 象显示区域之间从第二相对位置关系改变为第三相对位置关系。
15. 如权利要求 14所述的方法, 其中, 将所述显示对象与所述对象显示 区域之间从第一相对位置关系改变为第二相对位置关系包括:
将所述第一操作参数和所述第二操作参数分别转换为所述显示对象的第 一移动参数和第二移动参数; 以及
基于所述第一移动参数和所述第二移动参数, 移动所述显示对象, 使得 所述显示对象与所述对象显示区域之间从第一相对位置关系改变为第二相对 位置关系;
并且, 将所述显示对象与所述对象显示区域之间从第二相对位置关系改 变为第三相对位置关系包括:
将所述第三操作参数转换为所述显示对象的第三移动参数; 以及 基于所述第一移动参数和所述第三移动参数, 移动所述显示对象, 使得 所述显示对象与所述对象显示区域之间从第二相对位置关系改变为第三相对 位置关系。
16. 如权利要求 14所述的方法, 其中, 当所述显示对象的面积小于所述 对象显示区域的面积时, 所述显示对象与所述对象显示区域之间处于第一相 对位置关系为: 所述显示对象显示在所述对象显示区域的第一位置;
并且, 将所述显示对象与所述对象显示区域之间从第一相对位置关系改 变为第二相对位置关系包括: 将所述显示对象从所述第一位置移动到所述对 象显示区域的第二位置, 其中, 所述第二位置与所述第一位置不同;
将所述显示对象与所述对象显示区域之间从第二相对位置关系改变为第 三相对位置关系包括: 将所述显示对象从所述第二位置移动到所述对象显示 区域的第三位置, 其中, 所述第三位置与所述第二位置不同。
17. 如权利要求 14所述的方法, 其中, 当所述显示对象的面积大于等于 所述对象显示区域的面积时, 所述显示对象与所述对象显示区域之间处于第 一相对位置关系为: 所述显示对象的第一部分显示在所述对象显示区域; 并且, 将所述显示对象与所述对象显示区域之间从第一相对位置关系改 变为第二相对位置关系包括: 移动所述显示对象, 以将所述显示对象的第二 部分显示在所述对象显示区域, 其中, 所述第二部分与所述第一部分至多部 分重叠;
将所述显示对象与所述对象显示区域之间从第二相对位置关系改变为第 三相对位置关系包括: 移动所述显示对象, 以将所述显示对象的第三部分显 示在所述对象显示区域, 其中, 所述第三部分与所述第二部分至多部分重叠。
18. 如权利要求 14所述的方法, 其中,
所述第一操作参数包括所述操作体在第一操作中的移动方向;
所述第二操作参数包括所述操作体在第一操作中的移动距离、 速度和操 作时间中的至少一个; 并且
所述第三操作参数包括所述操作体在第二操作中的移动距离、 所述操作 体在第二操作中的操作时间中的至少一个。
19. 如权利要求 14所述的方法, 其中, 所述根据所述第二操作获得所述 操作体与所述电子设备之间的第二距离包括:
根据所述第二操作, 以预定定时获得所述操作体与所述电子设备之间的 第二距离。
20. 如权利要求 19所述的方法, 还包括:
在判断所述第二距离小于等于第一检测距离、 或大于第二检测距离时的 时刻, 保持在该时刻的所述显示对象与所述对象显示区域之间的相对位置关 系不变。
21. 如权利要求 14所述的方法, 其中, 将所述显示对象与所述对象显示 区域之间从第二相对位置关系改变为第三相对位置关系包括:
基于所述第一操作参数和所述第三操作参数, 确定所述显示对象的移动距离和移动方向;
基于所述移动距离, 判断移动后的显示对象是否会处于过边界状态; 以 及
当判断移动后的显示对象会处于过边界状态时, 移动所述显示对象, 直 到所述显示对象处于边界状态, 作为所述显示对象与所述对象显示区域之间 的第三相对位置关系;
其中, 当所述显示对象的面积小于所述对象显示区域的面积时, 所述过 边界状态为: 所述显示对象中与所述移动方向对应的第一对象边界超出所述 对象显示区域; 所述边界状态为: 所述第一对象边界与所述对象显示区域中 与所述移动方向对应的第一显示边界重合; 并且
当所述显示对象的面积大于等于所述对象显示区域的面积时, 所述过边 界状态为: 所述显示对象中与所述移动方向反向对应的第二对象边界位于所 述对象显示区域中; 所述边界状态为: 所述第二对象边界与所述对象显示区 域中与所述移动方向反向对应的第二显示边界重合。
22. —种显示装置, 其显示区域中包括对象显示区域, 所述对象显示区 域用于显示显示对象, 所述显示对象与所述对象显示区域之间处于第一相对 位置关系, 所述显示装置包括:
第一感测单元, 感测一操作体相对于所述显示装置的第一操作; 第一获得单元, 根据所述第一操作, 获得所述操作体与所述显示装置之 间的第一距离;
第一判断单元, 判断所述第一距离是否小于等于第一检测距离; 第二获得单元, 当判断所述第一距离小于等于第一检测距离时, 根据所 述第一操作, 获得所述操作体的第一操作参数和第二操作参数, 其中, 所述 第一操作参数与所述第二操作参数不同;
第一改变单元, 基于所述第一操作参数和所述第二操作参数, 将所述显 示对象与所述对象显示区域之间从第一相对位置关系改变为第二相对位置关 系;
第二感测单元, 感测所述操作体相对于所述显示装置的第二操作; 第三获得单元, 根据所述第二操作, 获得所述操作体与所述显示装置之 间的第二距离;
第二判断单元, 判断所述第二距离是否大于第一检测距离并小于等于第 二检测距离;
第四获得单元, 当判断所述第二距离大于第一检测距离并小于等于第二 检测距离时, 根据所述第二操作, 获得所述操作体的不同于所述第一操作参 数的第三操作参数; 以及
第二改变单元, 基于所述第一操作参数和所述第三操作参数, 将所述显 示对象与所述对象显示区域之间从第二相对位置关系改变为第三相对位置关 系。
23. 如权利要求 22所述的显示装置, 其中, 所述第一改变单元包括: 第一转换单元, 将所述第一操作参数和所述第二操作参数分别转换为所 述显示对象的第一移动参数和第二移动参数; 以及
第一移动单元, 基于所述第一移动参数和所述第二移动参数, 移动所述 显示对象, 以将所述显示对象与所述对象显示区域之间从第一相对位置关系 改变为第二相对位置关系;
并且, 所述第二改变单元包括:
第二转换单元, 将所述第三操作参数转换为所述显示对象的第三移动参 数; 以及
第二移动单元, 基于所述第一移动参数和所述第三移动参数, 移动所述 显示对象, 以将所述显示对象与所述对象显示区域之间从第二相对位置关系 改变为第三相对位置关系。
24. 如权利要求 22所述的显示装置, 其中, 所述第三获得单元根据所述 第二操作, 以预定定时获得所述操作体与所述显示装置之间的第二距离。
25. 如权利要求 24所述的显示装置, 还包括:
保持单元, 在第二判断单元判断所述第二距离小于等于第一检测距离、 或大于第二检测距离时的时刻, 保持在该时刻的所述显示对象与所述对象显 示区域之间的相对位置关系不变。
26. 如权利要求 22所述的显示装置, 其中, 所述第二改变单元包括: 确定单元, 基于所述第一操作参数和所述第三操作参数, 确定所述显示 对象的移动距离和移动方向;
第三判断单元, 基于所述移动距离, 判断移动后的显示对象是否会处于 过边界状态; 以及
第三移动单元, 当第三判断单元判断移动后的显示对象会处于过边界状 态时, 移动所述显示对象, 直到所述显示对象处于边界状态, 作为所述显示 对象与所述对象显示区域之间的第三相对位置关系;
其中, 当所述显示对象的面积小于所述对象显示区域的面积时, 所述过 边界状态为: 所述显示对象中与所述移动方向对应的第一对象边界超出所述 对象显示区域; 所述边界状态为: 所述第一对象边界与所述对象显示区域中 与所述移动方向对应的第一显示边界重合; 并且
当所述显示对象的面积大于等于所述对象显示区域的面积时, 所述过边 界状态为: 所述显示对象中与所述移动方向反向对应的第二对象边界位于所 述对象显示区域中; 所述边界状态为: 所述第二对象边界与所述对象显示区 域中与所述移动方向反向对应的第二显示边界重合。
27. 一种电子设备, 包括:
显示器, 其显示区域中包括对象显示区域, 所述对象显示区域用于显示 显示对象, 所述显示对象与所述对象显示区域之间处于第一相对位置关系; 第一传感器, 感测一操作体相对于所述电子设备的第一操作;
第二传感器, 感测所述操作体相对于所述电子设备的第二操作; 以及 处理器;
其中, 所述处理器被配置为:
根据所述第一操作, 获得所述操作体与所述电子设备之间的第一距离; 判断所述第一距离是否小于第一检测距离;
当判断所述第一距离小于第一检测距离时, 根据所述第一操作, 获得所 述操作体的第一操作参数和第二操作参数, 其中, 所述第一操作参数与所述 第二操作参数不同;
基于所述第一操作参数和所述第二操作参数, 将所述显示对象与所述对 象显示区域之间从第一相对位置关系改变为第二相对位置关系;
根据所述第二操作, 获得所述操作体与所述电子设备之间的第二距离; 判断所述第二距离是否大于第一检测距离并小于第二检测距离; 当判断所述第二距离大于第一检测距离并小于第二检测距离时, 根据所 述第二操作, 获得所述操作体的不同于所述第一操作参数的第三操作参数; 以及
基于所述第一操作参数和所述第三操作参数, 将所述显示对象与所述对 象显示区域之间从第二相对位置关系改变为第三相对位置关系。
PCT/CN2012/072033 2011-03-07 2012-03-07 控制方法、控制装置、显示装置以及电子设备 WO2012119548A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/003,687 US10345912B2 (en) 2011-03-07 2012-03-07 Control method, control device, display device and electronic device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201110054512.2 2011-03-07
CN201110054512.2A CN102681702B (zh) 2011-03-07 2011-03-07 控制方法、控制装置以及电子设备
CN201110061129.X 2011-03-14
CN201110061129.XA CN102681750B (zh) 2011-03-14 2011-03-14 用于移动显示对象的方法、显示装置和电子设备

Publications (1)

Publication Number Publication Date
WO2012119548A1 true WO2012119548A1 (zh) 2012-09-13

Family

ID=46797495

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2012/072033 WO2012119548A1 (zh) 2011-03-07 2012-03-07 控制方法、控制装置、显示装置以及电子设备

Country Status (2)

Country Link
US (1) US10345912B2 (zh)
WO (1) WO2012119548A1 (zh)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9507462B2 (en) * 2012-06-13 2016-11-29 Hong Kong Applied Science and Technology Research Institute Company Limited Multi-dimensional image detection apparatus
US9711090B2 (en) * 2012-08-09 2017-07-18 Panasonic Intellectual Property Corporation Of America Portable electronic device changing display brightness based on acceleration and distance
KR20140110452A (ko) * 2013-03-08 2014-09-17 삼성전자주식회사 전자장치에서 근접 터치를 이용한 사용자 인터페이스 제어 방법 및 장치
US20160239002A1 (en) * 2013-09-25 2016-08-18 Schneider Electric Buildings Llc Method and device for adjusting a set point
KR101655810B1 (ko) 2014-04-22 2016-09-22 엘지전자 주식회사 차량용 디스플레이 장치
CN104238757B (zh) * 2014-09-29 2018-01-23 联想(北京)有限公司 一种电子设备的控制方法、装置和电子设备
US9613203B2 (en) 2015-03-02 2017-04-04 Comcast Cable Communications, Llc Security mechanism for an electronic device
KR102279790B1 (ko) 2015-03-10 2021-07-19 엘지전자 주식회사 차량용 디스플레이 장치
CN106325467B (zh) * 2015-06-15 2021-10-29 中兴通讯股份有限公司 控制移动终端的方法、装置及移动终端
KR20180005070A (ko) 2016-07-05 2018-01-15 엘지전자 주식회사 이동 단말기 및 그 제어방법

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1723505A4 (en) * 2004-03-11 2007-04-18 Redsky Mobile Inc LIMITED USER INTERFACNAVIGATION
US7379562B2 (en) * 2004-03-31 2008-05-27 Microsoft Corporation Determining connectedness and offset of 3D objects relative to an interactive surface
US20080165145A1 (en) * 2007-01-07 2008-07-10 Scott Herz Portable Multifunction Device, Method, and Graphical User Interface for Interpreting a Finger Swipe Gesture
WO2009067224A1 (en) 2007-11-19 2009-05-28 Cirque Corporation Touchpad combined with a display and having proximity and touch sensing capabilities
JP4775386B2 (ja) 2008-02-18 2011-09-21 ソニー株式会社 センシング装置、表示装置、電子機器およびセンシング方法
KR101486345B1 (ko) 2008-03-21 2015-01-26 엘지전자 주식회사 이동 단말기 및 이동 단말기의 화면 표시 방법
JP2010157189A (ja) 2009-01-05 2010-07-15 Sony Corp 情報処理装置、情報処理方法およびプログラム
JP2010176330A (ja) 2009-01-28 2010-08-12 Sony Corp 情報処理装置、及び表示制御方法
JP5229084B2 (ja) 2009-04-14 2013-07-03 ソニー株式会社 表示制御装置、表示制御方法およびコンピュータプログラム
JP2011053971A (ja) * 2009-09-02 2011-03-17 Sony Corp 情報処理装置、情報処理方法およびプログラム
US8624925B2 (en) * 2009-10-16 2014-01-07 Qualcomm Incorporated Content boundary signaling techniques
JP2011150414A (ja) * 2010-01-19 2011-08-04 Sony Corp 情報処理装置、操作入力決定方法及び操作入力決定プログラム
US20120054670A1 (en) * 2010-08-27 2012-03-01 Nokia Corporation Apparatus and method for scrolling displayed information
US9164670B2 (en) * 2010-09-15 2015-10-20 Microsoft Technology Licensing, Llc Flexible touch-based scrolling

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101627361A (zh) * 2007-01-07 2010-01-13 苹果公司 用于解释触摸屏显示器上的手指姿态的便携式多功能设备、方法和图形用户界面
US20090265670A1 (en) * 2007-08-30 2009-10-22 Kim Joo Min User interface for a mobile device using a user's gesture in the proximity of an electronic device
US20100095206A1 (en) * 2008-10-13 2010-04-15 Lg Electronics Inc. Method for providing a user interface using three-dimensional gestures and an apparatus using the same

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2752740A4 (en) * 2012-10-31 2015-03-11 Huawei Device Co Ltd DRAWING CONTROL METHOD, DEVICE THEREFOR AND MOBILE DEVICE
KR101522919B1 (ko) * 2012-10-31 2015-05-22 후아웨이 디바이스 컴퍼니 리미티드 드로잉 제어 방법, 장치 및 이동 단말기

Also Published As

Publication number Publication date
US20130342491A1 (en) 2013-12-26
US10345912B2 (en) 2019-07-09

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12755599

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 14003687

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12755599

Country of ref document: EP

Kind code of ref document: A1