CN113534957B - Single-hand control method and device and mobile terminal - Google Patents

Single-hand control method and device and mobile terminal

Info

Publication number
CN113534957B
CN113534957B (application CN202110791445.6A)
Authority
CN
China
Prior art keywords
sliding
point
hand
screen
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110791445.6A
Other languages
Chinese (zh)
Other versions
CN113534957A (en)
Inventor
史亚巍
王友
Current Assignee
Nanjing Tongxin Software Technology Co ltd
Original Assignee
Nanjing Tongxin Software Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Nanjing Tongxin Software Technology Co ltd
Priority to CN202110791445.6A
Publication of CN113534957A
Application granted
Publication of CN113534957B


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a single-hand control method executed in a mobile terminal having a ranging sensor arranged under its screen. The method comprises the following steps: receiving a first one-hand sliding operation on the screen, and determining a first sliding track from the first one-hand sliding operation by means of the ranging sensor; determining whether the first sliding track meets a predetermined track condition, and displaying a touch area setting interface on the screen when it is determined that the first sliding track meets the predetermined track condition; receiving a second one-hand sliding operation on the touch area setting interface, and determining an accessible area on the screen from the second one-hand sliding operation by means of the ranging sensor; and zooming the screen display area into the accessible area to display a one-hand operation area, thereby starting the one-hand operation mode. The invention also discloses a corresponding single-hand control device and mobile terminal.

Description

Single-hand control method and device and mobile terminal
Technical Field
The present invention relates to the field of mobile terminals, and in particular, to a one-hand control method, a one-hand control device, and a mobile terminal.
Background
At present, the single-hand mode of a mobile terminal is enabled through an excessively complicated sequence of steps: for example, the user must open the settings program, enter the intelligent assistance page within it, and select the single-hand mode on that page.
In the prior art, one scheme for realizing the single-hand mode is to enter it by sliding a finger within a specific area, or by sliding the finger left and right in a navigation area. However, this scheme has the disadvantage that the entry slide or the left-right slide is easily triggered by accident, disturbing the content currently displayed to the user. In another scheme, a pressure sensor is used to decide whether to enter the single-hand mode; however, because the pressure sensor only measures the magnitude of a pressure value, it is difficult to control the starting or switching of the single-hand mode reliably.
For this reason, it is necessary to provide a one-hand control method that solves the problems of the above technical solutions.
Disclosure of Invention
To this end, the present invention provides a one-hand control method, a one-hand control device and a mobile terminal to solve or at least alleviate the above problems.
According to an aspect of the present invention, there is provided a one-hand control method performed in a mobile terminal having a ranging sensor arranged under its screen, the method comprising the steps of: receiving a first one-hand sliding operation on the screen, and determining a first sliding track from the first one-hand sliding operation by means of the ranging sensor; determining whether the first sliding track meets a predetermined track condition, and displaying a touch area setting interface on the screen when it is determined that the first sliding track meets the predetermined track condition; receiving a second one-hand sliding operation on the touch area setting interface on the screen, and determining an accessible area on the screen from the second one-hand sliding operation by means of the ranging sensor; and zooming the screen display area into the accessible area to display a one-hand operation area, so as to start the one-hand operation mode.
Optionally, in the one-hand control method according to the present invention, determining the sliding track according to the first one-hand sliding operation comprises: calculating a first sliding distance and a first sliding track according to the first one-hand sliding operation. The step of displaying a touch area setting interface on the screen when the first sliding track meets the predetermined track condition then comprises: displaying the touch area setting interface on the screen when it is determined that the first sliding distance meets a predetermined distance condition and the first sliding track meets the predetermined track condition.
Optionally, in the one-hand control method according to the present invention, the step of determining the accessible area on the screen according to the second one-hand sliding operation comprises: determining a second sliding track according to the second one-hand sliding operation, and determining the accessible area on the screen according to the second sliding track.
Optionally, in the one-hand control method according to the present invention, the step of zooming the screen display area into the accessible area to display the one-hand operation area comprises: calculating, based on the accessible area, the scaling ratio and starting position of the screen display area for which the area of the one-hand operation area is maximized; and scaling the screen display area according to the scaling ratio and the starting position so as to display the scaled one-hand operation area.
Optionally, the one-hand control method according to the present invention further comprises the steps of: receiving a third one-hand sliding operation on the screen, and determining a third sliding track from the third one-hand sliding operation by means of the ranging sensor; and determining whether the third sliding track meets an exit track condition, and redisplaying the screen display area before zooming when the third sliding track meets the exit track condition, so as to exit the single-hand operation mode.
Optionally, in the one-hand control method according to the present invention, determining the sliding track according to the third one-hand sliding operation comprises: calculating a third sliding distance and a third sliding track according to the third one-hand sliding operation. The step of redisplaying the screen display area before zooming when the third sliding track meets the exit track condition then comprises: redisplaying the screen display area before zooming when it is determined that the third sliding distance meets an exit distance condition and the third sliding track meets the exit track condition.
Optionally, in the one-hand control method according to the present invention, the distance sensor is adapted to: transmitting a laser beam to a finger area sliding on a screen, and receiving the laser beam reflected by the finger area; the distance of the finger from the screen is calculated from the reflected laser beam to determine the sliding distance and the sliding track of the finger sliding on the screen.
Optionally, in the one-hand control method according to the present invention, the ranging sensor is a TOF (time-of-flight) sensor.
According to one aspect of the present invention, there is provided a one-hand control device residing in a mobile terminal, comprising: a receiving unit adapted to receive a first one-hand sliding operation on the screen, determining a first sliding track by a ranging sensor according to the first one-hand sliding operation, and adapted to receive a second one-hand sliding operation on a touch area setting interface on the screen, determining, by the ranging sensor, an accessible area on the screen according to the second one-hand sliding operation; and a processing unit adapted to determine whether the first sliding track meets a predetermined track condition and to display the touch area setting interface on the screen when it is determined that the first sliding track meets the predetermined track condition, and adapted to zoom the screen display area into said accessible area to display a one-hand operation area, so as to start the one-hand operation mode.
According to an aspect of the present invention, there is provided a mobile terminal including: at least one processor; and a memory storing program instructions, wherein the program instructions are configured to be adapted to be executed by the at least one processor, the program instructions comprising instructions for performing the method as described above.
According to an aspect of the present invention, there is provided a readable storage medium storing program instructions which, when read and executed by a mobile terminal, cause the mobile terminal to perform the method as described above.
According to the technical scheme of the invention, a single-hand control method and device are provided in which user operations are sensed by a ranging sensor. The user can trigger the single-hand operation mode simply by sliding one hand across the screen, and can set the single-hand accessible area by a further one-hand sliding operation. The screen display area is then scaled into the accessible area according to the sliding track of the user's finger and displayed at the maximum possible size, so that the scaled single-hand operation area matches the range the user can reach with one hand and is convenient to operate. Further, while in the single-hand operation mode, the user can exit it by another one-hand slide. The triggering operation is therefore simple, sparing the user complicated steps and avoiding misoperation; and because the single-hand operation area is derived from the user's own sliding track, it is better suited to single-hand use and provides a more convenient and comfortable operating experience.
The foregoing is only an overview of the technical solution of the present invention. In order that the technical means of the present invention may be understood more clearly and implemented according to the content of this specification, and in order to make the above and other objects, features and advantages of the present invention more apparent, specific embodiments of the invention are set forth below.
Drawings
To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings, which set forth the various ways in which the principles disclosed herein may be practiced, and all aspects and equivalents thereof are intended to fall within the scope of the claimed subject matter. The above, as well as additional objects, features, and advantages of the present disclosure will become more apparent from the following detailed description when read in conjunction with the accompanying drawings. Like reference numerals generally refer to like parts or elements throughout the present disclosure.
Fig. 1 shows a schematic diagram of a mobile terminal 100 according to one embodiment of the invention;
FIG. 2 illustrates a flow chart of a single-hand control method 200 according to one embodiment of the invention;
FIG. 3 shows a schematic view of a touch area setup interface according to one embodiment of the invention;
FIG. 4 illustrates a schematic diagram of a zoom screen display area according to one embodiment of the invention; and
Fig. 5 shows a schematic diagram of a single-handed control device 500 according to an embodiment of the invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Fig. 1 illustrates a block diagram of a mobile terminal 100 according to one embodiment of the present invention. The mobile terminal 100 may be a mobile phone, a tablet computer, a notebook computer, a multimedia player, a wearable device, etc. configured with a front camera and a display screen, but is not limited thereto. As shown in fig. 1, mobile terminal 100 may include a memory interface 102, a multi-core processor 104, and a peripheral interface 106.
The memory interface 102, the multi-core processor 104, and/or the peripheral interface 106 may be discrete components or may be integrated in one or more integrated circuits. In the mobile terminal 100, various elements may be coupled by one or more communication buses or signal lines. Sensors, devices, and subsystems may be coupled to peripheral interface 106 to facilitate various functions.
For example, the acceleration sensor 110, the magnetic field sensor 112 and the gravity sensor 114 may be coupled to the peripheral interface 106. The acceleration sensor 110 may collect acceleration data along the three coordinate axes of the body coordinate system, the magnetic field sensor 112 may collect magnetic field data (magnetic induction) along those axes, and the gravity sensor 114 may collect gravity data along them; together these sensors may facilitate functions such as step counting, orientation, and automatic switching between portrait and landscape display. Other sensors 116 may also be coupled to the peripheral interface 106, such as a positioning system (e.g., a GPS receiver), a temperature sensor, a biometric sensor, or other sensing device, to facilitate related functions.
The camera subsystem 120 and optical sensor 122, which may be, for example, a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) optical sensor, may be used to facilitate camera functions such as taking photographs and recording video clips. Communication functions may be facilitated by one or more wireless communication subsystems 124, which may include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. The particular design and implementation of the wireless communication subsystem 124 may depend on the one or more communication networks supported by the mobile terminal 100. For example, the mobile terminal 100 may include a wireless communication subsystem 124 designed to support LTE, 3G or GSM networks, GPRS networks, EDGE networks, Wi-Fi or WiMax networks, and Bluetooth™ networks.
The audio subsystem 126 may be coupled with a speaker 128 and a microphone 130 to facilitate voice-enabled functions such as voice recognition, voice reproduction, digital recording, and telephony. The I/O subsystem 140 may include a touch screen controller 142 and/or one or more other input controllers 144. The touch screen controller 142 may be coupled to a touch screen 146. The touch screen 146 and touch screen controller 142 can detect contact, movement, or hovering using any of a variety of touch sensing technologies including, but not limited to, capacitive, resistive, infrared, and surface acoustic wave technologies. One or more other input controllers 144 may be coupled to other input/control devices 148, such as one or more buttons, rocker switches, thumbwheels, infrared ports, USB ports, and/or pointing devices such as a stylus. The one or more buttons (not shown) may include up/down buttons for controlling the volume of the speaker 128 and/or the microphone 130.
The memory interface 102 may be coupled with a memory 150. The memory 150 may include internal memory, such as static random access memory (SRAM) or non-volatile memory (NVRAM), and external memory such as a hard disk, a removable hard disk, or a USB flash drive, but is not limited thereto. The memory 150 may store program instructions, which may include, for example, an operating system 152 and applications 154. The operating system 152 may be, for example, Android, iOS, or Windows Phone, and includes program instructions for handling basic system services and performing hardware-dependent tasks. The memory 150 may also store applications 154 containing program instructions for implementing various user-desired functions. The applications 154 may be provided separately from the operating system or may be bundled with it. In addition, when an application 154 is installed in the mobile terminal 100, a driver module may be added to the operating system. As the mobile device runs, the operating system 152 is loaded from the memory 150 and executed by the processor 104; an application 154, when run, is likewise loaded from the memory 150 and executed by the processor 104. Applications run on top of the operating system and use the interfaces provided by the operating system and the underlying hardware to implement various user-desired functions, such as hardware management, instant messaging, and web browsing.
In one embodiment, the program instructions stored in the memory 150 include a plurality of program instructions adapted to execute the one-hand control method 200 of the present invention, and the program instructions may be executed by a processor, so that the one-hand control method 200 of the present invention may be executed in the mobile terminal 100.
In one embodiment, the application 154 includes a single-hand control device 500, and the single-hand control device 500 includes a plurality of program instructions adapted to execute the single-hand control method 200 of the present invention, and the program instructions may be executed by a processor, so that the single-hand control method 200 of the present invention may be executed in the single-hand control device 500 of the mobile terminal 100.
In one embodiment, the other sensors 116 in the mobile terminal 100 include a distance sensor, which may be disposed below the screen of the mobile terminal 100.
Fig. 2 shows a flow chart of a one-hand control method 200 according to one embodiment of the invention. The method 200 is adapted to be performed in a mobile terminal, such as the aforementioned mobile terminal 100. Wherein the screen of the mobile terminal comprises a screen display area.
As shown in fig. 2, the method 200 includes steps S210 to S240.
As shown in fig. 2, in step S210, a first one-hand sliding operation of a user on a screen is received, and a first sliding track is determined according to the first one-hand sliding operation by a ranging sensor. It should be noted that, when step S210 is performed, the mobile terminal 100 is in the normal operation mode in which the one-handed operation mode is not turned on.
With the ranging sensor of the present invention, when the mobile terminal 100 receives a sliding operation of the user's finger on the screen, the sensor emits a laser beam toward the finger area sliding on the screen, receives the laser beam reflected by the finger area, and calculates the distance of the finger from the screen from the reflected beam (using the time difference between emitting the beam and receiving its reflection, combined with the speed of light). From this, the sliding distance and the sliding trajectory of the finger on the screen can be calculated. In one implementation, the ranging sensor is, for example, a TOF (time-of-flight) sensor.
In one implementation, the user slides a single finger (e.g., the thumb) across the screen. When the finger performs the first one-hand sliding operation, the ranging sensor arranged below the screen emits a plurality of laser beams (for example, infrared laser beams) toward the finger area sliding on the screen, and the finger area reflects the beams back to the sensor. From the reflected beams and the speed of light, the sensor calculates the real-time distance of the sliding finger from the screen, from which the distance of the slide (the first sliding distance) and its trajectory (the first sliding track) can be calculated. That is, the first sliding distance and the first sliding track are determined from the first one-hand sliding operation by means of the ranging sensor.
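The time-of-flight distance calculation just described can be sketched in code. This is an illustrative sketch only: the function and variable names are hypothetical, and a real TOF sensor exposes its timing data through a hardware-specific driver interface rather than raw timestamps.

```python
# Sketch of the TOF distance calculation: the beam travels out and back,
# so the one-way distance is half the round-trip time times the speed of light.
# All names here are hypothetical illustrations, not a real driver API.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(emit_time_s: float, receive_time_s: float) -> float:
    """One-way distance from sensor to finger, in meters."""
    round_trip = receive_time_s - emit_time_s
    return round_trip * SPEED_OF_LIGHT / 2.0

def sliding_distance(points: list) -> float:
    """Total path length of a sliding trajectory given as (x, y) samples."""
    return sum(
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1), (x2, y2) in zip(points, points[1:])
    )
```

The trajectory itself would be the ordered list of sampled finger positions, from which `sliding_distance` accumulates the path length used for the distance condition below.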
Subsequently, in step S220, it is determined whether the first sliding track meets the predetermined track condition, and when it is determined that the first sliding track meets the predetermined track condition, a touch area setting interface is displayed on the screen. Here, based on the touch area setting interface, the user may be prompted to set the touch area at the time of one-hand operation by one-hand sliding operation.
It should be noted that the predetermined track condition is a preset track condition that meets the requirement of starting the single-hand operation mode, and the present invention is not limited specifically herein.
FIG. 3 illustrates a schematic view of the touch area setting interface according to one embodiment of the invention. In one implementation, as shown in fig. 3, the predetermined trajectory is, for example, an arc-shaped track slid on the screen by the left or right thumb; the admissible length interval and position interval of the arc can be set by a person skilled in the art according to the practical situation of one-hand operation, and the invention does not specifically limit them. As described above, in step S210 the first sliding distance and the first sliding track are calculated from the first one-hand sliding operation by the ranging sensor. According to one embodiment, when determining in step S220 whether the first sliding track meets the predetermined track condition, it is also determined whether the first sliding distance meets a predetermined distance condition. When the first sliding distance meets the predetermined distance condition and the first sliding track meets the predetermined track condition, the condition for starting the single-hand operation mode is satisfied, and the touch area setting interface is displayed on the screen. In other words, only when both the sliding distance and the sliding track conform to the preset single-hand mode conditions is the touch area setting interface displayed, prompting the user to perform the second one-hand sliding operation so as to determine the accessible area for one-hand operation.
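The two-part check in step S220 can be sketched as follows. The thresholds and the arc-shape test below are illustrative assumptions for demonstration only; the patent deliberately leaves the concrete predetermined distance and track conditions open.

```python
# Hypothetical sketch of the step S220 entry check: the touch area setting
# interface is shown only when BOTH the sliding distance and the trajectory
# shape meet preset conditions. Thresholds are illustrative assumptions.

def meets_distance_condition(distance_px: float,
                             min_px: float = 200.0,
                             max_px: float = 900.0) -> bool:
    return min_px <= distance_px <= max_px

def meets_track_condition(points: list, tolerance: float = 0.15) -> bool:
    """Crude arc test: every sample lies at roughly the same distance
    from the midpoint of the chord between first and last samples."""
    if len(points) < 3:
        return False
    (x0, y0), (xn, yn) = points[0], points[-1]
    cx, cy = (x0 + xn) / 2.0, (y0 + yn) / 2.0
    radii = [((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 for x, y in points]
    mean_r = sum(radii) / len(radii)
    if mean_r == 0:
        return False
    return all(abs(r - mean_r) / mean_r <= tolerance for r in radii)

def should_show_setup_interface(distance_px: float, points: list) -> bool:
    return meets_distance_condition(distance_px) and meets_track_condition(points)
```

A semicircular thumb swipe passes the arc test, while a straight horizontal drag fails it, so an ordinary scroll gesture does not trigger the mode by accident.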
Subsequently, in step S230, the user's second one-hand sliding operation on the touch area setting interface is received, and the accessible area on the screen is determined from the second one-hand sliding operation by means of the ranging sensor. Here, the accessible area is the range that the finger can reach on the screen when the user operates with one hand.
Specifically, the ranging sensor may determine a locus of a finger sliding on the screen according to the second one-hand sliding operation (second sliding locus), and further, determine an accessible area on the screen when the user operates with one hand according to the second sliding locus.
In one implementation, when the user's finger performs the second one-hand sliding operation on the screen, the ranging sensor arranged under the screen emits a plurality of laser beams toward the sliding finger area, which reflects them back to the sensor. From the reflected beams and the speed of light, the sensor calculates the real-time distance of the sliding finger from the screen, and from this the distance of the slide (the second sliding distance) and its trajectory (the second sliding track) can be calculated. That is, the second sliding distance and the second sliding track are determined from the second one-hand sliding operation by means of the ranging sensor, and the accessible area of the finger on the screen during one-hand operation can then be determined from the second sliding track.
It can be seen that, according to the scheme of the invention, the operation of the user is sensed based on the ranging sensor, and the user can trigger the single-hand operation mode by only sliding the screen with one hand. Also, the user can set the one-hand accessible region by a one-hand sliding operation.
Finally, in step S240, the screen display area is zoomed into the accessible area to display the one-hand operation area, thereby starting the one-hand operation mode. Here, the one-hand operation area is the area presented after zooming the screen display area.
According to one embodiment, the scaling ratio of the screen display area and the starting position of the scaling (i.e., the starting position of the one-hand operation area formed after scaling) are calculated based on the accessible area such that the area of the one-hand operation area is maximized. The screen display area is then scaled in equal proportion according to the scaling ratio and the starting position, so as to display the scaled one-hand operation area. That is, the scaling determined here ensures that the area of the one-hand operation region is maximized while the region remains within the accessible area. It should be noted that the invention does not limit the specific algorithm for calculating the scaling ratio and the starting position.
It should be noted that the one-handed operation region is formed by scaling the screen display region equally. The screen display area refers to an operation area in a full-screen display state in the normal operation mode, that is, the screen display area has an original scaling ratio of 1.
Fig. 4 shows a schematic diagram of a zoom screen display area according to an embodiment of the present invention.
A coordinate system is established with the straight line along the lower edge of the screen display area as the X axis and the straight line along its left edge as the Y axis.
According to one embodiment, as shown in fig. 4, two points on the second sliding track that lie on a common vertical line are acquired, and the scaling ratio is determined as the ratio of the vertical distance between those two points to the height of the screen display area. The starting position coordinates can then be calculated from the scaling ratio, so that a closed rectangle (an inscribed rectangle lying within the accessible area) is formed from the two points and the starting position point. This inscribed rectangle serves as the display area of the scaled one-hand operation area: the screen display area is scaled according to the scaling ratio and the starting position and displayed inside the inscribed rectangle of the accessible area, thereby forming the one-hand operation area.
According to one embodiment, as shown in fig. 4 (a schematic diagram based on left-handed operation), assume that the height of the screen display area is H, its width is L, and the origin is O. Based on the finger sliding track (i.e. the second sliding track), the two vertically aligned points on the track with the largest vertical distance are first found, e.g. point A and point B shown in the figure, where point A is the intersection of the finger sliding track with the X axis, the coordinates of point A are (a, 0), the coordinates of point B are (a, h), and the vertical distance between points A and B is h. Assuming that the zoomed longitudinal border (right border) is formed from points A and B on the second sliding track, the corresponding scaling ratio k can be determined from the coordinates of points A and B as the ratio of the vertical distance h between them to the height H of the screen display area, i.e. k = h/H. Based on the scaling ratio k and the coordinates of points A and B, the coordinates of the starting position point M can then be determined as M(a-kL, 0).
If the determined starting position point M lies within the screen display area, i.e. a-kL ≥ 0, an inscribed rectangle located within the touchable area of the screen display area can be formed from point A, point B and point M, and this inscribed rectangle is taken as the display area of the zoomed one-handed operation area. By scaling the screen display area according to the determined scaling ratio k and starting position point M, the zoomed one-handed operation area can be displayed within this rectangle. In this way, the one-handed operation area formed within the touchable area has the maximum area, and the maximum scaling ratio relative to the screen display area is achieved.
If the determined starting position M lies outside the screen display area, i.e. a-kL < 0, the segment AB is gradually translated rightward so that it intersects the second sliding track at points A1 and B1, where the coordinates of point A1 are denoted (a1, s1), the coordinates of point B1 are denoted (a1, s1+h1), and the vertical distance between the two points is h1. Assuming that points A1 and B1 on the second sliding track form the zoomed longitudinal border, the corresponding scaling ratio can be determined from the coordinates of points A1 and B1 as k1 = h1/H, and the coordinates of the corresponding starting position M1 can be determined from the scaling ratio k1 and the coordinates of points A1 and B1 as M1(a1-k1L, s1). The translation continues until point M1 has the same abscissa as the origin O (i.e. point M1 lies on the Y axis along the left border of the screen display area), that is, until a1-k1L = 0; an inscribed rectangle within the touchable area of the screen display area can then be formed from points A1, B1 and M1.
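The computation described above can be illustrated with a short Python sketch. This is the editor's illustration rather than code from the patent; the function names and the representation of the sliding track as pre-sampled vertically aligned point pairs are assumptions.

```python
def scale_params(a, h, H, L):
    """For track points A = (a, 0) and B = (a, h): scaling ratio k = h / H,
    starting position M = (a - k*L, 0), as described in the embodiment."""
    k = h / H
    return k, (a - k * L, 0.0)


def fit_one_hand_region(track_pairs, H, L):
    """track_pairs: vertically aligned point pairs (a_i, s_i, h_i) sampled from
    the second sliding track, ordered from the leftmost (largest h) rightward.
    Returns (k, M, A, B) for the first pair whose starting position stays
    on-screen, mirroring the rightward translation of AB when a - k*L < 0."""
    for a_i, s_i, h_i in track_pairs:
        k = h_i / H
        x0 = a_i - k * L
        if x0 >= 0:  # starting position M lies inside the screen display area
            return k, (x0, float(s_i)), (a_i, s_i), (a_i, s_i + h_i)
    raise ValueError("no pair on the track yields an on-screen starting position")
```

For example, on a hypothetical 1080 x 2400 display, a pair with a = 800 and h = 1200 gives k = 0.5 and M = (260, 0), so the one-handed operation area occupies a half-height rectangle anchored toward the lower-right corner.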
It should be understood that the invention zooms the screen display area into the touchable area and displays it at maximum size according to the sliding track of the user's finger, so that the zoomed one-handed operation area matches the one-handed touchable range of the user, making one-handed operation more convenient and comfortable.
According to one embodiment, when the user wishes to exit the one-handed operation mode and resume the normal operation mode, a third one-hand sliding operation may be performed on the screen to request exit from the one-handed operation mode. The mobile terminal 100 may receive the third one-hand sliding operation on the screen and determine a third sliding track from it through the ranging sensor. It is then determined whether the third sliding track meets the exit track condition; when it is determined that the third sliding track meets the exit track condition, the screen display area before zooming (i.e. the full-screen display state with a scaling ratio of 1) is redisplayed, so that the one-handed operation mode is exited.
In one implementation, when a third one-hand sliding operation of the user's finger on the screen is received, a third sliding distance and a third sliding track may be calculated from the third one-hand sliding operation through the ranging sensor. The specific method by which the ranging sensor calculates the sliding distance and sliding track from the third one-hand sliding operation is similar to the foregoing description and is not repeated here.
It is then determined whether the third sliding distance meets the exit distance condition and whether the third sliding track meets the exit track condition. When both conditions are met, the screen display area before zooming is redisplayed; the one-handed operation mode is thus exited and the normal operation mode is restored.
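Since the patent leaves the exit distance and exit track conditions unspecified, the following Python sketch stands in for them with a hypothetical minimum distance threshold and a caller-supplied track predicate; both names are the editor's assumptions.

```python
def should_exit_one_hand_mode(sliding_distance, sliding_track,
                              exit_min_distance, exit_track_predicate):
    """Both the exit distance condition and the exit track condition must be
    met before the pre-zoom screen display area is redisplayed."""
    return (sliding_distance >= exit_min_distance
            and exit_track_predicate(sliding_track))
```

A caller might, for instance, supply a predicate that accepts only a downward swipe as the exit gesture; the terminal redisplays the full-screen area only when that predicate and the distance threshold are both satisfied.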
The exit track condition and the exit distance condition are preset conditions for exiting the one-handed operation mode; the present invention does not specifically limit them.
Fig. 5 shows a schematic diagram of a single-handed control device 500 according to an embodiment of the invention. The one-handed control device 500 resides within the mobile terminal 100.
As shown in fig. 5, the single-handed control device 500 includes a receiving unit 510 and a processing unit 520 connected to each other.
The receiving unit 510 may receive a first one-hand sliding operation on the screen, determine a first sliding track from the first one-hand sliding operation through the ranging sensor, and transmit it to the processing unit 520. The processing unit 520 determines whether the first sliding track meets the predetermined track condition, and displays a touch area setting interface on the screen when it is determined that the first sliding track meets the predetermined track condition.
The receiving unit 510 may also receive a second one-hand sliding operation on the touch area setting interface on the screen, determine the touchable area on the screen from the second one-hand sliding operation through the ranging sensor, and transmit it to the processing unit 520. The processing unit 520 then zooms the screen display area into the touchable area to display a one-handed operation area, so as to initiate the one-handed operation mode.
It should be noted that the receiving unit 510 is configured to perform steps S210 and S230 in the method 200 described above, and the processing unit 520 is configured to perform steps S220 and S240 in the method 200 described above. The specific execution logic of the receiving unit 510 and the processing unit 520 refers to the foregoing specific description of steps S210 to S240, and will not be repeated here.
According to the one-hand control method and device of the present invention, user operations are sensed by the ranging sensor: the user can trigger the one-handed operation mode simply by sliding one hand on the screen, and can set the one-handed touchable area through a one-hand sliding operation. The screen display area is zoomed into the touchable area and displayed at maximum size according to the sliding track of the user's finger, so that the zoomed one-handed operation area matches the one-handed touchable range of the user and one-handed operation is facilitated. Further, in the one-handed operation mode, the user can exit the mode with a one-hand slide. The operation for triggering the one-handed operation mode is therefore simpler, allowing the user to enter the mode conveniently, saving complicated operation steps and avoiding misoperation. Moreover, a one-handed operation area derived from the user's own one-hand sliding track is better suited to one-handed use, providing a more convenient and comfortable operating experience.
A8, the method of A7, wherein determining a sliding track according to the third one-hand sliding operation comprises: calculating a third sliding distance and a third sliding track according to the third one-hand sliding operation; and wherein the step of redisplaying the screen display area before zooming when the third sliding track meets the exit track condition comprises: when it is determined that the third sliding distance meets the exit distance condition and the third sliding track meets the exit track condition, redisplaying the screen display area before zooming.
A9, the method of any of A1-A8, wherein the ranging sensor is adapted to: transmit a laser beam toward the finger area sliding on the screen and receive the laser beam reflected by the finger area; and calculate the distance of the finger from the screen from the reflected laser beam, so as to determine the sliding distance and the sliding track of the finger on the screen.
A10, the method of any of A1-A9, wherein the ranging sensor is a TOF sensor.
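The time-of-flight principle referenced in A9 and A10 amounts to halving the round-trip travel time of the laser beam. A minimal sketch follows; the numbers used below are illustrative and not taken from the patent.

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds):
    """One-way distance to the reflecting finger: the beam travels out and
    back, so d = c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0
```

A round trip of 2 nanoseconds, for instance, corresponds to a finger roughly 0.3 metres from the sensor; the short distances involved in under-screen sensing are why TOF sensors resolve picosecond-scale timing.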
The various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as removable hard drives, USB flash drives, floppy diskettes, CD-ROMs, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
In the case of program code execution on programmable computers, the mobile terminal will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Wherein the memory is configured to store program code; the processor is configured to execute the one-hand control method of the present invention in accordance with instructions in said program code stored in the memory.
By way of example, and not limitation, readable media comprise readable storage media and communication media. The readable storage medium stores information such as computer readable instructions, data structures, program modules, or other data. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. Combinations of any of the above are also included within the scope of readable media.
In the description provided herein, algorithms and displays are not inherently related to any particular computer, virtual system, or other apparatus. Various general-purpose systems may also be used with examples of the invention. The required structure for such a system is apparent from the description above. In addition, the present invention is not directed to any particular programming language. It will be appreciated that the teachings of the present invention described herein may be implemented in a variety of programming languages, and the above description of specific languages is provided to disclose the enablement and best mode of the present invention.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be construed as reflecting the intention that: i.e., the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment, or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into a plurality of sub-modules.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments can be used in any combination.
Furthermore, some of the embodiments are described herein as methods or combinations of method elements that may be implemented by a processor of a computer system or by other means of performing the functions. Thus, a processor with the necessary instructions for implementing the described method or method element forms a means for implementing the method or method element. Furthermore, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is for carrying out the functions performed by the elements for carrying out the objects of the invention.
As used herein, unless otherwise specified, the use of the ordinal terms "first," "second," "third," etc., to describe a general object merely denotes different instances of like objects, and is not intended to imply that the objects so described must have a given order, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of the above description, will appreciate that other embodiments are contemplated within the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The disclosure of the present invention is intended to be illustrative, but not limiting, of the scope of the invention, which is defined by the appended claims.

Claims (11)

1. A one-hand control method performed at a mobile terminal, under a screen of which a ranging sensor is arranged, the method comprising the steps of:
receiving a first one-hand sliding operation on the screen, and determining a first sliding track according to the first one-hand sliding operation through a ranging sensor;
judging whether the first sliding track meets a predetermined track condition, and displaying a touch area setting interface on the screen when it is determined that the first sliding track meets the predetermined track condition;
receiving a second one-hand sliding operation on the touch area setting interface on the screen, determining a second sliding track according to the second one-hand sliding operation through the ranging sensor, and determining a touchable area on the screen according to the second sliding track; and
scaling the screen display area into the touchable area to display a one-handed operation area, so as to initiate a one-handed operation mode, comprising:
finding, on the second sliding track, the two vertically aligned points with the largest vertical distance, namely point A and point B, wherein the coordinates of point A are expressed as (a, 0) and the coordinates of point B are expressed as (a, h);
determining a corresponding scaling ratio k according to the coordinates of point A and point B, wherein k = h/H and H represents the height of the screen display area;
determining the coordinates of a starting position point M as (a-kL, 0) according to the scaling ratio k, the coordinates of point A and the coordinates of point B; and
if a-kL ≥ 0, forming an inscribed rectangle located within the touchable area based on point A, point B and point M, and taking the inscribed rectangle as the display area of the zoomed one-handed operation area, wherein L represents the width of the screen display area.
2. The method of claim 1, wherein determining a sliding trajectory from the first one-handed sliding operation comprises: calculating a first sliding distance and a first sliding track according to a first single-hand sliding operation;
When the first sliding track meets the preset track condition, the step of displaying a touch area setting interface on the screen comprises the following steps:
when it is determined that the first sliding distance meets the predetermined distance condition and the first sliding track meets the predetermined track condition, a touch area setting interface is displayed on the screen.
3. The method of claim 1 or 2, wherein the step of zooming the screen display area into the touchable area to display a one-handed operation area comprises:
Calculating a scaling and a starting position of a corresponding screen display area when the area of the single-hand operation area is maximized based on the touchable area;
and scaling the screen display area according to the scaling scale and the initial position so as to display the scaled one-hand operation area.
4. The method of claim 1 or 2, further comprising the step of:
if a-kL < 0, translating AB rightward to intersect the second sliding track at points A1 and B1, wherein the coordinates of point A1 are denoted (a1, s1) and the coordinates of point B1 are denoted (a1, s1+h1);
determining a corresponding scaling ratio k1 according to the coordinates of point A1 and point B1, wherein k1 = h1/H;
determining the coordinates of a starting position point M1 according to the scaling ratio k1, the coordinates of point A1 and the coordinates of point B1, wherein the coordinates of point M1 are M1(a1-k1L, s1); and
when a1-k1L = 0, forming an inscribed rectangle within the touchable area based on points A1, B1 and M1, and taking the inscribed rectangle as the display area of the zoomed one-handed operation area.
5. The method of claim 1 or 2, further comprising the step of:
receiving a third one-hand sliding operation on the screen, and determining a third sliding track according to the third one-hand sliding operation through a ranging sensor;
and judging whether the third sliding track meets an exit track condition, and redisplaying the screen display area before zooming when it is determined that the third sliding track meets the exit track condition, so as to exit the one-handed operation mode.
6. The method of claim 5, wherein,
Determining a sliding track according to the third one-handed sliding operation includes: calculating a third sliding distance and a third sliding track according to a third single-hand sliding operation;
wherein the step of redisplaying the screen display area before zooming when the third sliding track meets the exit track condition comprises:
When it is determined that the third sliding distance meets the exit distance condition and the third sliding track meets the exit track condition, the screen display area before zooming is redisplayed.
7. The method of claim 1 or 2, wherein the ranging sensor is adapted to:
transmitting a laser beam to a finger area sliding on a screen, and receiving the laser beam reflected by the finger area;
the distance of the finger from the screen is calculated from the reflected laser beam to determine the sliding distance and the sliding track of the finger sliding on the screen.
8. The method of claim 1 or 2, wherein the ranging sensor is a TOF sensor.
9. A one-hand control device residing within a mobile terminal having a ranging sensor disposed under a screen of the mobile terminal, the one-hand control device comprising:
a receiving unit adapted to receive a first one-hand sliding operation on the screen and determine, through the ranging sensor, a first sliding track according to the first one-hand sliding operation; and further adapted to receive a second one-hand sliding operation on the touch area setting interface on the screen, determine, through the ranging sensor, a second sliding track according to the second one-hand sliding operation, and determine, according to the second sliding track, a touchable area on the screen; and
a processing unit adapted to judge whether the first sliding track meets a predetermined track condition, and to display a touch area setting interface on the screen when it is determined that the first sliding track meets the predetermined track condition; and further adapted to zoom the screen display area into the touchable area to display a one-handed operation area so as to initiate a one-handed operation mode, comprising: finding, on the second sliding track, the two vertically aligned points with the largest vertical distance, namely point A and point B, wherein the coordinates of point A are expressed as (a, 0) and the coordinates of point B are expressed as (a, h); determining a corresponding scaling ratio k according to the coordinates of point A and point B, wherein k = h/H and H represents the height of the screen display area; determining the coordinates of a starting position point M as (a-kL, 0) according to the scaling ratio k, the coordinates of point A and the coordinates of point B; and if a-kL ≥ 0, forming an inscribed rectangle located within the touchable area based on point A, point B and point M, and taking the inscribed rectangle as the display area of the zoomed one-handed operation area, wherein L represents the width of the screen display area.
10. A mobile terminal, comprising:
at least one processor; and
A memory storing program instructions, wherein the program instructions are configured to be adapted to be executed by the at least one processor, the program instructions comprising instructions for performing the method of any of claims 1-8.
11. A readable storage medium storing program instructions which, when read and executed by a mobile terminal, cause the mobile terminal to perform the method of any of claims 1-8.
CN202110791445.6A 2021-07-13 2021-07-13 Single-hand control method and device and mobile terminal Active CN113534957B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110791445.6A CN113534957B (en) 2021-07-13 2021-07-13 Single-hand control method and device and mobile terminal


Publications (2)

Publication Number Publication Date
CN113534957A CN113534957A (en) 2021-10-22
CN113534957B true CN113534957B (en) 2024-05-14

Family

ID=78098919

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110791445.6A Active CN113534957B (en) 2021-07-13 2021-07-13 Single-hand control method and device and mobile terminal

Country Status (1)

Country Link
CN (1) CN113534957B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115291784A (en) * 2022-07-29 2022-11-04 维沃移动通信有限公司 Function control method, device, equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103999027A (en) * 2012-06-18 2014-08-20 宇龙计算机通信科技(深圳)有限公司 Terminal and interface operation management method
CN105260115A (en) * 2015-09-23 2016-01-20 小米科技有限责任公司 Method and device for realizing single-hand mode, and intelligent terminal
CN107613094A (en) * 2017-08-17 2018-01-19 珠海格力电器股份有限公司 A kind of method and mobile terminal of one-handed performance mobile terminal
CN111078086A (en) * 2019-11-22 2020-04-28 深圳传音控股股份有限公司 Mobile terminal, one-handed mode control method, and computer-readable storage medium
CN111638810A (en) * 2020-05-26 2020-09-08 维沃移动通信有限公司 Touch method and device and electronic equipment
CN112764606A (en) * 2020-12-31 2021-05-07 维沃移动通信有限公司 Identification display method and device and electronic equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019056393A1 (en) * 2017-09-25 2019-03-28 华为技术有限公司 Terminal interface display method and terminal



Similar Documents

Publication Publication Date Title
US10642476B2 (en) Method and apparatus for single-hand operation on full screen
US9830067B1 (en) Control of display of content with dragging inputs on a touch input surface
US10306044B2 (en) Method and apparatus for preventing screen off during automatic response system service in electronic device
EP3049908B1 (en) Presentation of a control interface on a touch-enabled device based on a motion or absence thereof
US20150268789A1 (en) Method for preventing accidentally triggering edge swipe gesture and gesture triggering
US9223406B2 (en) Screen display control method of electronic device and apparatus therefor
US20110022307A1 (en) Method for operating navigation frame, navigation apparatus and recording medium
CN107239166B (en) Method for adjusting interface display of mobile terminal and mobile terminal
US11115517B2 (en) Method and apparatus for preventing screen off during automatic response system service in electronic device
EP2815292A1 (en) Engagement-dependent gesture recognition
US20150186004A1 (en) Multimode gesture processing
CN107153546B (en) Video playing method and mobile device
KR101763270B1 (en) Method, apparatus, program and computer-readable recording medium for determining character
CN109558000B (en) Man-machine interaction method and electronic equipment
CN107145294B (en) Method and device for realizing single-hand mode of mobile terminal
US20150177902A1 (en) Method and System for Turning off an Alarm of a Mobile Terminal
CN113534957B (en) Single-hand control method and device and mobile terminal
US11354031B2 (en) Electronic apparatus, computer-readable non-transitory recording medium, and display control method for controlling a scroll speed of a display screen
CN107979701B (en) Method and device for controlling terminal display
CN113110786B (en) Page sliding control method and mobile terminal
KR20110040206A (en) Mobile terminal and method for displaying object using distance and eyes sensing
CN107861683B (en) Unmanned aerial vehicle button-free operation method and device
CN113434076B (en) Single-hand control method and device and mobile terminal
US20200033959A1 (en) Electronic apparatus, computer-readable non-transitory recording medium, and display control method
CN116594544A (en) Method for operating navigation bar by one hand, processing device and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant