CN112740166A - Interface control method and electronic terminal - Google Patents

Interface control method and electronic terminal

Info

Publication number
CN112740166A
Authority
CN
China
Prior art keywords
interactive interface
boundary
touch display
display area
trigger
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201880096005.5A
Other languages
Chinese (zh)
Inventor
王金周
付洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Royole Technologies Co Ltd
Original Assignee
Shenzhen Royole Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Royole Technologies Co Ltd
Publication of CN112740166A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An interface control method is applied to an electronic terminal (100), the electronic terminal (100) comprises a touch display area (101), an interactive interface (12) is displayed in the touch display area (101), and the interface control method comprises the following steps: detecting a trigger event at the touch display area (101); judging whether the trigger event corresponds to a position or size adjustment trigger condition of the interactive interface (12); if yes, judging whether continuous trigger points are detected in the touch display area (101) after the trigger event; if yes, adjusting the position or size of the interactive interface (12) according to the trigger event and the continuous trigger points; and displaying the adjusted interactive interface (12) in the touch display area (101).

Description

Interface control method and electronic terminal
Technical Field
The present disclosure relates to the field of interface adjustment technologies, and in particular, to an interface control method and an electronic terminal.
Background
In a conventional screen zooming and moving process, the original display picture is reduced by modifying the size and position of an underlying display area and is then shown in a designated small screen region. However, this approach prevents the picture from being freely moved and zoomed, which degrades the user experience.
Disclosure of Invention
The application provides an interface control method and an electronic terminal.
The application provides an interface control method applied to an electronic terminal, wherein the electronic terminal comprises a touch display area, an interactive interface is displayed in the touch display area, and the interface control method comprises the following steps:
detecting a trigger event at the touch display area;
judging whether the trigger event corresponds to a position or size adjustment trigger condition of the interactive interface;
if so, judging whether continuous trigger points are detected in the touch display area after the trigger event;
if so, adjusting the position or size of the interactive interface according to the trigger event and the continuous trigger points; and
displaying the adjusted interactive interface in the touch display area.
The interface control method can adjust the position and the size of the interactive interface, so that the interactive interface can be zoomed and adjusted in position as required, and the user experience is improved.
An electronic terminal of an embodiment of the present application, it includes:
the touch display area is used for displaying an interactive interface;
the touch monitoring module is used for detecting a trigger event in the touch display area;
the processing module is used for judging whether the trigger event corresponds to the position or size adjustment trigger condition of the interactive interface; if so, judging whether continuous trigger points are detected in the touch display area after the trigger event; and if continuous trigger points are detected in the touch display area after the trigger event, adjusting the position or the size of the interactive interface according to the trigger event and the continuous trigger points; and
a display module for displaying the adjusted interactive interface in the touch display area.
An electronic terminal of an embodiment of the present application includes a touch display area, a processor, and a memory, where the processor is connected to the touch display area, the touch display area is used for displaying an interactive interface, and the memory stores computer-readable instructions, and when the instructions are executed by the processor, the processor executes the interface control method of the above embodiment.
The electronic terminal of the embodiment of the application can adjust the position and the size of the interactive interface, so that the interactive interface can be zoomed and adjusted in position as required, and the user experience is improved.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a flowchart of an interface control method according to an embodiment of the present application.
Fig. 2 is a schematic diagram of a touch display area and an interactive interface of an electronic terminal according to an embodiment of the present application.
Fig. 3 is a flowchart of an interface control method according to an embodiment of the present application.
Fig. 4 is a sub-flowchart of step S161 in fig. 3.
Fig. 5 is a schematic adjustment diagram of an interactive interface of an electronic terminal according to an embodiment of the present application.
Fig. 6 is a schematic diagram of another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
Fig. 7 is another adjustment diagram of the interactive interface of the electronic terminal according to the embodiment of the present application.
Fig. 8 is a schematic diagram of another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
Fig. 9 is a sub-flowchart of step S162 in fig. 3.
Fig. 10 is a schematic diagram of another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
Fig. 11 is a schematic diagram of another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
Fig. 12 is a schematic diagram illustrating further adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
Fig. 13 is a schematic diagram of another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
Fig. 14 is a schematic diagram of another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
Fig. 15 is a schematic diagram of another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
Fig. 16 is still another flowchart of the interface control method according to the embodiment of the present application.
Fig. 17 is still another flowchart of the interface control method according to the embodiment of the present application.
Fig. 18 is a sub-flowchart of step S18 in fig. 12.
Fig. 19 is a schematic adjustment diagram of an interactive interface of an electronic terminal according to an embodiment of the present application.
Fig. 20 is a schematic diagram illustrating further adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
Fig. 21 is a schematic diagram of another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
Fig. 22 is a block diagram of an electronic terminal according to an embodiment of the present application.
Fig. 23 is a schematic block diagram of an electronic terminal according to an embodiment of the present application.
Fig. 24 is a schematic structural diagram of an electronic terminal according to an embodiment of the present application.
Description of the main element symbols:
the electronic terminal 1000, the electronic terminal 100, the touch display area 101, the input focus area 1011, the processor 1001, the memory 1002, the touch display screen 1003, the system bus 1004, the zoom trigger point 11, the interactive interface 12, the movement trigger point 13, the touch monitoring module 102, the processing module 103, the display module 104, and the focus correction module 105.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative and are only for the purpose of explaining the present application and are not to be construed as limiting the present application.
In the description of the present application, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
In the description of the present application, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted" and "connected" are to be construed broadly: a connection may be a fixed connection, a removable connection, or an integral connection; it may be mechanical or electrical, or the two elements may be in communication with each other; and it may be direct, indirect through an intermediate medium, or an internal communication between two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
In this application, unless expressly stated or limited otherwise, the first feature "on" or "under" the second feature may comprise direct contact of the first and second features, or may comprise contact of the first and second features not directly but through another feature in between. Also, the first feature being "on," "above" and "over" the second feature includes the first feature being directly on and obliquely above the second feature, or merely indicating that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature includes the first feature being directly under and obliquely below the second feature, or simply meaning that the first feature is at a lesser elevation than the second feature.
The following disclosure provides many different embodiments or examples for implementing different features of the application. In order to simplify the disclosure of the present application, specific example components and arrangements are described below. Of course, they are merely examples and are not intended to limit the present application. Moreover, the present application may repeat reference numerals and/or letters in the various examples, such repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. In addition, examples of various specific processes and materials are provided herein, but one of ordinary skill in the art may recognize applications of other processes and/or use of other materials.
Referring to fig. 1, fig. 2 and fig. 24, the interface control method according to the embodiment of the present application is applied to an electronic terminal 100, where the electronic terminal 100 includes a touch display area 101, and an interactive interface 12 is displayed in the touch display area 101, and the interface control method includes:
step S11, detecting a trigger event in the touch display area 101;
step S12, judging whether the trigger event corresponds to the position or size adjustment trigger condition of the interactive interface 12;
if yes, step S13, determine whether a continuous trigger point is detected in the touch display area 101 after the trigger event;
if yes, step S14, adjusting the position or size of the interactive interface 12 according to the trigger event and the continuous trigger point;
in step S15, the adjusted interactive interface 12 is displayed in the touch display area 101.
The interface control method can adjust the position and the size of the interactive interface 12, so that the interactive interface 12 can be zoomed and adjusted in position as required, and the user experience is improved.
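As a minimal sketch of steps S11 to S15, the flow can be modelled as follows; all type and function names are assumptions made for illustration and do not come from the application itself.

```kotlin
// Hypothetical types and names used only to illustrate steps S11–S15.
data class Point(val x: Int, val y: Int)
data class Rect(var left: Int, var top: Int, var right: Int, var bottom: Int)

enum class TriggerCondition { ADJUST_POSITION, ADJUST_SIZE }

class InterfaceController(private val display: Rect, private val ui: Rect) {

    // Step S11: the trigger event is the first trigger point detected in the touch display area.
    fun onTriggerEvent(start: Point, continuousPoints: List<Point>) {
        // Step S12: does the trigger event match a position or size adjustment trigger condition?
        val condition = classify(start) ?: return
        // Step S13: were continuous trigger points detected after the trigger event?
        val movingPoint = continuousPoints.lastOrNull() ?: return
        // Step S14: adjust the position or the size from the trigger event and the continuous points.
        when (condition) {
            TriggerCondition.ADJUST_POSITION -> moveBy(movingPoint.x - start.x, movingPoint.y - start.y)
            TriggerCondition.ADJUST_SIZE -> scaleTo(movingPoint.x.toDouble() / start.x)
        }
        // Step S15: display the adjusted interactive interface in the touch display area.
        println("display $ui inside $display")
    }

    // Placeholder: the real first-area / second-area rule is described further below.
    private fun classify(p: Point): TriggerCondition? =
        if (p.x in ui.left..ui.right && p.y in ui.top..ui.bottom) TriggerCondition.ADJUST_POSITION else null

    private fun moveBy(dx: Int, dy: Int) {
        ui.left += dx; ui.right += dx; ui.top += dy; ui.bottom += dy
    }

    private fun scaleTo(zoomFactor: Double) {
        ui.right = ui.left + ((display.right - display.left) * zoomFactor).toInt()
        ui.bottom = ui.top + ((display.bottom - display.top) * zoomFactor).toInt()
    }
}
```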
The trigger event means that a trigger point is detected in the touch display area for the first time; in other words, the trigger event is not preceded by a contiguous earlier trigger point. The "continuous trigger points" referred to above are the successive trigger points that follow the trigger point corresponding to the trigger event.
Continuous trigger points mean that, after the touch display area 101 detects that a trigger point is pressed, the user keeps pressing and drags the trigger point without lifting off in between. For example, if a user presses a trigger point, lifts off, and then presses again, the trigger points are discrete rather than continuous.
In some embodiments, the continuous trigger point comprises a sliding operation.
A sliding operation in the touch display area 101 is convenient and fast, which simplifies the user's operation and improves the user experience. In one example, the electronic terminal 1000 includes a touch display screen 1003, and the touch display screen 1003 has the touch display area 101. When the touch display screen 1003 is lit, the touch display area 101 displays the interactive interface 12 so that a user can interact with the electronic terminal 1000. The touch display screen 1003 may be a flexible screen (e.g., an OLED screen) or a capacitive touch screen.
In some embodiments, detecting a trigger event at touch display area 101 includes detecting a trigger point on touch display area 101.
When the touch display area 101 detects the trigger point, the next operation of adjusting the interactive interface 12 can be performed according to the trigger point.
In some embodiments, the interactive interface 12 includes a first area and a second area, and when the trigger event is located in the first area, the trigger event adjusts the trigger condition corresponding to the position of the interactive interface 12; when the trigger event is located in the second area, the trigger event corresponds to a resizing trigger condition of the interactive interface 12.
The first area and the second area are arranged to facilitate the user's operation. The user only needs to press a trigger point in the corresponding area to adjust the interactive interface 12, which facilitates one-handed operation and improves the user experience.
In some embodiments, the first region comprises: a title bar, an area surrounded by a border of the interactive interface 12; the second region includes: the border of the interactive interface 12 or the four corners of the interactive interface 12.
This arrangement facilitates one-handed operation and improves the user experience.
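As an illustration, the following Kotlin sketch classifies a trigger point into the first or second area; the border width used for the test (20 px) is an assumed value and the type names are hypothetical.

```kotlin
// Hypothetical hit test for the two areas; the border width (20 px) is an assumed value.
data class Point(val x: Int, val y: Int)
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)

enum class TriggerCondition { ADJUST_POSITION, ADJUST_SIZE }

fun classify(p: Point, ui: Rect, borderWidth: Int = 20): TriggerCondition? {
    val inside = p.x in ui.left..ui.right && p.y in ui.top..ui.bottom
    if (!inside) return null  // the trigger event does not fall on the interactive interface
    val onBorder = p.x <= ui.left + borderWidth || p.x >= ui.right - borderWidth ||
                   p.y <= ui.top + borderWidth || p.y >= ui.bottom - borderWidth
    // Second area: the border of the interactive interface, including its four corners -> resize.
    if (onBorder) return TriggerCondition.ADJUST_SIZE
    // First area: the title bar and the area enclosed by the border -> move.
    return TriggerCondition.ADJUST_POSITION
}
```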
Referring to fig. 3, in some embodiments, the interface control method includes:
step S16, determining whether the starting point of the trigger event is the movement trigger point 13 in the first area or the zoom trigger point 11 in the second area;
if the starting point of the trigger event is the movement trigger point 13, step S161, determining that the trigger event corresponds to the position adjustment trigger condition of the interactive interface 12;
if the starting point of the trigger event is the zoom trigger point 11, step S162, determining that the trigger event corresponds to the size adjustment trigger condition of the interactive interface 12.
In this manner, the arrangement of the first and second regions may facilitate monitoring whether the user intends to move the interactive interface 12 or to zoom the interactive interface 12, and then perform a size adjustment or a position adjustment of the interactive interface 12 according to the user's intention.
For example, when the movement trigger point 13 in the first area is clicked, the electronic terminal 100 determines that the trigger event is a position adjustment of the interactive interface 12. The position of the interactive interface 12 can then be adjusted according to the moving point at the end of the continuous trigger points, so that the electronic terminal 100 can display the interactive interface 12 at different positions in the touch display area 101 according to the distance by which the user moves it.
For another example, when the zoom trigger point 11 in the second area is clicked, the electronic terminal 100 determines that the trigger event is a size adjustment of the interactive interface 12. The size of the interactive interface 12 can then be adjusted according to the moving point at the end of the continuous trigger points, so that the electronic terminal 100 can adjust the boundary of the interactive interface 12, according to the distance moved by the user, to keep it in proportion to the boundary of the touch display area 101.
Specifically, the moving point mentioned above refers to the trigger point at the end of the continuous trigger points.
Referring to fig. 2, in general, the zoom trigger point 11 in the second area and the movement trigger point 13 in the first area may be located at any position in the touch display area 101, and they may be points that the user commonly and habitually touches when using the touch display area 101. For example, the zoom trigger point 11 in the second area is located on the frame of the touch display area 101, and the movement trigger point 13 in the first area is located within the touch display screen 1003.
The movement trigger point 13 or the zoom trigger point 11 can be triggered by the user's finger pressing or touching the corresponding point on the touch display area 101, or by non-contact proximity sensing over the corresponding point. For example, the user's finger may hover above the zoom trigger point 11 or the movement trigger point 13.
Referring to fig. 4, in some embodiments, step S161 includes:
step S1611, determining an offset of the interactive interface 12 according to the trigger event and the travel distance of the consecutive trigger points;
step S1612, determining a triggered boundary of the interactive interface 12 according to the offset;
step S1613, comparing the triggered boundary of the interactive interface 12 with the boundary of the touch display area 101 to determine whether the triggered boundary of the interactive interface 12 is out of bounds;
if yes, step S1614, perform boundary correction on the interactive interface 12;
if not, in step S1615, the triggered boundary is set as the boundary of the interactive interface 12.
In this way, the boundary of the interactive interface 12 can be corrected, so that the interactive interface 12 can be accurately displayed in the touch display area 101, and the interaction accuracy of the interactive interface 12 is improved.
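A sketch of sub-steps S1611 to S1615 under the same assumed types is given below; the correction simply shifts the triggered boundary back inside the touch display area 101.

```kotlin
// A sketch of sub-steps S1611–S1615 with assumed types; correction shifts the boundary back
// inside the touch display area.
data class Point(val x: Int, val y: Int)
data class Rect(var left: Int, var top: Int, var right: Int, var bottom: Int)

fun adjustPosition(ui: Rect, display: Rect, start: Point, movingPoint: Point) {
    // Step S1611: offset of the interactive interface = travel of the continuous trigger points.
    val dx = movingPoint.x - start.x
    val dy = movingPoint.y - start.y
    // Step S1612: the triggered boundary after applying the offset.
    ui.left += dx; ui.right += dx; ui.top += dy; ui.bottom += dy
    // Steps S1613/S1614: compare with the display boundary and correct any overshoot.
    if (ui.right > display.right) { val d = ui.right - display.right; ui.left -= d; ui.right -= d }
    if (ui.left < display.left) { val d = display.left - ui.left; ui.left += d; ui.right += d }
    if (ui.bottom > display.bottom) { val d = ui.bottom - display.bottom; ui.top -= d; ui.bottom -= d }
    if (ui.top < display.top) { val d = display.top - ui.top; ui.top += d; ui.bottom += d }
    // Step S1615: otherwise the triggered boundary simply becomes the boundary of the interface.
}
```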
Referring to fig. 5 and 6, how to obtain the boundary of the interactive interface 12 and modify the boundary in step S161 will be described below.
Assume the touch display area 101 is rectangular with a resolution of 1080 × 1920, and establish a coordinate system with the upper boundary and the left boundary of the touch display area 101 as the x and y axes. The point A at the upper left corner of the touch display area 101 then has coordinates (0, 0) and the point B at the lower right corner has coordinates (1080, 1920), i.e., x is 1080 and y is 1920, so the touch display area 101 has a length of 1920 and a width of 1080. Assume also that the point A1 at the upper left corner of the interactive interface 12 has coordinates (0, 0) and the point B1 at the lower right corner has coordinates (864, 1536), so the interactive interface 12 has a length of 1536 and a width of 864. Suppose the trigger point corresponding to the trigger event is the movement trigger point 13 with starting point C at (860, 1500), and the continuous trigger points move the starting point C to the right by 100, so the moving point C1 of the movement trigger point 13 is at (960, 1500), as shown in fig. 6. The offset of the moving point C1 from the starting point C is 100 to the right and 0 downward, so the interactive interface 12 is shifted to the right by 100 and downward by 0; the point A2 at the upper left corner of the moved interactive interface 12 is then at (100, 0) and the point B2 at the lower right corner is at (964, 1536). In this way, the horizontal and vertical offsets of the moving point C1 from the starting point C are calculated from the change in coordinates, and the boundary of the moved interactive interface 12 is obtained.
Judging whether the boundary of the interactive interface 12 is out of bounds means that, while the movement trigger point 13 is moved, the coordinates of the moving point C1 are obtained and the offset of the movement is calculated, so as to obtain the coordinates of the upper-left and lower-right corner points of the moved interactive interface 12. These coordinates are compared with the coordinates of the upper-left and lower-right corner points of the touch display area 101; when the corner coordinates of the interactive interface 12 do not lie between the corner coordinates of the touch display area 101, the boundary is out of bounds.
For example, referring to fig. 5 and fig. 7 and taking the above example values, the starting point C is moved to the right by 300, so the moving point C2 of the movement trigger point 13 is at (1160, 1500); the offset of the moving point C2 from the starting point C is 300 to the right and 0 downward, so the entire interactive interface 12 is shifted to the right by 300 and downward by 0. The point A3 at the upper left corner of the moved interactive interface 12 is then at (300, 0) and the point B3 at the lower right corner is at (1164, 1536), so the interactive interface 12 may extend beyond the touch display area 101. Since the x-coordinate 1164 of the right boundary of the interactive interface 12 is greater than the width 1080 of the touch display area 101, the boundary of the interactive interface 12 exceeds the boundary of the touch display area 101; the interactive interface 12 is out of bounds and its boundary needs to be corrected.
For example, referring to fig. 7, the point B3 at the lower right corner of the interactive interface 12 is at (1164, 1536), so the interactive interface 12 is out of bounds: shifted to the right by 300, it exceeds the touch display area 101 by 84. The processing module 103 therefore shifts the moving point C2 to the left by 84 and drives the interactive interface 12 to the left by 84, as shown in fig. 8, so that the interactive interface 12 lies within the touch display area 101. The coordinates of the moving point C3 become (1076, 1500), the point A4 at the upper left corner of the interactive interface 12 becomes (216, 0), and the point B4 at the lower right corner becomes (1080, 1536), so the interactive interface 12 is located in the touch display area 101 without crossing its border. With this correction, the interactive interface 12 is automatically corrected whenever its boundary goes out of bounds, so an out-of-bounds condition does not occur and the interface remains convenient to operate with one hand.
This prevents the interaction accuracy from being reduced when the user moves the interactive interface 12 out of bounds, facilitates one-handed operation, and improves the user experience.
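Using the example values above, a short sketch reproduces the arithmetic of the out-of-bounds case and its correction:

```kotlin
fun main() {
    // Values from the example: display width 1080, interface x-range 0..864, start C x = 860.
    val displayRight = 1080
    var left = 0
    var right = 864
    val dx = 1160 - 860                         // moving point C2 is 300 to the right of C
    left += dx; right += dx                     // triggered boundary: 300..1164 -> out of bounds
    if (right > displayRight) {
        val overshoot = right - displayRight    // 84
        left -= overshoot; right -= overshoot   // corrected boundary: 216..1080
    }
    println("corrected x-range of the interactive interface: $left..$right")  // 216..1080
}
```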
Referring to fig. 9, in some embodiments, step S162 includes:
step S1621, determining a zoom factor of the interactive interface 12 according to the trigger event and the travel distance of the continuous trigger points;
step S1622, determining a triggered boundary of the interactive interface 12 according to the zoom factor;
step S1623, comparing the triggered boundary of the interactive interface 12 with the boundary of the touch display area 101 to determine whether the triggered boundary of the interactive interface 12 is out of range;
if yes, step S1624, perform boundary correction on the interactive interface 12;
if not, in step S1625, the triggered boundary is set as the boundary of the interactive interface 12.
The boundary of the interactive interface 12 is corrected, so that the interactive interface 12 can be accurately displayed in the touch display area 101, and the interaction accuracy of the interactive interface 12 is improved.
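A sketch of sub-steps S1621 to S1625 under assumed names follows; it assumes, as in the example below, that the zoom trigger point starts on the border so the zoom factor is the moving-point abscissa divided by the starting-point abscissa, and it treats a factor outside the preset interval as out of bounds.

```kotlin
// A sketch of sub-steps S1621–S1625 with assumed types; the zoom factor is taken as the
// moving-point abscissa divided by the starting-point abscissa, and a factor outside the
// preset interval counts as out of bounds and is corrected by clamping.
data class Point(val x: Int, val y: Int)
data class Rect(var left: Int, var top: Int, var right: Int, var bottom: Int)

fun adjustSize(ui: Rect, display: Rect, start: Point, movingPoint: Point,
               minZoom: Double = 0.3, maxZoom: Double = 1.0) {
    // Step S1621: zoom factor from the travel of the zoom trigger point.
    var zoomFactor = movingPoint.x.toDouble() / start.x
    // Steps S1623/S1624: out-of-bounds check against the preset interval, with correction.
    zoomFactor = zoomFactor.coerceIn(minZoom, maxZoom)
    // Steps S1622/S1625: the triggered boundary is the display boundary scaled by the factor,
    // keeping the upper-left corner fixed.
    ui.right = ui.left + ((display.right - display.left) * zoomFactor).toInt()
    ui.bottom = ui.top + ((display.bottom - display.top) * zoomFactor).toInt()
}
```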
Referring to fig. 10 to 12, how to obtain the boundary of the interactive interface 12 and how the boundary is corrected in step S162 will be described below.
Assume again that the touch display area 101 is rectangular with a resolution of 1080 × 1920 and that a coordinate system is established with the upper and left boundaries of the touch display area 101 as the x and y axes, so the point A at the upper left corner of the touch display area 101 is at (0, 0) and the point B at the lower right corner is at (1080, 1920); the touch display area 101 has a length of 1920 and a width of 1080. Assume also that the point A1 at the upper left corner of the interactive interface 12 is at (0, 0) and the point B1 at the lower right corner is at (1080, 1920), so the interactive interface 12 likewise has a length of 1920 and a width of 1080. Suppose the starting point D of the zoom trigger point 11 is at (1080, 860) and is moved to the left by 216, so the moving point D1 of the zoom trigger point 11 is at (864, 860), as shown in fig. 10; the offset of the moving point D1 from the starting point D is 216 to the left. The point A2 at the upper left corner of the interactive interface 12 remains at (0, 0) and the point B2 at the lower right corner becomes (864, 1920); dividing 864 by 1080 gives 0.8, so the zoom factor is 0.8. To scale the interactive interface 12 proportionally to the touch display area 101, the unchanged value in the coordinates of the lower-right corner point B2 is also multiplied by 0.8, giving the scaled interactive interface 12 shown in fig. 11: the point A3 at the upper left corner is at (0, 0) and the point B3 at the lower right corner is at (864, 1536). The values at the lower-right corner point B3 of the interactive interface 12 are thus 0.8 times those of the lower-right corner point B of the touch display area 101, i.e., the zoom factor is 0.8. In this way, the offset of the moving point D1 from the starting point D is calculated from the change in coordinates to obtain the zoom factor, and the interactive interface 12 is scaled by that factor. The zoom factor is obtained by dividing the abscissa of the moving point of the zoom trigger point 11 by the abscissa of its starting point, or by dividing the ordinate of the moving point by the ordinate of the starting point.
To avoid enlarging the picture beyond the physical size of the screen or shrinking it so much that the user cannot operate the interactive interface 12, a zoom factor interval is defined; that is, the zoom factor must lie within a preset interval, e.g. [0.3, 1].
Specifically, judging whether the boundary of the interactive interface 12 is out of bounds means that, while the zoom trigger point 11 is moved, the coordinates of the moving point are obtained and the zoom factor is calculated; the zoom factor is compared with the preset interval, and if it falls outside the preset zoom factor interval, the boundary is out of bounds.
For example, referring to fig. 9, fig. 13 and fig. 14 and taking the above values, the starting point D is moved to the left by 864, so the moving point D2 of the zoom trigger point 11 is at (216, 860); the offset of the moving point D2 from the starting point D is 864 to the left. The point A4 at the upper left corner of the interactive interface 12 is at (0, 0) and the point B4 at the lower right corner is at (216, 1920); dividing 216 by 1080 gives 0.2, so the zoom factor is 0.2. Multiplying the unchanged value in the coordinates of the lower-right corner point B4 by 0.2 gives the scaled interactive interface 12 shown in fig. 13: the point A5 at the upper left corner is at (0, 0) and the point B5 at the lower right corner is at (216, 384), i.e., the values at the lower-right corner are 0.2 times those of the lower-right corner point B of the touch display area 101. As can be seen from the figure, with a zoom factor of 0.2 the interactive interface 12 is too small and its boundary is out of bounds: a width of 216 and a length of 384 are too small for the user to operate, so the boundary of the interactive interface 12 needs to be corrected.
For example, referring to fig. 14, the point B5 at the lower right corner of the interactive interface 12 is at (216, 384); the interactive interface 12 is out of bounds because its zoom factor of 0.2 does not lie between 0.3 and 1. The processing module 103 therefore changes the zoom factor to 0.3; referring to fig. 15, the coordinates of the moving point D3 become (324, 860), the point A6 at the upper left corner of the interactive interface 12 is at (0, 0), and the point B6 at the lower right corner becomes (324, 576), so that the interactive interface 12 lies within the touch display area 101 without being out of bounds. With this correction, the interactive interface 12 is automatically corrected whenever its boundary goes out of bounds, so an out-of-bounds condition does not occur and the interface remains convenient to operate with one hand.
This prevents the interaction accuracy from being reduced when the user scales the interactive interface 12 out of bounds, facilitates one-handed operation, and improves the user experience.
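Using the example values above, a short sketch reproduces the zoom-factor arithmetic and the correction from 0.2 to 0.3:

```kotlin
fun main() {
    val displayWidth = 1080
    val displayHeight = 1920
    // Starting point D x = 1080; moving point D2 x = 216 -> zoom factor 0.2, outside [0.3, 1].
    var zoomFactor = 216.0 / displayWidth
    zoomFactor = zoomFactor.coerceIn(0.3, 1.0)                     // corrected to 0.3
    val right = (displayWidth * zoomFactor).toInt()                // 324
    val bottom = (displayHeight * zoomFactor).toInt()              // 576
    println("corrected lower-right corner B6: ($right, $bottom)")  // (324, 576)
}
```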
In some embodiments, a method of controlling an interface includes:
after the size or position adjustment of the interactive interface 12 is completed, the input focus area 1011 of the touch display area 101 is adjusted so that the input focus area 1011 of the interactive interface 12 matches the adjusted input focus area 1011 of the touch display area 101.
Specifically, referring to fig. 16 and 17, in some embodiments, the interface control method includes:
step S17, determining whether the continuous trigger point is interrupted;
if yes, in step S18, the input focus area 1011 of the touch display area 101 is adjusted according to the boundary of the set interactive interface 12 so that the interactive interface 12 matches the input focus area 1011.
Generally, when the continuous trigger points are interrupted, the electrical signal (e.g., a voltage) output by the touch display area 101 drops to the level corresponding to the touch display area 101 not being touched, so whether the continuous trigger points are interrupted can be determined from this electrical signal. For example, when the finger presses the movement trigger point 13 or the zoom trigger point 11, moves to a certain position on the touch display area 101, and is then lifted, the electrical signal output by the touch display area 101 drops to the untouched level, and it can be determined that the continuous trigger points are interrupted. If the finger remains pressed on the touch display area 101 without moving, the output electrical signal is still large, and it can be determined that the continuous trigger points are not interrupted.
After each adjustment, the starting point and the moving point are reacquired and the preceding steps are repeated until the continuous trigger points are interrupted.
When the continuous trigger points are interrupted, the adjusted interactive interface 12 is obtained.
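This repeat-until-interrupted behaviour can be sketched as follows; the TouchSource abstraction and its method are assumptions standing in for the electrical-signal check described above.

```kotlin
// Hypothetical loop for the repeat-until-interrupted behaviour; TouchSource is not part of
// the application and merely stands in for the electrical-signal check.
data class Point(val x: Int, val y: Int)

interface TouchSource {
    // Returns the next moving point, or null once the output signal has fallen back to the
    // untouched level (i.e. the continuous trigger points are interrupted).
    fun nextMovingPoint(): Point?
}

fun trackContinuousTriggerPoints(start: Point, touch: TouchSource, applyAdjustment: (Point, Point) -> Unit) {
    var previous = start
    while (true) {
        val moving = touch.nextMovingPoint() ?: break   // continuous trigger points interrupted
        applyAdjustment(previous, moving)               // adjust the position or size for this step
        previous = moving                               // reacquire the starting point
    }
    // After the interruption, the adjusted interactive interface is final and step S18 follows.
}
```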
Since the size or position of the interactive interface 12 may change after adjustment, the input focus of the adjusted interactive interface 12 needs to be corrected so that the content of the adjusted interactive interface 12 still matches the content of the interactive interface 12 before the adjustment.
Referring to fig. 18, the interface control method includes:
step S181 of determining the boundary of the input focus area 1011 of the touch display area 101;
step S182, determining the boundary of the input focus area 1011 of the adjusted interactive interface 12;
in step S183, the boundary of the input focus area 1011 of the touch display area 101 is adjusted according to the trigger event and the continuous trigger point, so that the boundary of the input focus area 1011 of the touch display area 101 after adjustment matches the boundary of the input focus area 1011 of the interactive interface 12 after adjustment.
This enables the content of the interactive interface 12 after adjustment to be matched with the content of the interactive interface 12 before adjustment.
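A sketch of steps S181 to S183 under assumed names is given below; the focus boundary is transformed with the same zoom factor and offset that were applied to the interactive interface 12.

```kotlin
// A sketch of steps S181–S183 with assumed types: the boundary of the input focus area is
// transformed with the zoom factor and offset applied to the interactive interface.
data class Rect(var left: Int, var top: Int, var right: Int, var bottom: Int)

fun adjustInputFocusArea(focus: Rect, zoomFactor: Double = 1.0, dx: Int = 0, dy: Int = 0) {
    // Steps S181/S182: the focus boundary before adjustment and its target differ only by the
    // zoom factor and offset of the interface adjustment.
    focus.left = (focus.left * zoomFactor).toInt() + dx
    focus.top = (focus.top * zoomFactor).toInt() + dy
    focus.right = (focus.right * zoomFactor).toInt() + dx
    focus.bottom = (focus.bottom * zoomFactor).toInt() + dy
    // Step S183: the adjusted focus boundary now matches the adjusted interactive interface.
}
```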
Referring to fig. 19 to 21, how the adjusted input focus area 1011 of the interactive interface 12 is matched with the input focus area 1011 of the touch display area 101 in step S183 is described below.
Specifically, the following is a description of how to match the input focus area 1011 of the adjusted interactive interface 12 with the input focus area 1011 of the touch display area 101 after the size adjustment of the interactive interface 12.
Referring to fig. 19, assume the touch display area 101 is rectangular with a resolution of 1080 × 1920 and a coordinate system established with its upper and left boundaries as the x and y axes, so the point A at the upper left corner of the touch display area 101 is at (0, 0) and the point B at the lower right corner is at (1080, 1920); the touch display area 101 has a length of 1920 and a width of 1080. Assume the point A1 at the upper left corner of the interactive interface 12 is at (0, 0) and the point B1 at the lower right corner is at (1080, 1920), so the interactive interface 12 also has a length of 1920 and a width of 1080. Assume further that the input focus area 1011 of the touch input area is rectangular, with the point E at its upper left corner at (500, 800) and the point F at its lower right corner at (600, 1060), and that the zoom factor is 0.5. After the adjustment, referring to fig. 20, the point A2 at the upper left corner of the interactive interface 12 is at (0, 0) and the point B2 at the lower right corner is at (540, 960). The boundary of the input focus area 1011 therefore needs to be adjusted: the coordinates of its boundary points are multiplied by the zoom factor 0.5. Referring to fig. 21, the point E1 at the upper left corner of the input focus area 1011 is then at (250, 400) and the point F1 at the lower right corner is at (300, 530), so that the adjusted input focus area 1011 of the interactive interface 12 matches the input focus area 1011 of the touch display area 101, and the interaction accuracy of the interactive interface 12 is improved.
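The worked example above can be reproduced with a few lines:

```kotlin
fun main() {
    // Focus area before adjustment: E = (500, 800), F = (600, 1060); zoom factor 0.5.
    val zoomFactor = 0.5
    val e1 = Pair((500 * zoomFactor).toInt(), (800 * zoomFactor).toInt())    // (250, 400)
    val f1 = Pair((600 * zoomFactor).toInt(), (1060 * zoomFactor).toInt())   // (300, 530)
    println("adjusted input focus area: $e1 to $f1")
}
```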
How to match the input focus area 1011 of the adjusted interactive interface 12 with the input focus area 1011 of the touch display area 101 after the position of the interactive interface 12 is adjusted is similar to that described above, and therefore, the details are not repeated.
Referring to fig. 22, an electronic terminal 100 according to an embodiment of the present disclosure includes:
a touch display area 101 for displaying the interactive interface 12;
a touch monitoring module 102, configured to detect a trigger event in the touch display area 101;
the processing module 103 is configured to determine whether the trigger event corresponds to a position or size adjustment trigger condition of the interactive interface 12; if so, to determine whether continuous trigger points are detected in the touch display area 101 after the trigger event; and, if continuous trigger points are detected in the touch display area 101 after the trigger event, to adjust the position or the size of the interactive interface 12 according to the trigger event and the continuous trigger points;
and a display module 104, configured to display the adjusted interactive interface 12 in the touch display area 101.
The electronic terminal 100 of the embodiment of the application can adjust the position and the size of the interactive interface 12, so that the interactive interface 12 can be zoomed and adjusted in position as required, and the user experience is improved.
The interface control method according to the embodiment of the present application may be applied to the electronic terminal 100. It should be noted that the above explanation of the embodiment and the advantageous effects of the interface control method is also applicable to the electronic terminal 100 of the present embodiment, and is not detailed here to avoid redundancy.
In some embodiments, the touch monitoring module 102 is configured to determine that when the trigger event is located in the first area, the trigger event corresponds to a position adjustment trigger condition of the interactive interface 12; when the trigger event is located in the second area, the trigger event corresponds to a resizing trigger condition of the interactive interface 12.
In some embodiments, when the position of the interactive interface 12 is adjusted according to the trigger event and the continuous trigger points, the processing module 103 is configured to determine an offset of the interactive interface 12 according to the trigger event and the travel distance of the continuous trigger points, determine a triggered boundary of the interactive interface 12 according to the offset, and compare the triggered boundary of the interactive interface 12 with the boundary of the touch display area 101 to determine whether the triggered boundary of the interactive interface 12 is out of bounds; if so, the processing module 103 performs boundary correction on the interactive interface 12, and if not, it sets the triggered boundary as the boundary of the interactive interface 12.
In some embodiments, when the size of the interactive interface 12 is adjusted according to the trigger event and the continuous trigger points, the processing module 103 is configured to determine a zoom factor of the interactive interface 12 according to the trigger event and the travel distance of the continuous trigger points, determine a triggered boundary of the interactive interface 12 according to the zoom factor, and compare the triggered boundary of the interactive interface 12 with the boundary of the touch display area 101 to determine whether the triggered boundary of the interactive interface 12 is out of bounds; if so, the processing module 103 performs boundary correction on the interactive interface 12, and if not, it sets the triggered boundary as the boundary of the interactive interface 12.
Referring to fig. 23, in some embodiments, the electronic terminal 100 further includes:
and the focus correction module 105 is configured to, after the size or position of the interactive interface 12 is adjusted, adjust the input focus area 1011 of the touch display area 101 so that the input focus area 1011 of the interactive interface 12 matches the adjusted input focus area 1011 of the touch display area 101.
The focus correction module 105 is configured to determine a boundary of the input focus area 1011 of the touch display area 101, determine a boundary of the input focus area 1011 of the adjusted interactive interface 12, and adjust the boundary of the input focus area 1011 of the touch display area 101 according to a trigger event so that the boundary of the input focus area 1011 of the adjusted touch display area 101 matches the boundary of the input focus area 1011 of the adjusted interactive interface 12.
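The cooperation of the modules in figs. 22 and 23 can be sketched structurally as follows; the method signatures and wiring are assumptions used only for illustration, not the disclosed implementation.

```kotlin
// Structural sketch of the modules in figs. 22 and 23; method signatures are assumptions.
data class Point(val x: Int, val y: Int)
data class Rect(var left: Int, var top: Int, var right: Int, var bottom: Int)

class TouchMonitoringModule {
    // Detects a trigger event in the touch display area (stubbed: no event).
    fun detectTriggerEvent(): Point? = null
}

class ProcessingModule(private val display: Rect) {
    // Judges the trigger condition, adjusts the boundary and corrects it if out of bounds
    // (see the adjustPosition / adjustSize sketches above); stubbed to return the boundary as-is.
    fun handle(ui: Rect, trigger: Point, continuousPoints: List<Point>): Rect = ui
}

class DisplayModule {
    fun show(ui: Rect) = println("display adjusted interface $ui")
}

class FocusCorrectionModule {
    // Scales the input focus area by the same zoom factor applied to the interface.
    fun correct(focus: Rect, zoomFactor: Double) {
        focus.left = (focus.left * zoomFactor).toInt()
        focus.top = (focus.top * zoomFactor).toInt()
        focus.right = (focus.right * zoomFactor).toInt()
        focus.bottom = (focus.bottom * zoomFactor).toInt()
    }
}

class ElectronicTerminal(
    val touchMonitoring: TouchMonitoringModule,
    val processing: ProcessingModule,
    val display: DisplayModule,
    val focusCorrection: FocusCorrectionModule
)
```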
Referring to fig. 24, the present application provides an electronic terminal 1000, which includes a touch display area 101, a processor 1001 and a memory 1002, wherein the processor 1001 is connected to the touch display area 101, the touch display area 101 is used for displaying an interactive interface 12, and the memory 1002 stores computer readable instructions, and when the instructions are executed by the processor 1001, the processor 1001 executes the interface control method according to the above embodiment.
The electronic terminal 1000 according to the embodiment of the present application can adjust the position and size of the interactive interface 12, so that the interactive interface 12 can be zoomed and adjusted in position as required, thereby improving the user experience. And the input focus is corrected for the adjusted interactive interface 12, so that the triggering of the interactive interface 12 is matched with the content of the interactive interface 12, and the interactive accuracy of the interactive interface 12 is improved.
The interface control method according to the embodiment of the present application may be applied to the electronic terminal 1000. It should be noted that the above explanation of the embodiment and the advantageous effects of the interface control method is also applicable to the electronic terminal 1000 of the present embodiment, and is not detailed here to avoid redundancy. In addition, the electronic terminal includes a touch display screen 1003, and the touch display screen 1003 has a touch display area 101.
Fig. 24 is a schematic diagram of the internal modules of the electronic terminal 1000 in one embodiment. In the electronic terminal 1000, the touch display area 101, the processor 1001, and the memory 1002 (e.g., a nonvolatile storage medium) are connected via a system bus 1004. The memory 1002 stores computer-readable instructions which, when executed by the processor 1001, implement the interface control method of any one of the above embodiments. The processor 1001 provides computing and control capabilities to support the operation of the entire electronic terminal 1000. The electronic terminal 1000 may be a device that supports touch operation, such as a mobile phone or a tablet computer. As will be appreciated by those skilled in the art, the configuration shown in fig. 24 is only a schematic diagram of the part of the configuration related to the present embodiment and does not limit the electronic terminal 1000 to which the present embodiment is applied; a specific electronic terminal 1000 may include more or fewer components than shown in the figure, combine some components, or have a different arrangement of components.
In the description herein, references to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and the scope of the preferred embodiments of the present application includes additional implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection (an electronic device) having one or more wires, a portable computer diskette (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be captured electronically, for instance via optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented by any one of, or a combination of, the following techniques known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits with suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and so forth.
It will be understood by those skilled in the art that all or part of the steps carried out in the above method may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and the program, when executed, includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (15)

  1. An interface control method is applied to an electronic terminal, the electronic terminal comprises a touch display area, and an interactive interface is displayed in the touch display area, and the interface control method is characterized by comprising the following steps:
    detecting a trigger event at the touch display area;
    judging whether the trigger event corresponds to a position or size adjustment trigger condition of the interactive interface;
    if so, judging whether continuous trigger points are detected in the touch display area after the trigger event;
    if so, adjusting the position or size of the interactive interface according to the trigger event and the continuous trigger points; and
    displaying the adjusted interactive interface in the touch display area.
  2. The method for controlling an interface of claim 1, wherein said detecting a trigger event on said touch display area comprises detecting a trigger point on said touch display area.
  3. The interface control method of claim 1, wherein the interactive interface includes a first area and a second area, and when the trigger event is located in the first area, the trigger event adjusts a trigger condition corresponding to a position of the interactive interface; and when the trigger event is located in the second area, the trigger event corresponds to the size adjustment trigger condition of the interactive interface.
  4. The method of controlling an interface of claim 3, wherein the first area comprises: a title bar and an area surrounded by a frame of the interactive interface; the second region includes: the frame of the interactive interface or the four corners of the interactive interface.
  5. The interface control method of claim 3, wherein, when the trigger event is located in the first area, adjusting the position of the interactive interface according to the trigger event and the continuous trigger points comprises:
    determining an offset of the interactive interface according to the travel distance of the trigger event and the continuous trigger points;
    determining a triggered boundary of the interactive interface according to the offset;
    comparing the triggered boundary of the interactive interface with the boundary of the touch display area to determine whether the triggered boundary of the interactive interface is out of range;
    if so, carrying out boundary correction on the interactive interface;
    and if not, setting the triggered boundary as the boundary of the interactive interface.
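As a purely illustrative sketch (not part of the claims), the offset-and-boundary-correction steps of claim 5 might look as follows, assuming the offset is the displacement between the trigger event and the last continuous trigger point, and that boundary correction simply pushes the moved interface back inside the touch display area. Rect and move_interface are hypothetical names.

```python
# Illustrative sketch of claim 5: derive an offset from the travel of the continuous
# trigger points, move the interface boundary, and correct it if it leaves the
# touch display area. All names are assumptions of this sketch.
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

def move_interface(win: Rect, display: Rect, start: tuple, end: tuple) -> Rect:
    # Offset = displacement between the trigger event and the last continuous trigger point.
    dx, dy = end[0] - start[0], end[1] - start[1]
    moved = Rect(win.left + dx, win.top + dy, win.right + dx, win.bottom + dy)
    # Boundary correction: push the moved boundary back inside the display area.
    if moved.left < display.left:
        fix = display.left - moved.left
        moved.left += fix; moved.right += fix
    if moved.right > display.right:
        fix = moved.right - display.right
        moved.left -= fix; moved.right -= fix
    if moved.top < display.top:
        fix = display.top - moved.top
        moved.top += fix; moved.bottom += fix
    if moved.bottom > display.bottom:
        fix = moved.bottom - display.bottom
        moved.top -= fix; moved.bottom -= fix
    return moved

if __name__ == "__main__":
    display = Rect(0, 0, 1080, 1920)
    win = Rect(100, 200, 600, 900)
    # Drag from (300, 300) to (900, 250): the window would cross the right edge and is corrected.
    print(move_interface(win, display, (300, 300), (900, 250)))
```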
  6. The interface control method of claim 3, wherein, when the trigger event is located in the second area, adjusting the size of the interactive interface according to the trigger event and the continuous trigger points comprises:
    determining a scaling factor of the interactive interface according to the travel distance of the trigger event and the continuous trigger points;
    determining a triggered boundary of the interactive interface according to the scaling factor;
    comparing the triggered boundary of the interactive interface with the boundary of the touch display area to determine whether the triggered boundary of the interactive interface is out of range;
    if so, carrying out boundary correction on the interactive interface;
    and if not, setting the triggered boundary as the boundary of the interactive interface.
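As a purely illustrative sketch (not part of the claims), claim 6 could be realized as below, assuming the scaling factor is derived from how far a bottom-right corner drag travels relative to the current size and that boundary correction clamps the scaled boundary to the touch display area. Rect and resize_interface are hypothetical names.

```python
# Illustrative sketch of claim 6: derive a scaling factor from the travel of the
# continuous trigger points, rescale the interface, and correct the boundary if it
# exceeds the touch display area. All names are assumptions of this sketch.
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

    def width(self) -> float:
        return self.right - self.left

    def height(self) -> float:
        return self.bottom - self.top

def resize_interface(win: Rect, display: Rect, start: tuple, end: tuple) -> Rect:
    # Scaling factor: how far the dragged corner travelled, measured from the
    # top-left anchor of the interface relative to its current extent.
    scale_x = (end[0] - win.left) / max(start[0] - win.left, 1.0)
    scale_y = (end[1] - win.top) / max(start[1] - win.top, 1.0)
    scale = max(scale_x, scale_y)
    resized = Rect(win.left, win.top,
                   win.left + win.width() * scale,
                   win.top + win.height() * scale)
    # Boundary correction: clamp the scaled boundary to the display area.
    resized.right = min(resized.right, display.right)
    resized.bottom = min(resized.bottom, display.bottom)
    return resized

if __name__ == "__main__":
    display = Rect(0, 0, 1080, 1920)
    win = Rect(100, 100, 600, 600)
    # Drag the bottom-right corner from (600, 600) to (1200, 1000): the width would
    # exceed the display, so the right boundary is corrected to 1080.
    print(resize_interface(win, display, (600, 600), (1200, 1000)))
```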
  7. The interface control method of claim 1, further comprising:
    after the size or position of the interactive interface is adjusted, adjusting an input focus area of the touch display area so that the input focus area of the interactive interface matches the adjusted input focus area of the touch display area.
  8. The interface control method of claim 7, further comprising:
    determining a boundary of an input focus area of the touch display area;
    determining the boundary of the input focus area of the adjusted interactive interface;
    and adjusting the boundary of the input focus area of the touch display area according to the trigger event and the continuous trigger points, so that the adjusted boundary of the input focus area of the touch display area matches the adjusted boundary of the input focus area of the interactive interface.
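As a purely illustrative sketch (not part of the claims), claims 7 and 8 only require the input focus area of the touch display area to be re-aligned with the adjusted interactive interface; one deliberately simple interpretation is to set the display's input focus boundary to the adjusted interface boundary, as below. FocusState and its fields are hypothetical.

```python
# Hypothetical sketch for claims 7-8: after the interactive interface is moved or
# resized, the boundary of the display's input focus area is updated to match the
# adjusted interface boundary. All names are assumptions of this sketch.
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

@dataclass
class FocusState:
    display_focus: Rect      # boundary of the input focus area of the touch display area
    interface_focus: Rect    # boundary of the input focus area of the interactive interface

def realign_focus(state: FocusState, adjusted_interface: Rect) -> FocusState:
    """Match the display's input focus boundary to the adjusted interface boundary."""
    state.interface_focus = adjusted_interface
    state.display_focus = Rect(adjusted_interface.left, adjusted_interface.top,
                               adjusted_interface.right, adjusted_interface.bottom)
    return state

if __name__ == "__main__":
    old = Rect(100, 200, 600, 900)
    adjusted = Rect(580, 150, 1080, 850)        # e.g. the corrected position from the claim-5 sketch
    state = realign_focus(FocusState(old, old), adjusted)
    print(state.display_focus)                   # now matches the adjusted interface
```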
  9. An electronic terminal, comprising:
    a touch display area configured to display an interactive interface;
    a touch monitoring module configured to detect a trigger event in the touch display area;
    a processing module configured to judge whether the trigger event corresponds to a position or size adjustment trigger condition of the interactive interface, to judge, if so, whether continuous trigger points are detected in the touch display area after the trigger event, and, if continuous trigger points are detected in the touch display area after the trigger event, to adjust the position or size of the interactive interface according to the trigger event and the continuous trigger points; and
    a display module configured to display the adjusted interactive interface in the touch display area.
  10. The electronic terminal of claim 9, wherein the touch monitoring module is configured to determine that the trigger event corresponds to the position adjustment trigger condition of the interactive interface when the trigger event is located in a first area of the interactive interface, and that the trigger event corresponds to the size adjustment trigger condition of the interactive interface when the trigger event is located in a second area of the interactive interface.
  11. The electronic terminal of claim 9, wherein, when the position of the interactive interface is adjusted according to the trigger event and the continuous trigger points, the processing module is configured to determine an offset of the interactive interface according to the travel distance of the trigger event and the continuous trigger points, determine a triggered boundary of the interactive interface according to the offset, compare the triggered boundary of the interactive interface with the boundary of the touch display area to determine whether the triggered boundary of the interactive interface is out of bounds, perform boundary correction on the interactive interface if so, and set the triggered boundary as the boundary of the interactive interface if not.
  12. The electronic terminal of claim 9, wherein, when the size of the interactive interface is adjusted according to the trigger event and the continuous trigger points, the processing module is configured to determine a scaling factor of the interactive interface according to the travel distance of the trigger event and the continuous trigger points, determine a triggered boundary of the interactive interface according to the scaling factor, compare the triggered boundary of the interactive interface with the boundary of the touch display area to determine whether the triggered boundary of the interactive interface is out of bounds, perform boundary correction on the interactive interface if so, and set the triggered boundary as the boundary of the interactive interface if not.
  13. The electronic terminal of claim 11 or 12, further comprising:
    a focus correction module configured to adjust the input focus area of the touch display area after the size or position of the interactive interface is adjusted, so that the input focus area of the interactive interface matches the adjusted input focus area of the touch display area.
  14. The electronic terminal of claim 13, wherein the focus correction module is configured to determine a boundary of an input focus area of the touch display area, determine a boundary of an input focus area of the adjusted interactive interface, and adjust the boundary of the input focus area of the touch display area according to the trigger event so that the adjusted boundary of the input focus area of the touch display area matches the adjusted boundary of the input focus area of the interactive interface.
  15. An electronic terminal, comprising a touch display area configured to display an interactive interface, a processor electrically connected to the touch display area, and a memory storing computer-readable instructions that, when executed by the processor, cause the processor to perform the interface control method of any one of claims 1 to 8.
CN201880096005.5A 2018-10-29 2018-10-29 Interface control method and electronic terminal Pending CN112740166A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/112469 WO2020087218A1 (en) 2018-10-29 2018-10-29 Interface control method and electronic terminal

Publications (1)

Publication Number Publication Date
CN112740166A (en)

Family

ID=70464222

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880096005.5A Pending CN112740166A (en) 2018-10-29 2018-10-29 Interface control method and electronic terminal

Country Status (2)

Country Link
CN (1) CN112740166A (en)
WO (1) WO2020087218A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111782103B (en) * 2020-07-15 2022-02-08 网易(杭州)网络有限公司 Method, device, equipment and medium for adjusting position of interactive control

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2652433C2 * 2013-12-26 2018-04-26 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Terminal operating method and terminal
CN104932821A (en) * 2015-06-02 2015-09-23 青岛海信移动通信技术股份有限公司 Display method of operation interface of intelligent terminal and intelligent terminal
CN105867715A (en) * 2015-10-30 2016-08-17 乐视移动智能信息技术(北京)有限公司 Interface display processing method and apparatus as well as terminal device

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102479041A (en) * 2010-11-25 2012-05-30 英业达股份有限公司 Operation method for resizing picture on small touch screen by one hand
CN102981596A (en) * 2012-12-21 2013-03-20 东莞宇龙通信科技有限公司 Terminal and screen interface display method
CN104298433A (en) * 2014-09-30 2015-01-21 小米科技有限责任公司 Screen display method, device and mobile terminal
CN104461232A (en) * 2014-09-30 2015-03-25 小米科技有限责任公司 Method and device for determining reduction scale in screen display process
WO2017107715A1 (en) * 2015-12-25 2017-06-29 珠海格力电器股份有限公司 Method and apparatus for controlling one-hand operation mode of terminal
US20170212631A1 (en) * 2016-01-25 2017-07-27 Lg Electronics Inc. Mobile terminal for one-hand operation mode of controlling paired device, notification and application
CN106527860A (en) * 2016-11-07 2017-03-22 上海与德信息技术有限公司 Screen interface display method and device
CN106445354A (en) * 2016-11-24 2017-02-22 北京小米移动软件有限公司 Terminal equipment touch control method and terminal equipment touch control device
CN108696638A (en) * 2018-05-10 2018-10-23 维沃移动通信有限公司 A kind of control method and mobile terminal of mobile terminal

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115098205A (en) * 2022-06-17 2022-09-23 来也科技(北京)有限公司 Control method for realizing IA flow editing interface based on RPA and AI

Also Published As

Publication number Publication date
WO2020087218A1 (en) 2020-05-07

Similar Documents

Publication Publication Date Title
US20160004373A1 (en) Method for providing auxiliary information and touch control display apparatus using the same
US8723988B2 (en) Using a touch sensitive display to control magnification and capture of digital images by an electronic device
EP2754025B1 (en) Pinch to adjust
CN104246681B (en) Information processing unit, information processing method and program
US10296139B2 (en) Refreshing method of sensing baseline values for capacitive sensor device and capacitive sensor device
CN107980158B (en) Display control method and device of flexible display screen
US10802704B2 (en) Gesture control method, apparatus, terminal device, and storage medium
US11693544B2 (en) Mobile terminal display picture control method, apparatus, and device and storage medium
CN110286840B (en) Gesture zooming control method and device of touch equipment and related equipment
CN105487775A (en) Touch screen control method and mobile terminal
US20090135152A1 (en) Gesture detection on a touchpad
AU2015202763B2 (en) Glove touch detection
US10296130B2 (en) Display control apparatus, display control method, and storage medium storing related program
CN112740166A (en) Interface control method and electronic terminal
US10353569B2 (en) Crop frame adjusting method, image processing device, and non-transitory computer readable storage medium
US20140089845A1 (en) Apparatus and method capable of switching displayed pictures
US20140078082A1 (en) Operating method of electronic device
US9632697B2 (en) Information processing apparatus and control method thereof, and non-transitory computer-readable medium
JP2015138360A (en) System, control program, and control method for object manipulation
US20120068958A1 (en) Portable electronic device and control method thereof
US20150281585A1 (en) Apparatus Responsive To At Least Zoom-In User Input, A Method And A Computer Program
TWI462000B (en) Signal process methods for a touch panel and touch panel systems
KR101403079B1 (en) method for zooming in touchscreen and terminal using the same
CN108475166B (en) Information processing apparatus, control method therefor, and program
US10983686B2 (en) Display control apparatus equipped with touch panel, control method therefor, and storage medium storing control program therefor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210430