CN110286840B - Gesture zooming control method and device of touch equipment and related equipment - Google Patents
- Publication number
- CN110286840B (granted publication of application CN201910555666.6A)
- Authority
- CN
- China
- Prior art keywords
- target element
- state
- boundary
- zooming
- determining
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures, for inputting data by handwriting, e.g. gesture or text
- G06F9/451—Execution arrangements for user interfaces
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a gesture zoom control method and apparatus of a touch device, and related equipment. The gesture zoom control method of the touch device comprises the following steps: determining a target element on a display page; zooming the target element out or in according to a gesture zoom operation on the display page; determining the state of the target element in the process of reducing or enlarging it, or when the reduction or enlargement ends; and when the state of the target element reaches a boundary state, triggering a boundary operation according to the boundary state. The method can effectively limit how far an element may be scaled, trigger a boundary operation in the boundary state so that the element can be processed further, and improve operability and operation efficiency in actual scene applications.
Description
[ Technical Field ]
The present invention relates to the field of gesture control, and in particular, to a gesture zooming control method and apparatus for a touch device, and a related device.
[ background of the invention ]
When an element of a page displayed on a touch-enabled device is zoomed in or out, it often happens that the element is reduced until it is extremely small (the smallest display size of the element on the displayed page), or enlarged until it extends completely beyond the screen. Neither situation is practical: the scaling of elements is not reasonably controlled, and operation efficiency in actual scene applications is low.
[ summary of the invention ]
In view of this, embodiments of the present invention provide a gesture zoom control method and apparatus for a touch device, and related equipment, to solve the problem that current gesture zoom control operates with low efficiency when applied in actual scenarios.
In a first aspect, an embodiment of the present invention provides a gesture zoom control method for a touch-enabled device, including:
determining a target element on a display page;
zooming out or magnifying the target element according to the gesture zooming operation on the display page;
determining the state of the target element in the process of reducing or enlarging the target element or at the end of reducing or enlarging the target element;
and when the state of the target element reaches a boundary state, triggering boundary operation according to the boundary state.
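The four claimed steps can be sketched as a single routine. This is a hypothetical illustration, not the patent's implementation; all names, the area-based state test, and the delete/copy boundary operations are illustrative assumptions drawn from later embodiments:

```python
# Hypothetical end-to-end sketch of the four claimed steps; the area-based
# state test and the delete/copy boundary operations are illustrative
# assumptions, not the patent's actual implementation.
def gesture_zoom_control(elements, selected, scale_factor,
                         min_area=400, max_area=2048 * 1536):
    # Step 1: determine the target elements on the display page.
    targets = selected if selected else elements
    triggered = []
    for element in targets:
        # Step 2: zoom out or in per the gesture (linear factor -> area factor).
        element["area"] *= scale_factor ** 2
        # Step 3: determine the element's state during/after zooming.
        # Step 4: trigger the boundary operation for the boundary state.
        if element["area"] <= min_area:
            triggered.append(("delete", element["id"]))
        elif element["area"] >= max_area:
            triggered.append(("copy", element["id"]))
    return triggered
```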
In the above aspect and any possible implementation, the boundary state includes a boundary reduction state and/or a boundary amplification state.
The above aspect and any possible implementation manner further provide an implementation manner, where the determining a target element on a display page includes:
if the selected elements exist on the display page, taking the selected elements as target elements;
and if the selected elements do not exist on the display page, taking the elements on the display page as target elements.
The above aspect and any possible implementation manner further provide an implementation manner, where determining the state of the target element and determining whether the target element reaches a boundary state includes:
determining a state of the target element based on an area of the target element, wherein,
when the area of the target element is smaller than or equal to a first preset value, determining that the state of the target element reaches the boundary reduction state;
when the area of the target element is larger than or equal to a second preset value, determining that the state of the target element reaches the boundary amplification state;
wherein the first preset value is smaller than the second preset value.
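A minimal sketch of this area-based test (hypothetical; the default thresholds reuse the 20 × 20 and 2048 × 1536 example resolutions given later in the description):

```python
def element_boundary_state(area, min_area=20 * 20, max_area=2048 * 1536):
    """Classify a target element's state from its pixel area.

    min_area / max_area play the role of the first/second preset values;
    the defaults are the example resolutions from the description.
    """
    if area <= min_area:
        return "boundary-reduction"
    if area >= max_area:
        return "boundary-amplification"
    return "normal"
```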
The above aspect and any possible implementation manner further provide an implementation manner, where determining the state of the target element and determining whether the target element reaches a boundary state includes:
determining a state of the target element according to an area change ratio of the target element, wherein the area change ratio of the target element is a ratio of a current area of the target element to an original area of the target element,
when the area change ratio of the target element is smaller than or equal to a first preset ratio, determining that the state of the target element reaches the boundary reduction state;
when the area change ratio of the target element is greater than or equal to a second preset ratio, determining that the state of the target element reaches the boundary amplification state;
wherein the first preset ratio is smaller than the second preset ratio.
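The ratio-based variant can be sketched similarly (hypothetical; the 0.05 and 4.0 default ratios are illustrative assumptions, not values from the patent):

```python
def ratio_boundary_state(current_area, original_area,
                         min_ratio=0.05, max_ratio=4.0):
    """Classify state by the area change ratio (current / original area)."""
    ratio = current_area / original_area
    if ratio <= min_ratio:
        return "boundary-reduction"
    if ratio >= max_ratio:
        return "boundary-amplification"
    return "normal"
```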
The above aspect and any possible implementation manner further provide an implementation manner, where determining the state of the target element and determining whether the target element reaches a boundary state includes:
determining a state of the target element according to an average distance between touch points of the finger in contact with the touch screen, wherein,
when the average distance between the touch points of the finger and the touch screen is smaller than or equal to a third preset numerical value, determining that the state of the target element reaches the boundary reduction state;
when the average distance between the touch points of the finger and the touch screen is larger than or equal to a fourth preset numerical value, determining that the state of the target element reaches the boundary amplification state;
wherein the third preset value is smaller than the fourth preset value.
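The average distance between finger touch points can be computed as the mean pairwise distance, as in this hypothetical sketch (the 30 px and 600 px thresholds are illustrative assumptions):

```python
from itertools import combinations
from math import dist

def average_touch_distance(points):
    """Mean pairwise Euclidean distance between touch points (needs >= 2)."""
    pairs = list(combinations(points, 2))
    return sum(dist(a, b) for a, b in pairs) / len(pairs)

def touch_boundary_state(points, min_dist=30.0, max_dist=600.0):
    # min_dist / max_dist play the role of the third/fourth preset values.
    avg = average_touch_distance(points)
    if avg <= min_dist:
        return "boundary-reduction"
    if avg >= max_dist:
        return "boundary-amplification"
    return "normal"
```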
The above aspect and any possible implementation manner further provide an implementation manner, where determining the state of the target element and determining whether the target element reaches a boundary state includes:
determining the state of the target element according to a change ratio of the average distance between the touch points of the fingers on the touch screen, wherein the change ratio of the average distance is the ratio of the current average distance between the touch points to the original average distance between the touch points,
when the variation ratio of the average distance between the touch points of the finger and the touch screen is smaller than or equal to a third preset ratio, determining that the state of the target element reaches the boundary reduction state;
when the variation ratio of the average distance between the touch points of the finger and the touch screen is larger than or equal to a fourth preset ratio, determining that the state of the target element reaches the boundary amplification state;
wherein the third preset ratio is smaller than the fourth preset ratio.
The above-mentioned aspect and any possible implementation manner further provide an implementation manner, where the boundary operation includes an element deletion operation and/or an element copy operation, and when the state of the target element reaches a boundary state, triggering a boundary operation according to the boundary state includes:
when the target element reaches the boundary reduction state, triggering an element deletion operation according to the boundary reduction state; and/or
when the target element reaches the boundary amplification state, triggering an element copy operation according to the boundary amplification state and copying the target element to a page having a preset jump relation, where the page includes a newly created page.
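A hypothetical sketch of this dispatch, with the page modeled as a dict (the `jump_page` key and all names are illustrative assumptions):

```python
def trigger_boundary_operation(state, page, target):
    """Map a boundary state to its operation (deletion vs. copy-to-page)."""
    if state == "boundary-reduction":
        page["elements"].remove(target)          # element deletion operation
        return "deleted"
    if state == "boundary-amplification":
        # Copy to a page with a preset jump relation (e.g. a new page).
        linked = page.setdefault("jump_page", {"elements": []})
        linked["elements"].append(dict(target))  # element copy operation
        return "copied"
    return "none"
```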
The above-described aspect and any possible implementation manner further provide an implementation manner in which the target element is reduced or enlarged according to a gesture zoom operation on the display page, and the method includes:
and executing zooming operation on the target element according to the received gesture zooming operation information, wherein the zooming operation comprises zooming-out operation and zooming-in operation.
The above-described aspect and any possible implementation further provide an implementation, where the determining a state of the target element is performed during the zooming in or out of the target element, or at the end of zooming in or out of the target element, and the method includes:
and when the target element is subjected to the zooming operation or after the target element is subjected to the zooming operation, judging whether the target element reaches a boundary state or not.
In a second aspect, an embodiment of the present invention provides a gesture zoom control apparatus for a touch-enabled device, including:
the target element determining module is used for determining a target element on the display page;
the zooming module is used for zooming the target element according to the received gesture zooming operation information;
a boundary state determining module, configured to determine whether the target element reaches a boundary state during zooming of the target element or when zooming of the target element is finished, where the boundary state includes a boundary zooming-out state and a boundary zooming-in state;
and the boundary operation triggering module is used for triggering boundary operation according to the boundary state when the target element reaches the boundary state.
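The four claimed modules might be grouped into one controller class, as in this hypothetical sketch (names, defaults, and the area-based test are illustrative assumptions):

```python
class GestureZoomController:
    """Hypothetical sketch mirroring the claimed apparatus modules."""

    def __init__(self, min_area=400, max_area=2048 * 1536):
        self.min_area, self.max_area = min_area, max_area

    def determine_target(self, elements, selected):
        return selected or elements            # target element determining module

    def zoom(self, element, factor):           # zooming module
        element["area"] *= factor ** 2
        return element

    def boundary_state(self, element):         # boundary state determining module
        if element["area"] <= self.min_area:
            return "boundary-reduction"
        if element["area"] >= self.max_area:
            return "boundary-amplification"
        return "normal"

    def trigger(self, element):                # boundary operation triggering module
        return {"boundary-reduction": "delete",
                "boundary-amplification": "copy"}.get(
                    self.boundary_state(element), "none")
```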
In a third aspect, an embodiment of the present invention provides a gesture zoom control method for a touch-enabled device, including:
determining a target element on a display page;
performing zooming operation on the target element according to the received gesture zooming operation information, wherein the zooming operation comprises zooming-out operation and zooming-in operation;
when the target element is subjected to the zooming operation or after the target element is subjected to the zooming operation, judging whether the target element reaches a boundary state;
and if the state of the target element reaches the boundary state, triggering boundary operation according to the boundary state.
In a fourth aspect, an embodiment of the present invention provides a gesture zoom control apparatus for a touch-enabled device, including:
the determining module is used for determining a target element on the display page;
the execution module is used for executing zooming operation on the target element according to the received gesture zooming operation information, wherein the zooming operation comprises zooming-out operation and zooming-in operation;
a boundary state determining module, configured to determine whether the target element reaches a boundary state when the target element is zoomed or after the target element is zoomed;
and the triggering module is used for triggering boundary operation according to the boundary state if the state of the target element reaches the boundary state.
In a fifth aspect, an embodiment of the present invention provides a computer device including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the steps of the gesture zoom control method of the touch-enabled device of the first aspect or of the third aspect.
In a sixth aspect, an embodiment of the present invention provides a computer-readable storage medium, which includes a computer program, and the computer program, when executed by a processor, implements the steps of the gesture zoom control method of the touch-enabled device in the first aspect, or the computer program, when executed by the processor, implements the steps of the gesture zoom control method of the touch-enabled device in the third aspect.
In the gesture zoom control method of the touch-control device in the first aspect of the present invention, a boundary-state judgment is made on the target element during zooming, and when the target element reaches the boundary state, a boundary operation may be triggered according to that state. This solves the prior-art problem that, when a target element on a touch-control device is zoomed, it is often reduced until extremely small or enlarged until completely beyond the screen. The method can effectively limit the scaling limit of the target element, trigger a boundary operation in the boundary state, process the target element further, and improve operability and operation efficiency in actual scene applications.
In the gesture zoom control method of the touch-enabled device in the third aspect of the present invention, by determining whether the target element reaches the boundary state after performing the zoom operation, and triggering the boundary operation when the target element reaches the boundary state, it is possible to solve the problem that the target element is often reduced to a minimum or enlarged to a point completely beyond the screen range when the target element on the touch-enabled device is zoomed in the prior art. The method can effectively limit the limit of the scaling of the target element, trigger the boundary operation in the boundary state, further process the target element, and improve the operability and the operation efficiency in the actual scene application.
[ description of the drawings ]
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive labor.
FIG. 1 is a flowchart illustrating a gesture zoom control method for a touch-enabled device according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating a distance representation between touch points when multi-finger zooming is employed according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a gesture zoom control apparatus of a touch-enabled device according to an embodiment of the present invention;
FIG. 4 is another flowchart of a gesture zoom control method of a touch-enabled device according to an embodiment of the invention;
FIG. 5 is another schematic diagram of a gesture zoom control apparatus of a touch-enabled device according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a computer device according to an embodiment of the invention.
[ Detailed Description ]
For better understanding of the technical solutions of the present invention, the following detailed descriptions of the embodiments of the present invention are provided with reference to the accompanying drawings.
It should be understood that the described embodiments are only some embodiments of the invention, and not all embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the description of the invention and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" as used herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may indicate that A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
It should be understood that although the terms first, second, third, etc. may be used to describe preset ranges, etc. in embodiments of the present invention, these preset ranges should not be limited to these terms. These terms are only used to distinguish preset ranges from each other. For example, a first preset range may also be referred to as a second preset range, and similarly, a second preset range may also be referred to as a first preset range, without departing from the scope of embodiments of the present invention.
The word "if" as used herein may be interpreted as "when" or "upon" or "in response to determining" or "in response to detecting", depending on the context. Similarly, the phrase "if it is determined" or "if (a stated condition or event) is detected" may be interpreted as "upon determining" or "in response to determining" or "upon detecting (the stated condition or event)" or "in response to detecting (the stated condition or event)", depending on the context.
Fig. 1 is a flowchart illustrating a gesture zoom control method of a touch-enabled device according to an embodiment of the present invention. The method is applied to a touch-enabled device, i.e., a device that supports human-computer interaction with a user and has a touch function, including but not limited to a touch mobile phone, a touch tablet, and a touch all-in-one machine. As shown in Fig. 1, the gesture zoom control method includes the following steps:
s10: a target element on a display page is determined.
It is understood that the touchable device performs a page display through the touchable screen, and the displayed page is referred to as a display page. The display page has one or more elements, such as fonts, images, and controls (e.g., buttons, check boxes, and progress bars) on the page.
In one embodiment, before performing the gesture zoom operation, an element to be zoomed, that is, the target element, needs to be determined. Specifically, the target element may be determined by a user's selection operation (a closed area formed by connecting touch tracks) on the touch screen, where the selection operation may be performed multiple times. Further, when determining the target element on the display page, the target element may be determined by means of voice control, eye tracking, and the like, in addition to the manner of receiving the touch information, which is not limited herein.
S20: zooming out or zooming in the target element according to the gesture on the display page.
In one embodiment, the touchable device zooms the target element out or in according to the gesture on the displayed page. Specifically, whether to zoom out or in may be determined from the zoom-operation information input by the user, and the touchable device executes the corresponding target-element zoom-out or zoom-in instruction according to that information.
Further, in step S20, zooming in or zooming out the target element according to the gesture on the display page specifically includes:
and zooming the target element according to the received gesture zooming operation information.
The gesture zooming operation information is operation information received by the touch device when the user performs the gesture zooming operation, such as zooming size, zooming speed, initial zooming coordinates, ending zooming coordinates and the like. The gesture zooming operation information is related to the target element and is used for zooming the target element.
In an embodiment, the touch device receives gesture zoom operation information through the touch screen and generates a corresponding target-element zoom-out or zoom-in instruction according to that information, so as to perform the zoom operation on the target element; the zoom result of the target element is then displayed on the touch screen.
S30: the state of the target element is determined during the reduction or enlargement of the target element or at the end of the reduction or enlargement of the target element.
Wherein the state of the target element includes a normal zoom state and a boundary state. When the target element is in a normal zooming state, the target element is zoomed normally, and when the target element is in a boundary state or reaches the boundary state, corresponding boundary operation is triggered according to the boundary state. It can be understood that, during the process of zooming the target element or at the end of zooming, if the judgment condition of the boundary state is satisfied, the target element is considered to reach the boundary state, at this time, after the target element reaches the boundary state, a corresponding boundary operation will be triggered according to the boundary state, where the judgment condition of satisfying the boundary state will be described in the following embodiments.
In one embodiment, while the user zooms with a gesture, the touchable device synchronously displays the zoom result on the touch screen, and the current state of the target element may be determined at either of two stages. One option is to judge the state of the target element during the zooming process, so the state is evaluated synchronously as the element is zoomed; the other is to judge the state only when zooming ends, that is, when the user's finger leaves the touch screen. Each mode has its own advantages and can be configured flexibly for the specific scenario.
Further, in step S30, during the reduction or enlargement of the target element, or at the end of the reduction or enlargement of the target element, determining the state of the target element includes:
when the target element is subjected to the zooming operation, or after the target element is subjected to the zooming operation, whether the target element reaches the boundary state is judged.
It will be appreciated that the purpose of determining the state of the target element in this implementation is to determine whether the target element has reached a boundary state. In an embodiment, the touch-controllable device executes a corresponding zooming operation according to gesture zooming operation information received in real time, and if it is preset by a user that the state of a current target element is determined in real time in the zooming process of the touch-controllable device, when the touch-controllable device executes the zooming operation, whether the target element reaches a boundary state is judged in real time; if the user presets that the state of the target element at the end time is determined when the zooming operation of the touch-controllable device is finished, after the zooming operation of the touch-controllable device is finished, whether the target element reaches the boundary state or not is judged at the end time. In the implementation, whether the target element reaches the boundary state can be judged quickly and accurately according to the executed zooming operation.
S40: and when the state of the target element reaches the boundary state, triggering boundary operation according to the boundary state.
Wherein, the boundary operation refers to an operation performed when the target element reaches a boundary state.
In one embodiment, when the target element reaches the boundary state, the boundary operation corresponding to the boundary state is triggered, wherein the boundary state may be one or more, and the boundary operations corresponding to different types of boundary states may be different.
Further, the boundary state includes a boundary reduction state and/or a boundary amplification state. In one embodiment, the target element may have a boundary state when zoomed out and also when zoomed in. It can be understood that zoom operations on a target element in an actual scene are often unconstrained. Reducing the target element to a very small resolution has no practical value: if the target element is a picture, the user cannot obtain effective information from it at such a small resolution, and if the target element is a control, the user cannot operate it accurately. Likewise, enlarging the target element to a very large resolution has no practical value. In addition, when the target element is at a very small or very large zoom size, restoring it to a normal size is troublesome and may interfere with the user's other operations. In this embodiment, setting the boundary reduction state and/or boundary amplification state improves the practicality of scaling the target element and raises operation efficiency.
Further, in step S10, determining a target element on the display page specifically includes:
and if the selected elements exist on the display page, taking the selected elements as target elements.
And if the selected elements do not exist on the display page, taking the elements on the display page as target elements.
It will be appreciated that if the user does not pre-select an element, the user will directly zoom all elements on the displayed page when performing the zoom operation. Specifically, when the user operates all elements on the display page, the zoom operation may be directly performed on all elements, and when zooming to the boundary reduction state, all elements on the display page may be cleared through an element deletion operation (one type of boundary operation); when the user only operates partial elements, the elements can be selected in advance and then gesture zooming is carried out. The method for determining the target elements on the display page in the embodiment can improve the practicability, operability and operation efficiency.
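A minimal sketch of this selection rule (hypothetical names):

```python
def determine_targets(page_elements, selected_ids):
    """Selected elements if any exist; otherwise every element on the page."""
    if selected_ids:
        return [e for e in page_elements if e["id"] in selected_ids]
    return list(page_elements)
```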
Further, in step S30, determining the state of the target element specifically includes:
s311: and determining the state of the target element according to the area of the target element, wherein when the area of the target element is smaller than or equal to a first preset value, the state of the target element is determined to reach a boundary reduction state.
S312: and when the area of the target element is larger than or equal to a second preset value, determining that the state of the target element reaches a boundary amplification state.
Wherein the first preset value is smaller than the second preset value.
In one embodiment, the area of the target element is used to determine whether the target element has reached a boundary state. The first preset value in this embodiment may specifically be a resolution of 20 × 20, and the second preset value may be 2048 × 1536 (here, resolution denotes pixel area, and the area of the target element refers to its pixel area). It can be understood that when the target element reaches the boundary reduction state, its resolution is only 20 × 20 and it is almost illegible. Assuming the screen resolution of the touch device is 1024 × 768, when the target element reaches the boundary enlargement state its resolution is 2048 × 1536, so the target element very likely extends beyond the screen. By setting the first and second preset values, the scaling limits of the target element can be adjusted flexibly for touch devices of different resolutions, so that the display size of the target element remains reasonable on each.
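The area-based judgment of steps S311/S312 can be sketched as follows, using the example preset values from this paragraph (20 × 20 and 2048 × 1536); the function and state names are illustrative assumptions:

```python
FIRST_PRESET = 20 * 20        # pixel area at or below which the element is illegible
SECOND_PRESET = 2048 * 1536   # pixel area at or beyond which the element exceeds a 1024x768 screen

def state_by_area(area, first_preset=FIRST_PRESET, second_preset=SECOND_PRESET):
    """Classify the target element's state from its pixel area (steps S311/S312)."""
    if area <= first_preset:
        return "boundary_reduction"
    if area >= second_preset:
        return "boundary_enlargement"
    return "normal"
```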
Further, in step S30, determining the state of the target element further includes:
s321: and determining the state of the target element according to the area change ratio of the target element, wherein the area change ratio of the target element is the ratio of the current area of the target element to the original area of the target element, and when the area change ratio of the target element is smaller than or equal to a first preset ratio, determining that the state of the target element reaches a boundary reduction state.
The current area of the target element refers to the area size of the target element at the current moment. The original area of the target element refers to the area size of the target element corresponding to the time when the scaling is started.
S322: and when the area change ratio of the target element is greater than or equal to a second preset ratio, determining that the state of the target element reaches a boundary amplification state.
Wherein the first preset ratio is smaller than the second preset ratio.
In an embodiment, the boundary state may be determined by an area change ratio of the target element, where the area change ratio of the target element is a ratio obtained by dividing a current area of the target element by an original area of the target element.
The advantage of the area change ratio over the absolute area is that screen resolutions differ across touch devices, so a fixed-area boundary threshold is not necessarily suitable for every device. Specifically, if the boundary is judged after the user stops zooming, the target element may be considered illegible when its area is 10% of its original area, reaching the boundary reduction state; when its area is 400% of the original area, the target element may be considered to extend beyond the screen, reaching the boundary enlargement state. In this embodiment the area change ratio is adjustable through configuration, so the requirements of different users can be met.
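A minimal sketch of the ratio-based judgment of steps S321/S322, using the 10% and 400% example thresholds from this paragraph (both meant to be configurable); names are assumptions:

```python
def state_by_area_ratio(current_area, original_area,
                        first_ratio=0.10, second_ratio=4.00):
    """Classify the state from the area change ratio (steps S321/S322).

    The 10% and 400% defaults are the examples given in the text.
    """
    ratio = current_area / original_area
    if ratio <= first_ratio:
        return "boundary_reduction"
    if ratio >= second_ratio:
        return "boundary_enlargement"
    return "normal"
```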
Further, in step S30, determining the state of the target element further includes:
s331: and determining the state of the target element according to the average distance between the touch points of the contact of the finger and the touch screen, wherein when the average distance between the touch points of the contact of the finger and the touch screen is less than or equal to a third preset value, the state of the target element is determined to reach a boundary reduction state.
S332: and when the average distance between the touch points of the contact of the finger and the touch screen is larger than or equal to a fourth preset numerical value, determining that the state of the target element reaches a boundary amplification state.
Wherein the third predetermined value is less than the fourth predetermined value.
In an embodiment, a further implementation of determining whether the target element has reached a boundary state is provided. Here the boundary state is judged by the average distance between the touch points where the user's fingers contact the touch screen; it can be understood that during a gesture zoom, the distance between the touch points reflects the degree of zoom applied to the target element. The average distance between the touch points is obtained by connecting each pair of the fingers (touch points) controlling the zoom and dividing the sum of the lengths of these connecting lines by the number of lines. Specifically, referring to FIG. 2, which shows the distances between touch points (the black points in FIG. 2) when zoom is controlled with multiple fingers (three in this example), the average distance between the touch points in FIG. 2 is obtained as (L1 + L2 + L3) / 3.
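Under the assumption that "connecting lines between each finger" means every pair of touch points (which matches the three lines L1, L2, L3 of FIG. 2 for three fingers), the average distance can be sketched as:

```python
from itertools import combinations
from math import hypot

def average_touch_distance(points):
    """Average length of the connecting lines between every pair of touch points.

    For the three-finger case of FIG. 2 this reduces to (L1 + L2 + L3) / 3.
    `points` is a list of (x, y) screen coordinates; at least two are required.
    """
    pairs = list(combinations(points, 2))
    total = sum(hypot(ax - bx, ay - by) for (ax, ay), (bx, by) in pairs)
    return total / len(pairs)
```

For example, three fingers at (0, 0), (3, 0), and (0, 4) give pairwise distances 3, 4, and 5, so the average is 4.0.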
Further, in step S30, determining the state of the target element further includes:
s341: determining the state of the target element according to the change ratio of the average distance between the touch points contacted by the finger and the touch screen, wherein the change ratio of the average distance between the touch points contacted by the finger and the touch screen is the ratio of the average distance between the current touch points and the average distance between the original touch points, and when the change ratio of the average distance between the touch points contacted by the finger and the touch screen is smaller than or equal to a third preset ratio, determining that the state of the target element reaches a boundary reduction state.
The average distance between the current touch points refers to the average distance between the touch points of the touch points at the current moment. The average distance between the original touch points is the average distance between the touch points corresponding to the target element at the moment of starting to zoom.
S342: and when the variation ratio of the average distance between the touch points of the finger and the touch screen is greater than or equal to a fourth preset ratio, determining that the state of the target element reaches a boundary amplification state.
And the third preset ratio is smaller than the fourth preset ratio.
In one embodiment, since different devices have different screen sizes and resolutions, whether the target element has reached a boundary state can be determined by the change ratio of the average distance between the touch points where the fingers contact the touch screen. Specifically, if the boundary is judged after the user stops zooming, the target element may be considered illegible when the change ratio of the average distance is 10%, reaching the boundary reduction state; when the change ratio is 200%, the target element may be considered to extend beyond the screen, reaching the boundary enlargement state. Judging the boundary state by this change ratio is adjustable through configuration, so it can meet the requirements of different users and apply to touch devices of different screen sizes and resolutions.
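A minimal sketch of steps S341/S342, using the 10% and 200% example thresholds from this paragraph; names are assumptions. The check has the same shape as the area-ratio check, with different default thresholds, since pinch distance scales roughly with the element's linear size while area scales quadratically:

```python
def state_by_distance_ratio(current_avg, original_avg,
                            third_ratio=0.10, fourth_ratio=2.00):
    """Classify the state from the change ratio of the average touch-point
    distance (steps S341/S342); 10% and 200% are the example thresholds."""
    ratio = current_avg / original_avg
    if ratio <= third_ratio:
        return "boundary_reduction"
    if ratio >= fourth_ratio:
        return "boundary_enlargement"
    return "normal"
```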
Further, the boundary operations include element deletion operations and/or element copy operations.
Further, besides the element deletion operation and the element copy operation, the boundary operation may include other types of operations; the boundary operation can be varied according to the requirements of different scenarios so as to achieve different trigger effects.
In particular, the boundary operation may be a cut operation, which cuts the target element. The user can paste the cut target element onto the current page or a new page with a paste operation, or the cut elements can be stored uniformly in the clipboard, from which the user can retrieve them at any time. The boundary operation may also be a close-document operation: when the user zooms out a target element on the interface until the boundary reduction state is reached, the touch device closes the current document. For example, when the user is working with text documents, presentations, or spreadsheet documents, this achieves the effect of closing a document quickly. The boundary operation may also be a screen-casting operation: when the target element is zoomed to the boundary enlargement state, it is projected onto a display device connected in advance to the touch device and shown at a preset display ratio, realizing a quick screen-casting function.
Further, in step S40, when the state of the target element reaches the boundary state, triggering a boundary operation according to the boundary state specifically includes:
when the target element reaches the boundary reduction state, an element deletion operation is triggered according to the boundary reduction state; and/or, when the target element reaches the boundary enlargement state, an element copy operation is triggered according to the boundary enlargement state and the target element is copied to a page with a preset jump relation, where the page may be a newly created page.
In one embodiment, different boundary operations may be triggered depending on the particular boundary state (boundary reduction state or boundary enlargement state) reached by the target element. When the boundary reduction state is reached, an element deletion operation may be triggered to delete the target element; when the boundary enlargement state is reached, an element copy operation may be triggered, which copies the target element onto another page. That page has a preset jump relation, which determines the copy path of the target element; in particular, the page may be a new page. It can be understood that when the element deletion operation is triggered, the element may be deleted directly, or an inquiry box asking whether to execute the deletion may be displayed on the interface beforehand, the deletion being executed only after the user clicks to confirm it; similarly, when the copy operation is triggered, the element may be copied directly, or an inquiry box asking whether to execute the copy may be displayed beforehand, the copy being executed after the user confirms it. Further, in addition to popping up an inquiry box before the boundary operation is performed, selection options may be displayed on the interface. For example, when a copy operation is triggered, the interface may display two options, 'copy element to new page' and 'copy element to…'; if the user chooses 'copy element to…', the names of the currently opened documents pop up on the interface, and the user can select one or more document names to copy the target element into those documents.
Further, after an option is selected, an inquiry box asking whether to execute the operation may be displayed on the interface, improving the accuracy of the user's selection.
It will be appreciated that when an element boundary operation is triggered, several different function boxes (inquiry box, selection box, and so on) may be placed before the touch device performs the boundary operation, the operation executing once the user has completed them all. Of course, no function box need be set at all, in which case the corresponding boundary operation executes directly once triggered.
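The triggering logic with an optional function box can be sketched as follows; `confirm` stands in for the inquiry box described above, and all names are illustrative assumptions:

```python
def trigger_boundary_operation(state, element, page_elements,
                               confirm=lambda prompt: True):
    """Dispatch the boundary operation for the boundary state reached.

    `confirm` models the optional inquiry box; the default auto-confirms,
    and returning False cancels the operation. Returns the new page the
    element was copied to, if any.
    """
    if state == "boundary_reduction":
        if confirm("Delete element?"):
            page_elements.remove(element)   # element deletion operation
        return None
    if state == "boundary_enlargement":
        if confirm("Copy element to new page?"):
            return [element]                # element copied onto a new page
    return None
```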
Further, in addition to the element deletion operation and/or the element copy operation, other boundary operations may be performed, such as cutting, hiding the target element, closing the current page, or jumping to a new page. It will be appreciated that multiple (non-conflicting) boundary operations may be performed simultaneously when the target element reaches a boundary state.
Furthermore, the gesture zoom control method of the touch device can be applied in particular to a smart interactive tablet, which controls element scaling according to the user's gesture zoom operations on its touch screen. Specifically, when the smart interactive tablet uses the area of the target element as the judgment condition, it zooms the target element according to the received gesture zoom operation information and judges, during the zoom or when the zoom ends, whether the target element has reached the boundary enlargement state or the boundary reduction state. Similarly, when the judgment condition is the area change ratio of the target element, the average distance between the touch points on the tablet's touch screen, or the change ratio of that average distance, the tablet zooms the target element according to the gesture information received by the touch screen and judges, during or at the end of the zoom, whether the target element has reached the boundary enlargement state or the boundary reduction state. In an embodiment, a user can thus zoom a target element on the tablet's display page, and a boundary operation is triggered when the element reaches the boundary enlargement state or the boundary reduction state.
Specifically, when the target element reaches the boundary reduction state, an element deletion operation is triggered and the target element is deleted from the display page; when it reaches the boundary enlargement state, an element copy or cut operation is triggered and the target element is copied or cut into a new page or an open page of the smart interactive tablet. Embodiments of the invention can markedly improve the operability and operation efficiency of gesture zoom control on the smart interactive tablet.
In an embodiment, the implementation process of the invention may specifically be as follows: on the touch screen of the touch device, the user circles a target element on the display page with a circle-selection operation, then zooms the target element with a gesture zoom operation. If the touch device is preset to trigger boundary operations during zooming, then the moment the zoomed target element reaches the boundary reduction state or the boundary enlargement state, the corresponding operation is executed for that state. If the device is preset to trigger boundary operations when zooming ends, then when the user stops the gesture zoom operation, the corresponding operation is executed if the target element has reached the boundary reduction state or the boundary enlargement state.
In embodiments of the invention, the boundary state of the target element is judged during zooming, and when the target element reaches a boundary state, a boundary operation can be triggered according to that state. This solves the prior-art problems of a target element on a touch device being shrunk until it is minuscule or enlarged until it extends entirely beyond the screen. The method effectively limits the extremes of target-element scaling, triggers a boundary operation at the boundary state so the element can be processed further, and improves operability and operation efficiency in practical applications.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Based on the gesture zoom control method of the touch device provided in the embodiments, an embodiment of the present invention further provides an apparatus embodiment implementing the steps of the above method embodiments.
Fig. 3 is a schematic block diagram of a gesture zoom control apparatus of a touch device, corresponding one-to-one to the gesture zoom control method of the touch device in the embodiment. As shown in fig. 3, the apparatus includes a target element determination module 10, a zoom module 20, a boundary state determination module 30, and a boundary operation trigger module 40. The functions implemented by these modules correspond one-to-one to the steps of the gesture zoom control method described above; to avoid redundancy, they are not described again in detail in this embodiment.
A target element determination module 10, configured to determine a target element on the display page.
A zoom module 20, configured to zoom out or zoom in the target element according to the gesture zoom operation on the display page.
A boundary state determination module 30, configured to determine the state of the target element during the reduction or enlargement of the target element or when the reduction or enlargement ends.
A boundary operation trigger module 40, configured to trigger a boundary operation according to the boundary state when the state of the target element reaches the boundary state.
Optionally, the boundary state comprises a boundary reduction state and/or a boundary enlargement state.
Optionally, the target element determination module 10 comprises a first determination unit and a second determination unit.
The first determination unit is configured to take the selected element as the target element when a selected element exists on the display page.
The second determination unit is configured to take all elements on the display page as target elements when no selected element exists on the display page.
Optionally, the boundary state determination module 30 is further specifically configured to:
determining a state of the target element based on the area of the target element, wherein,
when the area of the target element is smaller than or equal to a first preset value, determining that the state of the target element reaches a boundary reduction state;
when the area of the target element is larger than or equal to a second preset value, determining that the state of the target element reaches a boundary amplification state;
wherein the first preset value is smaller than the second preset value.
Optionally, the boundary state determination module 30 is further specifically configured to:
determining a state of the target element according to an area change ratio of the target element, wherein the area change ratio of the target element is a ratio of a current area of the target element to an original area of the target element,
when the area change ratio of the target element is smaller than or equal to a first preset ratio, determining that the state of the target element reaches a boundary reduction state;
when the area change ratio of the target element is larger than or equal to a second preset ratio, determining that the state of the target element reaches a boundary amplification state;
wherein the first preset ratio is smaller than the second preset ratio.
Optionally, the boundary state determination module 30 is further specifically configured to:
determining a state of the target element according to an average distance between touch points where the finger is in contact with the touch screen, wherein,
when the average distance between the touch points of the fingers and the touch screen is smaller than or equal to a third preset numerical value, determining that the state of the target element reaches a boundary reduction state;
when the average distance between the touch points of the fingers and the touch screen is larger than or equal to a fourth preset numerical value, determining that the state of the target element reaches a boundary amplification state;
wherein the third predetermined value is less than the fourth predetermined value.
Optionally, the boundary state determination module 30 is further specifically configured to:
determining the state of the target element according to a variation ratio of an average distance between touch points of the finger in contact with the touch screen, wherein the variation ratio of the average distance between the touch points of the finger in contact with the touch screen is a ratio of an average distance between current touch points to an average distance between original touch points, wherein,
when the variation ratio of the average distance between the touch points of the contact of the finger and the touch screen is smaller than or equal to a third preset ratio, determining that the state of the target element reaches a boundary reduction state;
when the change ratio of the average distance between the touch points of the contact of the finger and the touch screen is larger than or equal to a fourth preset ratio, determining that the state of the target element reaches a boundary amplification state;
wherein the third predetermined ratio is smaller than the fourth predetermined ratio.
Optionally, the boundary operations include element deletion operations and/or element copy operations.
Optionally, the boundary operation trigger module 40 includes a first boundary operation trigger unit and a second boundary operation trigger unit.
The first boundary operation triggering unit is configured to trigger an element deletion operation according to the boundary reduction state when the target element reaches the boundary reduction state; and/or,
the second boundary operation triggering unit is configured to trigger an element copy operation according to the boundary enlargement state when the target element reaches the boundary enlargement state, copying the target element to a page with a preset jump relation, where the page may be a newly created page.
In embodiments of the invention, the boundary state of the target element is judged during zooming, and when the target element reaches a boundary state, a boundary operation can be triggered according to that state. This solves the prior-art problems of a target element on a touch device being shrunk until it is minuscule or enlarged until it extends entirely beyond the screen. The apparatus effectively limits the extremes of target-element scaling, triggers a boundary operation at the boundary state so the element can be processed further, and improves operability and operation efficiency in practical applications.
Fig. 4 shows another flowchart of the gesture zoom control method of the touch-enabled device in the embodiment. As shown in fig. 4, the gesture zoom control method of the touch-controllable device includes the following steps:
s10': and determining a target element on the display page.
And S20': and performing zooming operation on the target element according to the received gesture zooming operation information, wherein the zooming operation comprises zooming-out operation and zooming-in operation.
S30': when the target element is subjected to the zooming operation, or after the target element is subjected to the zooming operation, whether the target element reaches the boundary state is judged.
S40': and if the state of the target element reaches the boundary state, triggering boundary operation according to the boundary state.
In steps S10'-S40', the touch device first determines the target element on the display page according to a user operation. Specifically, the touch device may determine the target element through the user's selection operation on the touch screen, or through voice control, eye tracking, and the like. The user then controls the zoom of the target element by touching the device and applying a gesture zoom operation; the touch device obtains, in real time through the touch screen, information such as the coordinate changes, press duration, and press force of the user's fingers. This may be implemented with preset method functions, for example a custom gesture listener function onGestureListener() for acquiring gesture information. After obtaining the gesture zoom information, the touch device may use pre-created or open-source method functions, for example the relevant methods of Canvas, BitmapFactory, and ThumbnailUtils in the Android API, to zoom the target element. The touch device can likewise obtain real-time information about the scaling of the target element, such as its size. While the zoom operation is performed, or after it completes, this real-time scaling information is compared with the judgment conditions of the boundary states to decide whether the target element has reached a boundary state.
Finally, if the real-time scaling information of the target element satisfies the judgment condition of a boundary state, the state of the target element is considered to have reached that boundary state and a boundary operation is triggered accordingly. The invention effectively limits the extremes of element scaling, triggers a boundary operation at the boundary state so that the element can be processed further, and improves operability and operation efficiency in practical applications.
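A sketch of the two trigger timings described above (judging during zooming versus only when zooming ends), here using the area change ratio with the example 10%/400% thresholds; names and the sampling representation are assumptions:

```python
def first_boundary_state(scale_samples, first_ratio=0.10, second_ratio=4.00,
                         check_during_zoom=True):
    """Walk the area-scale samples of a gesture zoom and return the first
    boundary state reached, or None.

    Each sample is current area / original area. If check_during_zoom is
    False, only the final sample is judged, i.e. the boundary trigger fires
    only when the zoom operation ends.
    """
    samples = scale_samples if check_during_zoom else scale_samples[-1:]
    for scale in samples:
        if scale <= first_ratio:
            return "boundary_reduction"
        if scale >= second_ratio:
            return "boundary_enlargement"
    return None
```

Note the behavioral difference: a gesture that briefly passes through a boundary state but ends at a normal size triggers the operation only in the during-zoom mode.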
Fig. 5 is a schematic block diagram of another gesture zoom control apparatus of a touch device, corresponding one-to-one to the gesture zoom control method of the touch device in the embodiment. As shown in fig. 5, the apparatus includes a determination module 10', an execution module 20', a boundary state determination module 30', and a boundary operation triggering module 40'. The functions implemented by these modules correspond one-to-one to the steps of the gesture zoom control method corresponding to fig. 4; to avoid redundancy, they are not described again in detail in this embodiment.
A determination module 10', configured to determine a target element on the display page.
An execution module 20', configured to perform a zoom operation on the target element according to the received gesture zoom operation information, the zoom operation including a zoom-out operation and a zoom-in operation.
A boundary state determination module 30', configured to judge, while the zoom operation is performed on the target element or after it is completed, whether the target element has reached a boundary state.
A triggering module 40', configured to trigger a boundary operation according to the boundary state if the state of the target element reaches a boundary state.
The present embodiment provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the gesture zoom control method of the touch device in the embodiments; to avoid repetition, details are not repeated here. Alternatively, when executed by the processor, the computer program implements the functions of each module/unit in the gesture zoom control apparatus corresponding to fig. 1 in the embodiments, or the functions of each module/unit in the gesture zoom control apparatus corresponding to fig. 4; again, to avoid repetition, details are not repeated here.
FIG. 6 is a schematic diagram of a computer device provided by an embodiment of the invention. As shown in fig. 6, the computer device 50 of this embodiment includes a processor 51, a memory 52, and a computer program 53 stored in the memory 52 and runnable on the processor 51. When executed by the processor 51, the computer program 53 implements the gesture zoom control method of the touch device in the embodiments, or the functions of each module/unit in the gesture zoom control apparatus in the embodiments; to avoid repetition, details are not repeated here.
The computer device 50 may be a smart interactive tablet, a desktop computer, a notebook, a palmtop computer, a cloud server, or the like. The computer device 50 may include, but is not limited to, the processor 51 and the memory 52. Those skilled in the art will appreciate that fig. 6 is merely an example of the computer device 50 and does not limit it; the device may include more or fewer components than shown, combine certain components, or use different components. For example, the computer device may also include input/output devices, network access devices, buses, and so on.
The processor 51 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 52 may be an internal storage unit of the computer device 50, such as a hard disk or memory of the computer device 50. The memory 52 may also be an external storage device of the computer device 50, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card fitted to the computer device 50. Further, the memory 52 may include both an internal storage unit and an external storage device of the computer device 50. The memory 52 is used to store the computer program and other programs and data required by the computer device, and may also be used to temporarily store data that has been output or is to be output.
It should be clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division into functional units and modules is only used for illustration. In practical applications, the above functions may be allocated to different functional units and modules as needed; that is, the internal structure of the device may be divided into different functional units or modules to perform all or part of the functions described above.
The above embodiments are only intended to illustrate the technical solution of the present invention, not to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present invention and are intended to be included within the scope of the present invention.
Claims (11)
1. A gesture zoom control method of a touch-controllable device, the method comprising:
determining a target element on a display page;
zooming out or zooming in the target element according to the gesture zooming operation on the display page;
determining the state of the target element in the process of reducing or enlarging the target element or at the end of reducing or enlarging the target element;
when the state of the target element reaches a boundary state, triggering boundary operation according to the boundary state;
wherein the boundary operation comprises an element copying operation, and the triggering of the boundary operation according to the boundary state when the state of the target element reaches a boundary state comprises:
when the target element reaches a boundary amplification state, triggering the element copying operation according to the boundary amplification state, and copying the target element to a page with a preset jump relation, wherein the page comprises a newly-built page.
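Claim 1's flow (determine a target element, scale it with the gesture, and, on reaching the boundary magnification state, copy it to a page with a preset jump relation) can be sketched as follows. This is a minimal illustration, not the patented implementation: the class, function names, and the boundary threshold are all hypothetical.

```python
# Hypothetical "second preset value" for the boundary magnification
# state (element area in square pixels); the patent leaves it unset.
BOUNDARY_AREA = 500_000


class Element:
    """A zoomable element on the display page (illustrative)."""

    def __init__(self, width: float, height: float):
        self.width = width
        self.height = height

    @property
    def area(self) -> float:
        return self.width * self.height


def apply_gesture_zoom(element: Element, scale: float, pages: list) -> str:
    """Scale the target element, then check its state.

    If the element reaches the boundary magnification state, trigger the
    element copy operation: copy it onto a newly built page appended to
    `pages` (standing in for the "page with a preset jump relation").
    """
    element.width *= scale
    element.height *= scale
    if element.area >= BOUNDARY_AREA:
        new_page = [Element(element.width, element.height)]  # copy, not move
        pages.append(new_page)
        return "copied"
    return "zoomed"
```

Checking the state both during and at the end of the gesture, as the claim allows, would simply mean calling the same check on every touch-move event as well as on touch-up.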
2. The method of claim 1, wherein the determining a target element on a display page comprises:
if a selected element exists on the display page, taking the selected element as the target element;
if no selected element exists on the display page, taking the elements on the display page as the target elements.
3. The method of claim 1, wherein determining the state of the target element comprises:
determining the state of the target element according to an area of the target element, wherein,
when the area of the target element is greater than or equal to a second preset value, determining that the state of the target element reaches the boundary amplification state.
4. The method of claim 1, wherein determining the state of the target element comprises:
determining the state of the target element according to an area change ratio of the target element, wherein the area change ratio of the target element is a ratio of a current area of the target element to an original area of the target element, and
when the area change ratio of the target element is greater than or equal to a second preset ratio, determining that the state of the target element reaches the boundary amplification state.
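Claim 4's ratio-based check judges the state from how much the element has grown relative to its original size, rather than from an absolute area as in claim 3. A minimal sketch, with a hypothetical value for the "second preset ratio":

```python
BOUNDARY_RATIO = 4.0  # hypothetical "second preset ratio"


def reaches_boundary_magnification(original_area: float,
                                   current_area: float,
                                   boundary_ratio: float = BOUNDARY_RATIO) -> bool:
    """True when the area change ratio (current / original) meets the
    boundary ratio, i.e. the boundary magnification state is reached."""
    return current_area / original_area >= boundary_ratio
```

A ratio-based threshold has the practical advantage that small and large elements trigger the boundary operation after the same relative amount of pinch-out, which may be why the patent offers it as an alternative to the absolute-area test.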
5. The method of claim 1, wherein the determining the state of the target element comprises:
determining the state of the target element according to an average distance between touch points at which a finger contacts the touch screen, wherein,
when the average distance between the touch points is greater than or equal to a fourth preset value, determining that the state of the target element reaches the boundary amplification state.
6. The method of claim 1, wherein the determining the state of the target element comprises:
determining the state of the target element according to a change ratio of the average distance between touch points at which a finger contacts the touch screen, wherein the change ratio is a ratio of the current average distance between the touch points to the original average distance between the touch points, and
when the change ratio of the average distance between the touch points is greater than or equal to a fourth preset ratio, determining that the state of the target element reaches the boundary amplification state.
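Claims 5 and 6 judge the state from the touch points themselves instead of the element's area: claim 5 from the absolute average distance between the fingers, claim 6 from how that average has changed since the gesture began. A sketch under assumed names and thresholds (the patent does not fix the "fourth preset value" or "fourth preset ratio"):

```python
import itertools
import math


def average_touch_distance(points: list) -> float:
    """Mean Euclidean distance over all pairs of (x, y) touch points."""
    pairs = list(itertools.combinations(points, 2))
    return sum(math.dist(a, b) for a, b in pairs) / len(pairs)


def reaches_boundary_by_distance(points: list, threshold: float) -> bool:
    # Claim 5: absolute average distance vs. the "fourth preset value".
    return average_touch_distance(points) >= threshold


def reaches_boundary_by_distance_ratio(start_points: list,
                                       points: list,
                                       ratio: float) -> bool:
    # Claim 6: current average distance over the original average
    # distance vs. the "fourth preset ratio".
    return (average_touch_distance(points)
            / average_touch_distance(start_points)) >= ratio
```

Averaging over all pairs makes the test well defined for gestures with more than two fingers, which is consistent with the claims speaking of "touch points" generally rather than exactly two.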
7. A gesture zoom control method of a touch-controllable device, the method comprising:
determining a target element on a display page;
performing zooming operation on the target element according to the received gesture zooming operation information, wherein the zooming operation comprises zooming-out operation and zooming-in operation;
when the target element is subjected to the zooming operation or after the target element is subjected to the zooming operation, judging whether the target element reaches a boundary state or not;
if the state of the target element reaches a boundary state, triggering boundary operation according to the boundary state;
the boundary operation includes an element copy operation, and if the state of the target element reaches a boundary state, triggering the boundary operation according to the boundary state includes:
and when the target element reaches a boundary amplification state, triggering element copying operation according to the boundary amplification state, and copying the target element to a page with a preset jump relation, wherein the page comprises a newly-built page.
8. A gesture zoom control apparatus of a touch-enabled device, the apparatus comprising:
the target element determining module is used for determining a target element on the display page;
the zooming module is used for zooming the target element according to the received gesture zooming operation information;
a boundary state determining module, configured to determine whether the target element reaches a boundary state during zooming of the target element or when zooming of the target element is finished, where the boundary state includes a boundary magnification state;
the boundary operation triggering module is used for triggering boundary operation according to the boundary state when the target element reaches the boundary state;
the boundary operation comprises element copying operation, and the second boundary operation triggering unit is used for triggering the element copying operation according to the boundary amplification state when the target element reaches the boundary amplification state, and copying the target element to a page with a preset jump relation, wherein the page comprises a newly-built page.
9. A gesture zoom control apparatus of a touch-controllable device, the apparatus comprising:
the determining module is used for determining a target element on the display page;
the execution module is used for executing zooming operation on the target element according to the received gesture zooming operation information, wherein the zooming operation comprises zooming-out operation and zooming-in operation;
a boundary state determination module, configured to determine whether the target element reaches a boundary state when the target element is subjected to the scaling operation or after the target element is subjected to the scaling operation;
the triggering module is used for triggering boundary operation according to the boundary state if the state of the target element reaches the boundary state;
the boundary operation comprises element copying operation, and the second boundary operation triggering unit is used for triggering the element copying operation according to the boundary amplification state when the target element reaches the boundary amplification state, and copying the target element to a page with a preset jump relation, wherein the page comprises a newly-built page.
10. A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the gesture zoom control method of the touch-enabled device according to any one of claims 1 to 6 when executing the computer program, or the processor implements the steps of the gesture zoom control method of the touch-enabled device according to claim 7 when executing the computer program.
11. A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, carries out the steps of the gesture zoom control method of the touch-enabled device according to any one of claims 1 to 6, or which, when being executed by a processor, carries out the steps of the gesture zoom control method of the touch-enabled device according to claim 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910555666.6A CN110286840B (en) | 2019-06-25 | 2019-06-25 | Gesture zooming control method and device of touch equipment and related equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110286840A CN110286840A (en) | 2019-09-27 |
CN110286840B true CN110286840B (en) | 2022-11-11 |
Family
ID=68005567
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910555666.6A Active CN110286840B (en) | 2019-06-25 | 2019-06-25 | Gesture zooming control method and device of touch equipment and related equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110286840B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110908585B (en) * | 2019-11-29 | 2021-10-29 | 稿定(厦门)科技有限公司 | Picture processing method and device |
CN113849104A (en) * | 2020-06-28 | 2021-12-28 | 华为技术有限公司 | Graphic control method, apparatus, terminal device and readable storage medium |
CN114430492B (en) * | 2020-10-29 | 2023-10-13 | 海信视像科技股份有限公司 | Display device, mobile terminal and picture synchronous scaling method |
CN112527165A (en) * | 2020-12-16 | 2021-03-19 | Oppo广东移动通信有限公司 | Method, device and equipment for adjusting interface display state and storage medium |
CN114691006B (en) * | 2020-12-31 | 2024-01-23 | 博泰车联网科技(上海)股份有限公司 | Information processing method and related device based on screen projection |
CN114265526B (en) * | 2021-12-10 | 2023-07-28 | 北京金堤科技有限公司 | Map scaling method and device, storage medium and electronic equipment |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103067781A (en) * | 2012-12-20 | 2013-04-24 | 中国科学院软件研究所 | Multi-scale video expressing and browsing method |
CN103777876A (en) * | 2012-10-19 | 2014-05-07 | 阿里巴巴集团控股有限公司 | Page processing method, page processing device and electronic equipment of touch screen display |
CN105190514A (en) * | 2013-03-11 | 2015-12-23 | 三星电子株式会社 | Apparatus and method for deleting an item on a touch screen display |
CN105549824A (en) * | 2015-12-26 | 2016-05-04 | 魅族科技(中国)有限公司 | Display control method and mobile terminal |
CN106055331A (en) * | 2016-05-31 | 2016-10-26 | 福建天晴数码有限公司 | Model boundary generation method and device |
CN106095466A (en) * | 2016-06-24 | 2016-11-09 | 北京市育学林教育技术有限公司 | Electronic teaching material clicks on amplification method and system thereof |
CN106970753A (en) * | 2017-03-15 | 2017-07-21 | 福建中金在线信息科技有限公司 | A kind of picture amplifying method and device |
CN107977084A (en) * | 2012-05-09 | 2018-05-01 | 苹果公司 | Method and apparatus for providing touch feedback for the operation performed in the user interface |
CN108228050A (en) * | 2017-12-13 | 2018-06-29 | 阿里巴巴集团控股有限公司 | A kind of picture Zoom method, device and electronic equipment |
CN109478413A (en) * | 2016-05-27 | 2019-03-15 | Imint影像智能有限公司 | System and method for zoom function |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102566908A (en) * | 2011-12-13 | 2012-07-11 | 鸿富锦精密工业(深圳)有限公司 | Electronic equipment and page zooming method for same |
CN102708540A (en) * | 2012-04-21 | 2012-10-03 | 上海量明科技发展有限公司 | Method and client side for zooming screen capturing areas |
CN106445332A (en) * | 2016-09-05 | 2017-02-22 | 深圳Tcl新技术有限公司 | Icon display method and system |
CN109785256B (en) * | 2019-01-04 | 2024-05-14 | 平安科技(深圳)有限公司 | Image processing method, terminal equipment and computer readable medium |
2019-06-25: Application CN201910555666.6A filed in China; granted as patent CN110286840B (legal status: Active)
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107977084A (en) * | 2012-05-09 | 2018-05-01 | 苹果公司 | Method and apparatus for providing touch feedback for the operation performed in the user interface |
CN103777876A (en) * | 2012-10-19 | 2014-05-07 | 阿里巴巴集团控股有限公司 | Page processing method, page processing device and electronic equipment of touch screen display |
CN103067781A (en) * | 2012-12-20 | 2013-04-24 | 中国科学院软件研究所 | Multi-scale video expressing and browsing method |
CN105190514A (en) * | 2013-03-11 | 2015-12-23 | 三星电子株式会社 | Apparatus and method for deleting an item on a touch screen display |
CN105549824A (en) * | 2015-12-26 | 2016-05-04 | 魅族科技(中国)有限公司 | Display control method and mobile terminal |
CN109478413A (en) * | 2016-05-27 | 2019-03-15 | Imint影像智能有限公司 | System and method for zoom function |
CN106055331A (en) * | 2016-05-31 | 2016-10-26 | 福建天晴数码有限公司 | Model boundary generation method and device |
CN106095466A (en) * | 2016-06-24 | 2016-11-09 | 北京市育学林教育技术有限公司 | Electronic teaching material clicks on amplification method and system thereof |
CN106970753A (en) * | 2017-03-15 | 2017-07-21 | 福建中金在线信息科技有限公司 | A kind of picture amplifying method and device |
CN108228050A (en) * | 2017-12-13 | 2018-06-29 | 阿里巴巴集团控股有限公司 | A kind of picture Zoom method, device and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
CN110286840A (en) | 2019-09-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110286840B (en) | Gesture zooming control method and device of touch equipment and related equipment | |
US9685143B2 (en) | Display control device, display control method, and computer-readable storage medium for changing a representation of content displayed on a display screen | |
US20150277571A1 (en) | User interface to capture a partial screen display responsive to a user gesture | |
DE112013002409T5 (en) | Apparatus, method and graphical user interface for displaying additional information in response to a user contact | |
CN105068727A (en) | Realization method and device for drawing tool | |
CN106897321B (en) | Method and device for displaying map data | |
WO2017088228A1 (en) | Picture zooming method and apparatus | |
CN114518820A (en) | Icon sorting method and device and electronic equipment | |
CN116107531A (en) | Interface display method and device | |
CN110286827B (en) | Element scaling control method, device, equipment and storage medium | |
CN114415886A (en) | Application icon management method and electronic equipment | |
WO2024046203A1 (en) | Content display method and apparatus | |
CN111796746B (en) | Volume adjusting method, volume adjusting device and electronic equipment | |
CN112765500A (en) | Information searching method and device | |
CN104777978A (en) | Terminal | |
CN109085974B (en) | Screen control method, system and terminal equipment | |
WO2023138509A1 (en) | Image processing method and apparatus | |
CN116088744A (en) | Application control method and device thereof | |
CN114115639A (en) | Interface control method and device, electronic equipment and storage medium | |
WO2020253058A1 (en) | Picture floating display method and apparatus, terminal and storage medium | |
CN112286430A (en) | Image processing method, apparatus, device and medium | |
CN110568989A (en) | service processing method, service processing device, terminal and medium | |
WO2019127770A1 (en) | Display method, device, equipment and storage medium for organization window | |
US11573689B1 (en) | Trace layer for replicating a source region of a digital image | |
JPH07219700A (en) | Information processor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||