CN117111823A - Scaling method and related device - Google Patents
- Publication number
- CN117111823A (application number CN202310863199.XA)
- Authority
- CN
- China
- Prior art keywords: desktop, scaling, double finger, speed
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G04—HOROLOGY
- G04G—ELECTRONIC TIME-PIECES
- G04G21/00—Input or output devices integrated in time-pieces
- G04G21/08—Touch switches specially adapted for time-pieces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Abstract
The embodiments of the present application provide a scaling method and a related device, applied in the field of terminal technology. The method is applied to an electronic device and includes the following steps: displaying a desktop, the desktop including icons of at least one application; receiving, on the desktop, a two-finger operation for enlarging the desktop; while the enlarging operation continues to be received, enlarging the desktop at different enlargement speeds in different stages; when the enlarging operation ends and switches to an operation for reducing the desktop, reducing the desktop at different reduction speeds in different stages while that operation continues to be received. In this way, the desktop can be zoomed with two fingers, so that after zooming the user can accurately trigger the icon the user intends to trigger.
Description
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a scaling method and a related device.
Background
The touch screen of a wearable device is usually small. Taking a smart watch as an example, the touch screen is typically the dial of the watch, which has a small area. Multiple application icons may be displayed on the touch screen of the smart watch, and the user may tap an application icon to launch the corresponding application.
However, because the touch screen of a wearable device is small, a mis-trigger may occur when the user taps an application icon on the touch screen.
Disclosure of Invention
The embodiments of the present application provide a scaling method and a related device, applied in the field of terminal technology. The scaling method is applied to an electronic device, such as a wearable device, so that the device can support zooming the desktop with two fingers, without requiring dedicated hardware (such as a rotatable crown). After the desktop is zoomed with two fingers, the user can accurately trigger the icon the user intends to trigger, which improves the user experience.
In a first aspect, an embodiment of the present application provides a scaling method, applied to an electronic device, where the method includes:
A desktop is displayed, the desktop including icons of at least one application. A two-finger operation for enlarging the desktop is received on the desktop. While the operation for enlarging the desktop continues to be received, the method includes the following stages: a first stage, in which the desktop is enlarged at a first enlargement speed; a second stage, later than the first stage, in which the desktop is enlarged at a second enlargement speed smaller than the first enlargement speed; and a third stage, in which enlarging of the desktop stops. The operation for enlarging the desktop then ends and switches to an operation for reducing the desktop. While the operation for reducing the desktop continues to be received, the method includes the following stages: a fourth stage, in which the desktop is reduced at a first reduction speed; a fifth stage, later than the fourth stage, in which the desktop is reduced at a second reduction speed smaller than the first reduction speed; and a sixth stage, in which reducing of the desktop stops.
In this way, as the desktop is scaled, the application icons on it scale with it, so the user can accurately trigger the icon of the intended application. Moreover, while the two fingers touch the desktop, the zoom speed decreases as the scale approaches the zoom limit; perceptually, the damping of the zoom increases near the limit, which improves the user experience on the electronic device.
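As a minimal sketch of the staged behavior described above — a smaller enlargement speed as the scale ratio nears the zoom limit, then a full stop — consider the following; the threshold and speed values are illustrative assumptions, not values from the patent:

```python
# Hypothetical staged zoom-speed selection: as the current scale ratio
# approaches the zoom limit, a smaller speed is used, and at the limit
# enlarging stops entirely (the "third stage").
def enlargement_speed(scale: float) -> float:
    """Return the enlargement speed for the current scale ratio."""
    if scale >= 2.0:    # third stage: zoom limit reached, stop enlarging
        return 0.0
    if scale >= 1.6:    # second stage: near the limit, slower speed
        return 0.5
    return 1.0          # first stage: normal speed
```

A reduction-side counterpart (the fourth through sixth stages) would mirror this, with reduction speeds that shrink as the scale approaches the minimum.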
In one possible implementation, before enlarging of the desktop stops, the method further includes: a seventh stage, in which the desktop is enlarged at a third enlargement speed. The seventh stage is earlier than the third stage, and the third enlargement speed is smaller than the second enlargement speed.
This reduces the gradient of the change in enlargement speed, lowering the probability that an abrupt speed change degrades the user experience.
In one possible implementation, before reducing of the desktop stops, the method further includes: an eighth stage, in which the desktop is reduced at a third reduction speed. The eighth stage is earlier than the sixth stage, and the third reduction speed is smaller than the second reduction speed.
This reduces the gradient of the change in reduction speed, lowering the probability that an abrupt speed change degrades the user experience.
In one possible implementation, the first enlargement speed is obtained, according to the scaling ratio at the (n+1)-th moment, from a preset correspondence between scaling ratios and zoom speeds. The (n+1)-th moment is any moment in the first stage, n is an integer, and the scaling ratio at the (n+1)-th moment is related to: the distance between the two fingers at the (n+1)-th moment, the distance between the two fingers at the n-th moment, the zoom speed at the n-th moment, and the distance between the two fingers when they were first pressed in the operation for enlarging the desktop.
In this way, the zoom speed is adjusted according to the scaling ratio, giving the user a better desktop-zooming experience.
In one possible implementation, the scaling ratio at the (n+1)-th moment is positively correlated with: the distance between the two fingers at the (n+1)-th moment, the scaling ratio at the n-th moment, and the zoom speed at the n-th moment. The scaling ratio at the (n+1)-th moment is negatively correlated with: the distance between the two fingers at the n-th moment, and the distance between the two fingers when they were first pressed in the operation for enlarging the desktop.
In this way, the zoom speed is adjusted according to the scaling ratio, giving the user a better desktop-zooming experience.
In one possible implementation, the scaling ratio at the (n+1)-th moment satisfies the formula:

R_{n+1} = R_n + (D_{n+1} − D_n) × v_n ÷ D_{F,0}

where R_{n+1} is the scaling ratio at the (n+1)-th moment, R_n is the scaling ratio at the n-th moment, D_{n+1} is the distance between the two fingers at the (n+1)-th moment, D_n is the distance between the two fingers at the n-th moment, v_n is the zoom speed at the n-th moment, and D_{F,0} is the distance between the two fingers when they were first pressed in the operation for enlarging the desktop.
In this way, the zoom speed is adjusted according to the scaling ratio, so that different stages correspond to different zoom speeds and the desktop is zoomed at different speeds in different stages, giving the user a better desktop-zooming experience.
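The update formula above can be transcribed directly; the function and argument names below are illustrative, not from the patent text:

```python
# Sketch of the scale-ratio update
# R_{n+1} = R_n + (D_{n+1} - D_n) * v_n / D_{F,0}.
def next_scale(r_n: float, d_next: float, d_n: float,
               v_n: float, d_f0: float) -> float:
    """r_n: current scaling ratio; d_next/d_n: finger distances at the
    (n+1)-th and n-th moments; v_n: zoom speed at the n-th moment;
    d_f0: finger distance at the first press of the enlarging operation."""
    return r_n + (d_next - d_n) * v_n / d_f0

# Fingers spread from 100 px to 110 px at speed 1.0 with an initial
# press distance of 100 px: the ratio grows by 10/100 = 0.1.
r = next_scale(1.0, 110.0, 100.0, 1.0, 100.0)
```

Note how the formula matches the stated correlations: the ratio grows with D_{n+1}, R_n, and v_n, and shrinks as D_n or D_{F,0} grows.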
In one possible implementation, in the first stage, the desktop is scaled at the scaling ratio of the (n+1)-th moment, with the coordinates of the midpoint between the two fingers at the (n+1)-th moment as the scaling center.
In this way, after zooming, the icon of the application the user wants is displayed at a position convenient to operate, which reduces the probability that the icon is invisible or only partially displayed after zooming.
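As a hedged sketch of scaling about the two-finger center, the midpoint of the two touch points can serve as the fixed point of the zoom; the helper names are assumptions:

```python
# Hypothetical helpers: compute the scaling center as the midpoint of
# the two touch points, then map a desktop coordinate through a zoom
# about that center. The center itself stays fixed under the mapping.
def pinch_center(p1: tuple, p2: tuple) -> tuple:
    return ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)

def scale_about(point: tuple, center: tuple, ratio: float) -> tuple:
    """Map `point` through a zoom of factor `ratio` about `center`."""
    return (center[0] + (point[0] - center[0]) * ratio,
            center[1] + (point[1] - center[1]) * ratio)
```

Because the center is fixed, the content under the user's fingers stays put while the rest of the desktop expands or contracts around it.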
In one possible implementation, while the operation for enlarging the desktop continues to be received, the method further includes: acquiring a two-finger event. When the target identifier is judged to be a first value, an in-progress two-finger zoom event is obtained; this event characterizes that the two-finger zoom has started and has not yet ended. The scaling ratio is calculated, and the target identifier is kept at the first value.
Thus, timely and accurate zooming of the desktop can be realized.
In one possible implementation, when the two-finger operation for enlarging the desktop is received, the method further includes: acquiring a two-finger event. When the target identifier is judged to be a second value, a two-finger zoom start event is obtained. The two-finger zoom start event is indicated to the desktop application, and the target identifier is modified to the first value.
Thus, timely and accurate zooming of the desktop can be realized.
In one possible implementation, obtaining the two-finger zoom start event includes: obtaining the view corresponding to each of the two fingers. When the two fingers correspond to the same view, the two-finger zoom start event is obtained.
In this way, the two-finger operation can be determined to be a zoom operation, and the user's operation intention can be identified accurately.
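The target-identifier logic in the preceding implementations can be sketched as a small state machine; the value names and the class are illustrative assumptions, not API from the patent:

```python
# Hypothetical sketch of the target-identifier state machine: the flag
# distinguishes "zoom not started" (the second value) from "zoom in
# progress" (the first value), as described above.
IDLE, ZOOMING = "second", "first"

class PinchTracker:
    def __init__(self):
        self.target_id = IDLE

    def on_two_finger_event(self, view_a, view_b):
        """Classify a two-finger event based on the current flag."""
        if self.target_id == IDLE:
            if view_a is view_b:          # both fingers on the same view
                self.target_id = ZOOMING  # zoom start event; flag -> first value
                return "zoom_start"
            return None                   # not a zoom gesture
        return "zoom_in_progress"         # started and not yet ended
```

A zoom-end event (fingers lifted) would reset the flag back to the second value; the patent text above does not detail that step, so it is omitted here.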
In a second aspect, an embodiment of the present application provides an electronic device, including: a processor and a memory.
The memory stores computer-executable instructions.
The processor executes computer-executable instructions stored in the memory to cause the electronic device to perform the method as in the first aspect.
In a third aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program. The computer program, when executed by a processor, implements a method as in the first aspect.
In a fourth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when run, causes a computer to perform the method as in the first aspect.
In a fifth aspect, an embodiment of the application provides a chip comprising a processor for invoking a computer program in memory to perform a method as in the first aspect.
In a sixth aspect, an embodiment of the present application provides an apparatus, where the apparatus may be an electronic device, or may be a chip or a chip system in an electronic device. The apparatus may include a display unit and a processing unit. When the apparatus is an electronic device, the display unit may be a display screen or a touch screen. The display unit is configured to perform the step of displaying the desktop, so that the electronic device implements a zoom method described in the first aspect or any one of the possible implementations of the first aspect. When the apparatus is an electronic device, the processing unit may be a processor. The apparatus may further comprise a storage unit, which may be a memory. The storage unit is configured to store instructions, and the processing unit executes the instructions stored in the storage unit, so that the electronic device implements the method described in the first aspect or any one of the possible implementation manners of the first aspect. When the apparatus is a chip or a system of chips within an electronic device, the processing unit may be a processor. The processing unit executes instructions stored by the storage unit to cause the electronic device to implement the method described in the first aspect or any one of the possible implementations of the first aspect. The memory unit may be a memory unit (e.g., a register, a cache, etc.) within the chip, or a memory unit (e.g., a read-only memory, a random access memory, etc.) within the electronic device that is external to the chip.
Illustratively, the display unit is configured to display a desktop, where the desktop includes icons of at least one application. The processing unit is configured to receive, on the desktop, a two-finger operation for enlarging the desktop. While the operation for enlarging the desktop continues to be received, the method includes the following stages: in a first stage, the processing unit enlarges the desktop at a first enlargement speed; in a second stage, later than the first stage, the processing unit enlarges the desktop at a second enlargement speed smaller than the first enlargement speed; in a third stage, the processing unit stops enlarging the desktop. The operation for enlarging the desktop then ends and switches to an operation for reducing the desktop. While the operation for reducing the desktop continues to be received: in a fourth stage, the processing unit reduces the desktop at a first reduction speed; in a fifth stage, later than the fourth stage, the processing unit reduces the desktop at a second reduction speed smaller than the first reduction speed; in a sixth stage, the processing unit stops reducing the desktop.
In a possible implementation, the processing unit is further configured to zoom in on the desktop at a third zoom-in speed in a seventh stage before stopping zooming in on the desktop. The seventh stage is earlier than the third stage, and the third amplification speed is smaller than the second amplification speed.
In a possible implementation, before stopping zooming out the desktop, the processing unit is further configured to zoom out the desktop at a third zooming-out speed in an eighth stage. The eighth stage is earlier than the sixth stage, and the third reduction speed is smaller than the second reduction speed.
In one possible implementation, the processing unit is further configured to obtain the first enlargement speed corresponding to the scaling ratio at the (n+1)-th moment from a preset correspondence between scaling ratios and zoom speeds. The (n+1)-th moment is any moment in the first stage, n is an integer, and the scaling ratio at the (n+1)-th moment is related to: the distance between the two fingers at the (n+1)-th moment, the distance between the two fingers at the n-th moment, the zoom speed at the n-th moment, and the distance between the two fingers when they were first pressed in the operation for enlarging the desktop.
In one possible implementation, the scaling ratio at the (n+1)-th moment is positively correlated with: the distance between the two fingers at the (n+1)-th moment, the scaling ratio at the n-th moment, and the zoom speed at the n-th moment. The scaling ratio at the (n+1)-th moment is negatively correlated with: the distance between the two fingers at the n-th moment, and the distance between the two fingers when they were first pressed in the operation for enlarging the desktop.
In one possible implementation, the scaling ratio at the (n+1)-th moment satisfies the formula:

R_{n+1} = R_n + (D_{n+1} − D_n) × v_n ÷ D_{F,0}

where R_{n+1} is the scaling ratio at the (n+1)-th moment, R_n is the scaling ratio at the n-th moment, D_{n+1} is the distance between the two fingers at the (n+1)-th moment, D_n is the distance between the two fingers at the n-th moment, v_n is the zoom speed at the n-th moment, and D_{F,0} is the distance between the two fingers when they were first pressed in the operation for enlarging the desktop.
In one possible implementation, in the first stage, the processing unit is further configured to scale the desktop at the scaling ratio of the (n+1)-th moment, with the coordinates of the midpoint between the two fingers at the (n+1)-th moment as the scaling center.
In one possible implementation, the processing unit is further configured to acquire a two-finger event while the operation for enlarging the desktop continues to be received. The processing unit is further configured to obtain an in-progress two-finger zoom event when the target identifier is judged to be the first value; this event characterizes that the two-finger zoom has started and has not yet ended. The processing unit is further configured to calculate the scaling ratio and keep the target identifier at the first value.
In one possible implementation, the processing unit is further configured to acquire a two-finger event when the two-finger operation for enlarging the desktop is received. The processing unit is further configured to obtain a two-finger zoom start event when the target identifier is judged to be the second value, to indicate the two-finger zoom start event to the desktop application, and to modify the target identifier to the first value.
In one possible implementation, the processing unit is further configured to obtain the view corresponding to each of the two fingers, and to obtain the two-finger zoom start event when the two fingers correspond to the same view.
It should be understood that the second to sixth aspects of the present application correspond to the technical solution of the first aspect; the benefits obtained by each aspect and its corresponding possible embodiments are similar and are not repeated here.
Drawings
FIG. 1 is a smart watch in a possible implementation;
fig. 2 is a schematic software structure of an electronic device according to an embodiment of the present application;
FIG. 3 is a flowchart of a scaling method according to an embodiment of the present application;
FIG. 4 is a flowchart of another scaling method according to an embodiment of the present application;
FIG. 5 is an enlarged schematic view of a desktop according to an embodiment of the present application;
fig. 6 is a schematic diagram of desktop reduction according to an embodiment of the present application.
Detailed Description
To describe the technical solutions of the embodiments of the present application clearly, the following briefly explains some terms and techniques involved in the embodiments:
1. partial terminology
In the embodiments of the present application, the words "first," "second," and the like are used to distinguish between identical or similar items with substantially the same function and effect. For example, a first chip and a second chip are merely different chips, with no limitation on their order. Those skilled in the art will appreciate that the words "first," "second," and the like do not limit the number or execution order, and that objects described as "first" and "second" are not necessarily different.
It should be noted that, in the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a alone, a and B together, and B alone, wherein a, B may be singular or plural. The character "/" generally indicates that the context-dependent object is an "or" relationship. "at least one of" or the like means any combination of these items, including any combination of single item(s) or plural items(s). For example, at least one (one) of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or plural.
The "at … …" in the embodiment of the present application may be an instant when a certain situation occurs, or may be a period of time after a certain situation occurs, which is not particularly limited. In addition, the display interface provided by the embodiment of the application is only used as an example, and the display interface can also comprise more or less contents.
2. Electronic equipment
The electronic device of the embodiments of the present application may include a handheld device, a vehicle-mounted device, a wearable electronic device, or the like with an image processing function. For example, some electronic devices are: smart watches, mobile phones, folding phones, tablet computers, palmtop computers, notebook computers, mobile internet devices (mobile internet device, MID), wearable devices, virtual reality (VR) devices, augmented reality (AR) devices, wireless terminals in industrial control, wireless terminals in self driving, wireless terminals in remote medical surgery, wireless terminals in a smart grid, wireless terminals in transportation safety, wireless terminals in a smart home, cellular phones, cordless phones, session initiation protocol (session initiation protocol, SIP) phones, wireless local loop (wireless local loop, WLL) stations, personal digital assistants (personal digital assistant, PDA), handheld devices with wireless communication functions, computing devices or other processing devices connected to wireless modems, vehicle-mounted devices, or terminals in a future evolved public land mobile network (public land mobile network, PLMN). The embodiments of the present application do not limit the type of electronic device.
By way of example and not limitation, in the embodiments of the present application the electronic device may also be a wearable device. A wearable device, also called a wearable smart device, is a general term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not only a hardware device; it can also implement powerful functions through software support, data interaction, and cloud interaction. Broadly, wearable smart devices include devices that are full-featured and large in size and can implement complete or partial functions without relying on a smartphone, such as smart watches or smart glasses; and devices that focus on only a certain type of application function and need to be used together with other devices such as smartphones, for example various smart bracelets and smart jewelry for vital-sign monitoring.
In addition, in the embodiments of the present application, the electronic device may also be an electronic device in an internet of things (internet of things, IoT) system. IoT is an important component of the development of future information technology; its main technical characteristic is connecting things to a network through communication technology, thereby realizing an intelligent network with human-machine interconnection and interconnection between things.
The electronic device in the embodiment of the application may also be referred to as: a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a wireless communication device, a user agent, or a user equipment, etc.
In an embodiment of the present application, the electronic device or each network device includes a hardware layer, an operating system layer running on top of the hardware layer, and an application layer running on top of the operating system layer. The hardware layer includes hardware such as a central processing unit (central processing unit, CPU), a memory management unit (memory management unit, MMU), and a memory (also referred to as a main memory). The operating system may be any one or more computer operating systems that implement business processes through processes (processes), such as a Linux operating system, a Unix operating system, an Android operating system, an iOS operating system, or a windows operating system. The application layer comprises applications such as a browser, an address book, word processing software, instant messaging software and the like.
Fig. 1 shows a smart watch in a possible implementation.
As shown in fig. 1, the touch screen of the smart watch may display the desktop of the smart watch, and the desktop includes icons of a plurality of applications (also referred to as application icons). When the user wants to open application G on the smart watch, the user taps the touch screen. Because the touch screen of the smart watch is small, the application icons displayed on it are small, and the icon of application G is adjacent to the icon of application C. When the user taps the touch screen, the user may fail to tap the icon of application G precisely and instead tap the icon of application C, so that application C is opened instead of application G.
In a possible implementation, the smart watch includes a rotatable crown. Rotating the crown can zoom the desktop.
However, some wearable devices do not have a rotatable crown, so the desktop cannot be zoomed, and mis-triggers on the touch screen still occur, which degrades the user's experience of the electronic device.
Therefore, an embodiment of the application provides a zooming method that supports zooming the desktop with two fingers, without additional hardware, so that after zooming the desktop with two fingers the user can accurately trigger the intended icon, improving the user experience.
Fig. 2 shows a software structure schematic diagram of an electronic device according to an embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. As shown in fig. 2, the system of the electronic device is divided into four layers: from top to bottom, an application layer, a framework layer, algorithms and internal libraries, and a kernel.
The application layer may include a series of application packages. As shown in fig. 2, the application package may include applications such as a compass, a timer, weather, music, a desktop, an alarm clock, a stopwatch, etc.
The framework layer may provide an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The framework layer may include some predefined functions. As shown in fig. 2, the framework layer may include a User Interface (UI) framework, which may also be referred to as a UI kit. The UI framework may include fonts, layout, views, and the like.
The algorithms and internal libraries may include algorithm libraries, which may include gesture algorithms, pressure algorithms, and the like.
For example, a gesture algorithm may be used to calculate the two-finger scale during a two-finger zoom operation, and a pressure algorithm may be used to calculate the pressure when a finger touches the touch panel (TP) of the electronic device.
The kernel may include a hardware abstraction layer (hardware abstraction layer, HAL) and a hardware driver layer. The hardware abstraction layer may include a touch screen.
For example, the touch screen may be used to read the number of fingers and the touch position of the fingers when the fingers touch the touch screen. The touch location may be referred to as a touch point.
The workflow of electronic device software and hardware is illustrated below in connection with a scenario where a desktop of an electronic device is scaled by a two-finger touch TP.
When a finger touches TP, TP may read the touch position of the finger and the number of fingers, and the kernel generates a single-finger PRESS event (state_press), a single-finger lift event (state_release), or a double-finger event (state_scale) according to the touch position of the finger and the number of fingers read by TP.
Illustratively, when one finger touches TP, TP reads the number of fingers of 1 and the touch position of the finger, the kernel generates a single-finger press event and stores the single-finger press event in the kernel layer.
When the two fingers touch the TP, the TP reads the number of fingers as 2 and the touch positions of the two fingers, the kernel generates a double-finger event, and the double-finger event is stored in the kernel layer.
When one of the two fingers touching the TP is lifted, the TP reads the number of fingers as 1 and the touch positions of the fingers which are not lifted, the kernel generates a single-finger lifting event, and the single-finger lifting event is stored in the kernel layer.
The single-finger press event, the single-finger lift event, and the double-finger event are all TP events. A TP event may include the number of fingers, the touch position coordinates, and a timestamp.
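As a rough illustration, the TP event record described above can be sketched as a small data structure. The names TPEvent and classify, and the purely count-based mapping to kernel states, are assumptions for illustration, not the device's actual kernel interface.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Event names mirroring the kernel states named in the text.
STATE_PRESS, STATE_RELEASE, STATE_SCALE = "STATE_PRESS", "STATE_RELEASE", "STATE_SCALE"

@dataclass
class TPEvent:
    finger_count: int                        # number of fingers read by the TP
    touch_points: List[Tuple[float, float]]  # (x, y) coordinates per touch point
    timestamp_ms: int                        # timestamp of the reading

def classify(event: TPEvent) -> str:
    """Map a raw TP reading to a kernel event, keyed on the finger count."""
    if event.finger_count == 2:
        return STATE_SCALE    # double-finger event
    if event.finger_count == 1:
        return STATE_PRESS    # single-finger press event
    return STATE_RELEASE      # no fingers remain: lift event
```
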
Alternatively, TP may read the touch position of the finger and the number of fingers on the touch screen at the screen refresh frequency. The kernel generates TP events according to the reading frequency of TP.
The screen refresh frequency may be 60Hz or 120Hz, and in the embodiment of the present application, the screen refresh frequency is not specifically limited.
The UIkit may read the last generated TP event from the kernel at the screen refresh frequency.
Illustratively, the UIkit may read the last generated TP event from the kernel every 16.67 milliseconds (ms) when the screen refresh frequency is 60 Hz.
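The 16.67 ms figure is simply the frame period implied by the refresh rate; a one-line sketch:

```python
def read_interval_ms(refresh_hz: float) -> float:
    """Interval at which the UIkit polls the kernel for the latest TP event."""
    return 1000.0 / refresh_hz

# 60 Hz gives a ~16.67 ms polling interval; 120 Hz gives ~8.33 ms.
```
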
When the UIkit reads the double-finger event, the UIkit may generate the double-finger scaling event, and the UIkit may report the double-finger scaling event to a desktop application of the application layer.
A double-finger zoom event may include a double-finger zoom start event and a double-finger zoom in-progress event. The start event may be used to indicate that a double-finger zoom has begun, and the in-progress event may be used to indicate that a double-finger zoom has begun and has not yet ended. The in-progress event may also simply be referred to as a double-finger zoom event.
The double-finger zoom event may include a timestamp and the respective touch position coordinates of the two fingers. The touch positions of the two fingers are, for example, touch point 1 (point1) and touch point 2 (point2); a touch position coordinate is, for example, (x, y).
When the desktop application receives an event in the double-finger scaling process, the desktop application can calculate scaling according to the touch position coordinates of the double fingers and the like.
The desktop application may transmit the scaling information to the framework layer. The zoom information includes an identification of the desktop view and the calculated scale.
When the frame layer receives the scaling information, the frame layer can render, draw, compose and display the desktop view according to the scaling in the scaling information.
Further, in the following, with reference to fig. 2 and fig. 3, a scaling method provided by the present application is described in detail by taking a scenario of scaling a desktop of an electronic device as an example by using a double-finger touch TP.
Fig. 3 shows a flowchart of a scaling method according to an embodiment of the present application.
As shown in fig. 3, the scaling procedure provided by the embodiment of the present application may include:
S101, the UIkit reads a double-finger event (STATE_SCALE).
Illustratively, the UIkit may read a double-finger event in the following possible scenarios.
Scenario one: when the user first touches the touch screen with two fingers, the UIkit reads a TP event and obtains a double-finger event for the first time.
Scenario two: the user touches the touch screen with one finger and then with another. When the user first touches the touch screen with one finger, the UIkit reads a TP event that is not a double-finger event, e.g., a single-finger press event. When the user's other finger then touches the touch screen, the UIkit reads a TP event and obtains a double-finger event for the first time.
Scenario three: the user keeps both fingers on the touch screen, and the UIkit reads the most recently generated TP event from the kernel at the screen refresh frequency, thereby continuously obtaining non-first double-finger events.
S102, the UIkit judges whether the previous TP state is a double-finger zoom state or not.
Optionally, when the UIkit first gets a double-finger event, the UIkit determines whether the TP state before the first gets the double-finger event is a double-finger zoom state. The two-finger zoom state may also be referred to as a two-finger state.
For example, the electronic device may maintain a target identifier that indicates whether the TP state is a double-finger zoom state. Taking the target identifier is_scaling_ as an example, is_scaling_ may take multiple values that indicate whether the TP state is a double-finger zoom state. For example, the values of is_scaling_ may include false and true: is_scaling_ = false may indicate that the TP state is a non-double-finger zoom state, and is_scaling_ = true may indicate that the TP state is a double-finger zoom state.
When no double-finger event has been obtained, the target identifier is_scaling_ may be maintained as false; when a double-finger event has been obtained, it may be maintained as true.
Thus, when the UIkit obtains a double-finger event for the first time (scenarios one and two), the electronic device has not obtained a double-finger event before, so the value of is_scaling_ is false. The UIkit determines that the TP state before the first double-finger event is a non-double-finger zoom state, and may perform steps S103-S105.
Corresponding to scenario three, when the UIkit obtains a non-first double-finger event, the electronic device has already obtained a double-finger event, so the value of is_scaling_ is true. The UIkit determines that the TP state before the non-first double-finger event is the double-finger zoom state, and may perform steps S106-S111.
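The branch taken in step S102 can be sketched as a small dispatcher keyed on a boolean flag. The function name and the returned path labels are hypothetical shorthand for the step ranges in the text.

```python
def dispatch(event_type: str, is_scaling: bool):
    """Decide which step range follows a TP event, per step S102.

    Returns a (path, new_is_scaling) pair, where path is a hypothetical
    label for the steps the UIkit performs next.
    """
    if event_type == "STATE_SCALE":
        if not is_scaling:
            return ("S103-S105", True)   # first double-finger event: start zoom
        return ("S106-S111", True)       # ongoing zoom: report in-progress event
    if is_scaling:
        return ("end", False)            # press/lift after zooming: end the zoom
    return ("ignore", is_scaling)        # unrelated single-finger event
```
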
S103, the UIkit acquires views of respective touch positions of two fingers of the two-finger event.
Illustratively, the UIkit acquires views of the respective touch locations of the two fingers of the two-finger event, such as view1 (view 1) and view2 (view 2).
S104, the UIkit judges whether the views at the respective touch positions of the two fingers of the double-finger event are the same and whether the view supports double-finger zoom.
This view may be referred to as a desktop view, for example. If the UIkit finds, through an interface provided by the framework layer to the application layer (such as the setscan interface), that the desktop view has the attribute of supporting double-finger zoom, the UIkit can determine that the desktop view supports double-finger zoom.
S105, the UIkit generates a double-finger scaling start event and reports the double-finger scaling start event to the desktop application.
Illustratively, when the UIkit determines that the views at the respective touch positions of the two fingers of the double-finger event are the same and the view supports double-finger zoom, the UIkit may generate a double-finger zoom start event and report it to the desktop application; the UIkit may also set is_scaling_ to true.
Thus, through steps S103 to S105, it can be determined that the operation of the two fingers is a zoom operation, and further, the operation intention of the user can be accurately determined.
S106, the UIkit reports the event in the double-finger scaling process to the desktop application.
Illustratively, when the UIkit obtains the double-finger event and determines that the TP state before the double-finger event is the double-finger scaling state, the UIkit may generate an event in the double-finger scaling process and report the event in the double-finger scaling process to the desktop application.
S107, the desktop application calculates and records the current double-finger distance (scale_curr_distance).
For example, the desktop application may perform distance calculation according to the touch position coordinates of the two fingers, to obtain the distance of the two fingers.
The desktop application may record or store the distance of the two fingers.
S108, the desktop application calculates double-finger center coordinates (pivotX, pivotY).
Illustratively, the desktop application calculates coordinates of a midpoint of the two-finger distance from the touch location coordinates of the two fingers, resulting in two-finger center coordinates (pivotX, pivotY).
The desktop application may record or store the two-finger center coordinates (pivotX, pivotY).
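Steps S107 and S108 amount to a Euclidean distance and a midpoint; a minimal sketch (the function names are illustrative, not the device's actual code):

```python
import math

def finger_distance(p1, p2):
    """scale_curr_distance: Euclidean distance between the two touch points."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def pivot_point(p1, p2):
    """(pivotX, pivotY): midpoint of the segment joining the two touch points."""
    return ((p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0)
```
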
S109, the desktop application calculates and records the initial double-finger distance (scale_start_distance).
When the desktop application receives the double-finger zoom start event reported by the UIkit, the desktop application may calculate the distance of the double finger according to the touch position coordinates of the double finger corresponding to the double-finger zoom start event, obtain the initial distance of the double finger, and record or store the initial distance of the double finger.
S110, the desktop application calculates the scaling.
Illustratively, the desktop application calculates a scaling currscale for the event in the double-finger scaling process.
For example, the scaling ratio currscale may satisfy the following formula:

currscale = lastscale + (scale_curr_distance - scale_last_distance) * scale_speed / scale_start_distance

where lastscale is the previous scaling ratio, scale_curr_distance is the currently calculated double-finger distance, scale_last_distance is the previous double-finger distance, scale_speed is the previous zoom speed, and scale_start_distance is the initial double-finger distance when the double-finger event was first obtained.
The previous scaling ratio is calculated in the same way as the current one, and the calculation is not repeated here. The initial value of the scaling ratio may be set to 1.
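The update formula above can be written directly as a function. For example, with a previous ratio of 1.0, a double-finger distance growing from 100 to 120 pixels, a speed of 1, and an initial distance of 100, the new ratio is 1.2. (The identifier names follow the formula; they are not the device's actual code.)

```python
def next_scale(last_scale, curr_distance, last_distance, scale_speed, start_distance):
    """currscale = lastscale + (curr - last) * speed / start, per the formula above."""
    return last_scale + (curr_distance - last_distance) * scale_speed / start_distance
```
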
Illustratively, the last zoom speed may be determined by the desktop application from the correspondence between the zoom scale and the zoom speed according to the last zoom scale.
Illustratively, the scaling ratio may include a magnification ratio R_F and a reduction ratio R_S. The zoom speed may include a magnification speed v_F and a reduction speed v_S.
The correspondence between the scaling ratio and the zoom speed may be as shown in Table 1.

Table 1 Correspondence between scaling ratio and zoom speed

Magnification ratio R_F | Magnification speed v_F
0.5 ≤ R_F ≤ 1.25 | 1
1.25 < R_F ≤ 1.4 | 0.5
1.4 < R_F ≤ 1.5 | 0.25
R_F > 1.5 | 0

Reduction ratio R_S | Reduction speed v_S
0.75 ≤ R_S ≤ 1.5 | 1
0.6 ≤ R_S < 0.75 | 0.5
0.5 ≤ R_S < 0.6 | 0.25
R_S < 0.5 | 0
As shown in table 1, the electronic device may have an enlargement ratio limit of 1.5 and a reduction ratio limit of 0.5.
When 0.5 ≤ R_F ≤ 1.25, the magnification speed is at its maximum, namely 1.
When 1.25 < R_F ≤ 1.4, the magnification speed decreases to 0.5.
When 1.4 < R_F ≤ 1.5, the magnification speed decreases further, to 0.25.
When the magnification ratio exceeds the magnification limit of 1.5, the magnification speed is 0 and the electronic device stops magnifying the desktop.
In this way, during two-finger magnification of the desktop of the electronic device, the closer the magnification ratio is to the magnification limit, the lower the magnification speed of the desktop; subjectively, the closer the ratio is to the limit, the stronger the sense of magnification damping, which can improve the user's experience of the electronic device.
Similarly, when 0.75 ≤ R_S ≤ 1.5, the reduction speed is at its maximum, namely 1.
When 0.6 ≤ R_S < 0.75, the reduction speed decreases to 0.5.
When 0.5 ≤ R_S < 0.6, the reduction speed decreases further, to 0.25.
When the reduction ratio falls below the reduction limit of 0.5, the reduction speed is 0 and the electronic device stops reducing the desktop.
In this way, during two-finger reduction of the desktop of the electronic device, the closer the reduction ratio is to the reduction limit, the lower the reduction speed of the desktop; subjectively, the closer the ratio is to the limit, the stronger the sense of reduction damping, which can improve the user's experience of the electronic device.
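The piecewise speeds above can be captured in two small lookup functions; a sketch assuming the thresholds in Table 1 (0.5/1.25/1.4/1.5 for magnification, 1.5/0.75/0.6/0.5 for reduction):

```python
def magnification_speed(r_f: float) -> float:
    """Magnification speed v_F for magnification ratio R_F."""
    if r_f <= 1.25:
        return 1.0    # maximum speed near the starting ratio
    if r_f <= 1.4:
        return 0.5    # reduced as the ratio approaches the limit
    if r_f <= 1.5:
        return 0.25   # further reduced just below the 1.5 limit
    return 0.0        # past the limit: stop magnifying

def reduction_speed(r_s: float) -> float:
    """Reduction speed v_S for reduction ratio R_S."""
    if r_s >= 0.75:
        return 1.0
    if r_s >= 0.6:
        return 0.5
    if r_s >= 0.5:
        return 0.25
    return 0.0        # past the 0.5 limit: stop reducing
```
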
When the desktop application calculates the scaling scale corresponding to the event in the current double-finger scaling process, the desktop application can transmit scaling information to the frame layer. Wherein the scaling information may include: scale, identification of desktop view, and two-finger center coordinates (pivotX, pivotY).
When the frame layer receives the scaling information, the frame layer may perform step S111.
S111, the frame layer scales all application icons in the desktop, using the double-finger center coordinates (pivotX, pivotY) as the scaling center.
Illustratively, when the frame layer receives the scaling information transmitted by the desktop application, the frame layer renders, draws, synthesizes and displays the desktop view corresponding to the identifier of the desktop view according to the scaling scale in the scaling information by taking the double-finger center coordinate (pivotX, pivotY) in the scaling information as a scaling center, so as to realize the scaling of the desktop of the electronic device.
Scaling all application icons in the desktop about the double-finger center coordinates (pivotX, pivotY) keeps the icons the user wants to scale at a display position convenient for operation after scaling, and reduces the probability that those icons end up invisible or only partially displayed after scaling.
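Scaling every icon about the pivot is the standard scale-about-a-point transform: the pivot stays fixed and every other point moves radially from it. A sketch (the frame layer's actual rendering path is not described in the text):

```python
def scale_about_pivot(x, y, scale, pivot_x, pivot_y):
    """Map a coordinate under a scale about (pivotX, pivotY):
    the pivot is a fixed point; other points move away from or toward it."""
    return (pivot_x + (x - pivot_x) * scale,
            pivot_y + (y - pivot_y) * scale)
```
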
In a possible scenario, two of the double fingers touching TP may lift.
When both fingers touching TP are lifted, the kernel may generate and store a single finger lift event (STATE_RELEASE). The UIkit may perform steps S112-S114.
S112, UIkit reads a single finger lift event (state_release) from the kernel.
S113, UIkit determines whether the TP state before the single-finger lift event is a double-finger zoom state.
The specific implementation of step S113 is similar to that of step S102, and will not be described again.
When the UIkit determines that the TP state before the single-finger lift event is the double-finger zoom state, the UIkit may perform step S114.
S114, the UIkit reports a double-finger zoom end event to the desktop application, and sets the TP state to a non-double-finger zoom state.
Illustratively, the UIkit generates a double-finger zoom end event, reports it to the desktop application, and also maintains the target identifier is_scaling_ as false.
When the desktop application receives the double-finger zoom end event, it no longer zooms the desktop.
In a possible scenario, a single one of the two fingers touching the TP may be lifted.
When one of the two fingers touching the TP is lifted, the kernel may generate and store a single finger PRESS event (STATE_PRESS). The UIkit may perform steps S115-S117.
S115, UIkit reads a single finger PRESS event (state_press) from the kernel.
S116, the UIkit judges whether the TP state before the single-finger pressing event is read is a double-finger zoom state.
The specific implementation of step S116 is similar to that of step S102, and will not be described again.
When the UIkit judges that the TP state before the single-finger press event is the double-finger zoom state, the UIkit may perform step S117.
S117, the UIkit reports a double-finger zoom end event to the desktop application, and the TP state is set to be a non-double-finger zoom state.
Illustratively, the UIkit generates a double-finger zoom end event, reports it to the desktop application, and also maintains the target identifier is_scaling_ as false.
When the desktop application receives the double-finger zoom end event, it no longer zooms the desktop.
Alternatively, the icons of the applications may be laid out in a checkerboard (or tiled) fashion on the desktop of the electronic device.
According to the scaling method provided by this embodiment, the UIkit reads a double-finger event and judges whether the previous TP state is the double-finger zoom state, and accordingly reports either a double-finger zoom start event or a double-finger zoom in-progress event to the desktop application. When the desktop application receives an in-progress event, it calculates the scaling ratio and the scaling center and instructs the framework layer to scale the desktop accordingly; the application icons on the desktop scale along with the desktop, so the user can accurately trigger the icon of the intended application. The UIkit reports in-progress events at the screen refresh frequency, the desktop application calculates the scaling ratio and scaling center at the same frequency, and the framework layer scales the desktop view accordingly, so the desktop view, and the application icons within it, scale in real time. The user can therefore accurately and promptly tap the application to be opened; and when the application icons are laid out on the desktop in a checkerboard (or tiled) pattern, the user can quickly find and open the target application, which improves the user's experience of the electronic device.
In addition, with the zooming method provided by this embodiment, during two-finger zooming of the desktop, the closer the scaling ratio is to its limit, the lower the zoom speed of the desktop; subjectively, the sense of zoom damping grows as the limit is approached, which can improve the user's experience of the electronic device.
The scaling method provided by the embodiment of the application is described below with reference to fig. 4 to 6.
Fig. 4 shows a flowchart of another scaling method according to an embodiment of the present application. In the embodiment of the present application, the desktop view may be simply referred to as a desktop.
As shown in fig. 4, the method includes:
S201, display a desktop, where the desktop includes an icon of at least one application.
Illustratively, the touch screen of the electronic device displays a desktop that includes icons of at least one application in a checkerboard (or tiled) style layout.
The icons of the applications are shown as icon R, icon S, icon T, icon U, … in fig. 1.
S202, receive, on the desktop, a double-finger operation for magnifying the desktop.
For example, the operation for zooming in on the desktop may include an operation in which a user double-finger presses down on a touch screen of the electronic device and slides the double-finger to increase a distance between the double-fingers.
During the operation of receiving the double finger on the desktop for zooming in on the desktop, the steps S101-S111 shown in fig. 3 may be performed inside the electronic device.
The steps of the electronic device performing S101-S111 in a loop may cause the desktop display to be divided into a plurality of phases as described in S203.
S203, in the process of continuously receiving the operation for magnifying the desktop, the method comprises the following steps:
in the first stage, the desktop is enlarged at a first enlargement rate.
And in the second stage, the desktop is enlarged at a second enlargement speed. The second stage is later than the first stage, and the second amplification speed is less than the first amplification speed.
And in the third stage, stopping enlarging the desktop.
Optionally, before magnification of the desktop stops, in a seventh stage the desktop is magnified at a third magnification speed; the seventh stage is earlier than the third stage and later than the second stage, and the third magnification speed is less than the second magnification speed. This reduces the gradient of the change in magnification speed, and thus the probability that an abrupt speed change degrades the user experience.
Illustratively, as shown in Table 1, the first stage may be the stage in which the magnification speed v_F is 1, that is, the stage in which 0.5 ≤ R_F ≤ 1.25. The first magnification speed may be 1.
The second stage may be the stage in which the magnification speed v_F is 0.5, that is, the stage in which 1.25 < R_F ≤ 1.4. The second magnification speed may be 0.5.
The seventh stage may be the stage in which the magnification speed v_F is 0.25, that is, the stage in which 1.4 < R_F ≤ 1.5. The third magnification speed may be 0.25.
The third stage may be the stage in which the magnification speed v_F is 0, that is, the stage in which R_F > 1.5.
The electronic device may have a magnification scale limit of 1.5 and a reduction scale limit of 0.5.
In this way, during two-finger magnification of the desktop of the electronic device, the closer the magnification ratio is to the magnification limit, the lower the magnification speed of the desktop; subjectively, the closer the ratio is to the limit, the stronger the sense of magnification damping, which can improve the user's experience of the electronic device.
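The staged slowdown can be illustrated with a toy simulation: each frame advances the ratio by a base increment multiplied by the current stage's speed, so the ratio climbs quickly at first and then creeps toward the 1.5 limit. The base_step value and frame count are arbitrary illustration values, not from the document.

```python
def magnification_speed(r_f: float) -> float:
    """Magnification speed v_F for ratio R_F, per Table 1."""
    if r_f <= 1.25:
        return 1.0
    if r_f <= 1.4:
        return 0.5
    if r_f <= 1.5:
        return 0.25
    return 0.0

def simulate_zoom_in(frames: int, base_step: float = 0.05) -> list:
    """Trace the magnification ratio over successive frames of zooming in."""
    scale, trace = 1.0, []
    for _ in range(frames):
        # Effective per-frame increment shrinks as the ratio nears the limit.
        scale = min(scale + base_step * magnification_speed(scale), 1.5)
        trace.append(round(scale, 4))
    return trace
```

With 30 frames the trace rises in 0.05 steps up to about 1.25, then in 0.025 and 0.0125 steps, and finally stalls at the 1.5 limit, mirroring the damping feel described above.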
Fig. 5 illustrates an enlarged schematic view of a desktop provided by an embodiment of the present application in conjunction with the above stages.
As shown in fig. 5, a in fig. 5 shows a desktop and the touch positions of two fingers on the desktop. The touch position may also be referred to as a touch point. When the desktop application determines that the two-finger operation on the desktop shown in a in fig. 5 is an operation to magnify the desktop, the desktop application calculates the magnification ratio and transmits the calculated magnification ratio to the frame layer. When the magnification ratio received by the frame layer satisfies 0.5 ≤ R_F ≤ 1.25, the frame layer may magnify the desktop at the first magnification speed of 1. B in fig. 5 shows the desktop shown in a in fig. 5 after magnification at a speed of 1, and the touch positions of the two fingers on the desktop.
When the desktop application determines that the two-finger operation on the desktop shown in b in fig. 5 is an operation to magnify the desktop, the desktop application calculates the magnification ratio and transmits it to the frame layer. When the magnification ratio received by the frame layer satisfies 1.25 < R_F ≤ 1.4, the frame layer may magnify the desktop at the second magnification speed of 0.5. C in fig. 5 shows the desktop shown in b in fig. 5 after magnification at a speed of 0.5, and the touch positions of the two fingers on the desktop.
When the desktop application determines that the two-finger operation on the desktop shown in c in fig. 5 is an operation to magnify the desktop, the desktop application calculates the magnification ratio and transmits it to the frame layer. When the magnification ratio received by the frame layer satisfies 1.4 < R_F ≤ 1.5, the frame layer may magnify the desktop at the third magnification speed of 0.25. D in fig. 5 shows the desktop shown in c in fig. 5 after magnification at a speed of 0.25, and the touch positions of the two fingers on the desktop.
When the desktop application determines that the two-finger operation on the desktop shown in d in fig. 5 is an operation to magnify the desktop, the desktop application calculates the magnification ratio and transmits it to the frame layer. When the magnification ratio received by the frame layer satisfies R_F > 1.5, the frame layer magnifies the desktop at a speed of 0, that is, the frame layer stops magnifying the desktop. E in fig. 5 shows the desktop shown in d in fig. 5 after magnification at a speed of 0, and the touch positions of the two fingers on the desktop.
The phase from a in fig. 5 to b in fig. 5 may correspond to the first phase.
The phase from b in fig. 5 to c in fig. 5 may correspond to the second phase.
The phase from c in fig. 5 to d in fig. 5 may correspond to the seventh phase.
The phase from d in fig. 5 to e in fig. 5 may correspond to the third phase.
S204, the operation for enlarging the desktop is ended and switched to the operation for reducing the desktop.
For example, the operation for zooming out the desktop may include an operation in which a user double-finger presses down on a touch screen of the electronic device and slides the double-finger to reduce a distance between the double-fingers.
During the operation of reducing the desktop with the double finger received on the desktop, the steps S101-S111 shown in fig. 3 may be performed inside the electronic device.
The steps of the electronic device performing S101-S111 in a loop may cause the desktop display to be divided into a plurality of phases as described in S205.
S205, in the process of continuously receiving the operation for shrinking the desktop, the method comprises the following steps:
In the fourth stage, the desktop is reduced at a first reduction speed.
In the fifth stage, the desktop is reduced at a second reduction speed. The fifth stage is later than the fourth stage, and the second reduction speed is less than the first reduction speed.
In the sixth stage, reduction of the desktop stops.
Optionally, before reduction of the desktop stops, in an eighth stage the desktop is reduced at a third reduction speed; the eighth stage is earlier than the sixth stage and later than the fifth stage, and the third reduction speed is less than the second reduction speed. This reduces the gradient of the change in reduction speed, and thus the probability that an abrupt speed change degrades the user experience.
Illustratively, as shown in Table 1, the fourth stage may be the stage in which the reduction speed v_S is 1, that is, the stage in which 0.75 ≤ R_S ≤ 1.5. The first reduction speed may be 1.
The fifth stage may be the stage in which the reduction speed v_S is 0.5, that is, the stage in which 0.6 ≤ R_S < 0.75. The second reduction speed may be 0.5.
The eighth stage may be the stage in which the reduction speed v_S is 0.25, that is, the stage in which 0.5 ≤ R_S < 0.6. The third reduction speed may be 0.25.
The sixth stage may be the stage in which the reduction speed v_S is 0, that is, the stage in which R_S < 0.5.
In this way, during two-finger reduction of the desktop of the electronic device, the closer the reduction ratio is to the reduction limit, the lower the reduction speed of the desktop; subjectively, the closer the ratio is to the limit, the stronger the sense of reduction damping, which can improve the user's experience of the electronic device.
Illustratively, fig. 6 illustrates a desktop reduction schematic provided by an embodiment of the present application in conjunction with the above stages.
As shown in fig. 6, a in fig. 6 shows a desktop and the touch positions of two fingers on the desktop. A touch position may also be referred to as a touch point. When the desktop application determines that the double-finger operation on the desktop shown in a in fig. 6 is an operation to shrink the desktop, the desktop application calculates the reduction ratio and transmits it to the framework layer. When the reduction ratio received by the framework layer satisfies 0.75 ≤ R_S ≤ 1.5, the framework layer may shrink the desktop at the first reduction speed of 1. B in fig. 6 shows the desktop shown in a in fig. 6 after it is shrunk at a reduction speed of 1, and the touch positions of the two fingers on the desktop.
When the desktop application determines that the double-finger operation on the desktop shown in b in fig. 6 is an operation to shrink the desktop, the desktop application calculates the reduction ratio and transmits it to the framework layer. When the reduction ratio received by the framework layer satisfies 0.6 ≤ R_S < 0.75, the framework layer may shrink the desktop at the second reduction speed of 0.5. C in fig. 6 shows the desktop shown in b in fig. 6 after it is shrunk at a reduction speed of 0.5, and the touch positions of the two fingers on the desktop.
When the desktop application determines that the double-finger operation on the desktop shown in c in fig. 6 is an operation to shrink the desktop, the desktop application calculates the reduction ratio and transmits it to the framework layer. When the reduction ratio received by the framework layer satisfies 0.5 ≤ R_S < 0.6, the framework layer may shrink the desktop at the third reduction speed of 0.25. D in fig. 6 shows the desktop shown in c in fig. 6 after it is shrunk at a reduction speed of 0.25, and the touch positions of the two fingers on the desktop.
When the desktop application determines that the double-finger operation on the desktop shown in d in fig. 6 is an operation to shrink the desktop, the desktop application calculates the reduction ratio and transmits it to the framework layer. When the reduction ratio received by the framework layer satisfies R_S < 0.5, the framework layer may shrink the desktop at a reduction speed of 0, that is, the framework layer stops shrinking the desktop. E in fig. 6 shows the desktop shown in d in fig. 6 after it is shrunk at a reduction speed of 0, and the touch positions of the two fingers on the desktop.
The phase from a in fig. 6 to b in fig. 6 may correspond to the fourth phase.
The phase from b in fig. 6 to c in fig. 6 may correspond to the fifth phase.
The phase from c in fig. 6 to d in fig. 6 may correspond to the eighth phase.
The phase from d in fig. 6 to e in fig. 6 may correspond to the sixth phase.
The implementation principle and technical effect of the scaling method provided in this embodiment are similar to those of the embodiment of fig. 3 and are not repeated here.
On the basis of the embodiment corresponding to fig. 4, the scaling method provided in this embodiment further includes the following method.
Optionally, the first amplification speed is obtained, according to the scaling ratio at the (n+1)-th moment, from a preset correspondence between scaling ratio and scaling speed. The preset correspondence between scaling ratio and scaling speed is shown in Table 1.
Wherein, the n+1th moment is any moment in the first stage, n is an integer, and the amplification ratio of the n+1th moment is related to the following: the distance between the double fingers at the n+1 time, the distance between the double fingers at the n time, the scaling speed at the n time, and the distance between the double fingers when the double fingers are first pressed in the operation for enlarging the desktop.
Illustratively, the magnification at time n+1 is positively correlated with: distance between double fingers at time n+1, scaling at time n, and scaling speed at time n.
The magnification ratio at time n+1 is inversely related to: distance between the double fingers at the nth time, and distance between the double fingers when the double fingers are first pressed in an operation for enlarging the desktop.
In this way, the zooming speed can be adjusted according to the zooming scale, so that better experience of zooming the desktop can be realized for the user.
Illustratively, the scaling ratio at the (n+1)-th moment satisfies the formula:

R_{n+1} = R_n + (D_{n+1} − D_n) × v_n ÷ D_{F,0}

where R_{n+1} is the scaling ratio at the (n+1)-th moment, R_n is the scaling ratio at the n-th moment, D_{n+1} is the distance between the two fingers at the (n+1)-th moment, D_n is the distance between the two fingers at the n-th moment, v_n is the scaling speed at the n-th moment, and D_{F,0} is the distance between the two fingers when the two fingers are first pressed down in the operation for enlarging the desktop. R_1 = 1.
R_{n+1} may be currscale in the embodiment of fig. 3. R_n may be lastscale in the embodiment of fig. 3. D_{n+1} may be the scale curr distance in the embodiment of fig. 3. D_n may be the scale last distance in the embodiment of fig. 3. v_n may be the scale speed in the embodiment of fig. 3. D_{F,0} may be the scale start distance in the embodiment of fig. 3. R_1 may be the initial value of the scaling ratio described in S110 in the embodiment of fig. 3.
In this way, the zooming speed can be adjusted according to the zooming scale, so that the zooming scales at different stages correspond to different zooming speeds, and the desktop of the electronic device is zoomed at different zooming speeds at different stages, so that better experience of zooming the desktop is provided for a user.
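The update rule above can be sketched as follows; the numeric values in the usage example are illustrative, not taken from the embodiment:

```python
def next_scale(r_n: float, d_next: float, d_n: float,
               v_n: float, d_f0: float) -> float:
    """One step of R_{n+1} = R_n + (D_{n+1} - D_n) * v_n / D_{F,0}.

    r_n:    scaling ratio at moment n (R_1 = 1)
    d_next: two-finger distance at moment n+1
    d_n:    two-finger distance at moment n
    v_n:    scaling speed at moment n
    d_f0:   two-finger distance when the fingers are first pressed down
    """
    return r_n + (d_next - d_n) * v_n / d_f0


# Fingers move apart from 200 px to 260 px (start distance 200 px) at
# speed 1: the ratio grows from 1.0 to 1.3 over two updates.
r = 1.0
r = next_scale(r, 230.0, 200.0, 1.0, 200.0)  # 1.15
r = next_scale(r, 260.0, 230.0, 1.0, 200.0)  # 1.3
print(round(r, 2))  # 1.3
```

Note how v_n scales the increment: a smaller speed in a later stage makes the same finger movement change the ratio less, which is exactly the damping behavior described above.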
Optionally, in the first stage, the desktop is scaled according to the scaling ratio at the (n+1)-th moment, with the coordinates of the two-finger center at the (n+1)-th moment as the scaling center. The specific implementation is similar to that of step S111 in the embodiment of fig. 3 and is not repeated here.
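Scaling about the two-finger center, rather than the view origin, keeps the point between the fingers fixed on screen. A minimal sketch of the coordinate mapping (function and parameter names are illustrative assumptions, not the embodiment's interfaces):

```python
def scale_about_center(x: float, y: float,
                       cx: float, cy: float,
                       ratio: float) -> tuple[float, float]:
    """Map a desktop point (x, y) under a scale of `ratio` about the
    scaling center (cx, cy). The center itself maps to itself."""
    return (cx + (x - cx) * ratio, cy + (y - cy) * ratio)


# The scaling center is invariant; other points move toward it when
# the ratio is below 1 (shrinking).
print(scale_about_center(100, 100, 100, 100, 0.5))  # (100.0, 100.0)
print(scale_about_center(200, 100, 100, 100, 0.5))  # (150.0, 100.0)
```

Anchoring the transform at the two-finger center is what makes the gesture feel direct: the content under the fingers does not slide away as the desktop scales.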
Optionally, in the process of continuously receiving the operation for zooming in the desktop, the zooming method provided by the embodiment of the application further includes:
a two-finger event is acquired. The specific implementation of this step is similar to that of step S101 in the embodiment of fig. 3, and will not be described again.
When the target identifier is determined to be a first value, an event in the double-finger scaling process is obtained; this event characterizes that the double-finger scaling has started and has not yet ended. The first value may be true in the embodiment of fig. 3. The specific implementation of this step is similar to that of step S106 in the embodiment of fig. 3, and will not be described again.
The scaling ratio is calculated and the target identifier is kept at the first value. The specific implementation of this step is similar to that of steps S107-S110 in the embodiment of fig. 3, and will not be repeated.
Thus, the desktop can be timely and accurately scaled.
Optionally, when the double-finger operation for enlarging the desktop is received, the method further comprises:
a two-finger event is acquired. The specific implementation of this step is similar to that of step S101 in the embodiment of fig. 3, and will not be described again.
When the target identifier is determined to be a second value, a double-finger zoom start event is obtained. The second value may be false in the embodiment of fig. 3. Illustratively, the double-finger zoom start event may be obtained as follows: the view corresponding to each of the two fingers is acquired, and the double-finger zoom start event is obtained when the views corresponding to the two fingers are the same. The specific implementation of this step is similar to that of steps S103-S105 in the embodiment of fig. 3, and will not be repeated.
The two-finger zoom start event is indicated to the desktop application and the target identifier is modified to a first value. The specific implementation of this step is similar to that of step S105 in the embodiment of fig. 3, and will not be described again.
Therefore, the operation of the double fingers can be determined to be the scaling operation, and the operation intention of the user can be further determined accurately.
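The role of the target identifier can be sketched as a small state machine; the class and method names below are assumptions for illustration, not the embodiment's actual interfaces:

```python
class TwoFingerZoomTracker:
    """Track whether a double-finger zoom gesture is in progress.

    False (the second value): no gesture yet; a two-finger event whose
    two touch points land on the same view yields a zoom start event
    and flips the identifier to True (the first value).
    True (the first value): subsequent two-finger events are events in
    the double-finger scaling process (started, not yet ended).
    """

    def __init__(self) -> None:
        self.in_gesture = False  # the target identifier

    def on_two_finger_event(self, view_a: str, view_b: str) -> str:
        if not self.in_gesture:
            if view_a == view_b:        # both fingers on the same view
                self.in_gesture = True  # zoom start event
                return "start"
            return "ignored"            # fingers on different views
        return "scaling"                # gesture already running


tracker = TwoFingerZoomTracker()
print(tracker.on_two_finger_event("desktop", "desktop"))  # start
print(tracker.on_two_finger_event("desktop", "desktop"))  # scaling
```

Keeping this flag lets the framework distinguish "the user just started pinching" from "the pinch is continuing" without re-running the view hit test on every event.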
The scaling method provided by the embodiment of the application can be applied to the electronic equipment with the communication function. The electronic device includes a terminal device, and specific device forms and the like of the terminal device may refer to the above related descriptions, which are not repeated herein.
An embodiment of the present application provides an electronic device, including: a processor and a memory; the memory stores computer-executable instructions; the processor executes the computer-executable instructions stored in the memory to cause the electronic device to perform the method described above.
The embodiment of the application provides a chip. The chip comprises a processor for invoking a computer program in a memory to perform the technical solutions in the above embodiments. The principle and technical effects of the present application are similar to those of the above-described related embodiments, and will not be described in detail herein.
The embodiment of the application also provides a computer readable storage medium. The computer-readable storage medium stores a computer program. The computer program realizes the above method when being executed by a processor. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer readable media can include computer storage media and communication media and can include any medium that can transfer a computer program from one place to another. The storage media may be any target media that is accessible by a computer.
In one possible implementation, the computer readable medium may include RAM, ROM, compact disk-read only memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium targeted for carrying or storing the desired program code in the form of instructions or data structures and accessible by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (Digital Subscriber Line, DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, includes optical disc, laser disc, optical disc, digital versatile disc (Digital Versatile Disc, DVD), floppy disk and blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
Embodiments of the present application provide a computer program product comprising a computer program which, when executed, causes a computer to perform the above-described method.
The embodiment of the application provides a device which can be electronic equipment or a chip system in the electronic equipment. The apparatus may include a display unit and a processing unit. When the apparatus is an electronic device, the display unit may be a display screen or a touch screen. The display unit is configured to perform the step of displaying the desktop, so that the electronic device implements a zoom method described in the first aspect or any one of the possible implementations of the first aspect. When the apparatus is an electronic device, the processing unit may be a processor. The apparatus may further comprise a storage unit, which may be a memory. The storage unit is configured to store instructions, and the processing unit executes the instructions stored in the storage unit, so that the electronic device implements the method described in the first aspect or any one of the possible implementation manners of the first aspect. When the apparatus is a chip or a system of chips within an electronic device, the processing unit may be a processor. The processing unit executes instructions stored by the storage unit to cause the electronic device to implement the method described in the first aspect or any one of the possible implementations of the first aspect. The memory unit may be a memory unit (e.g., a register, a cache, etc.) within the chip, or a memory unit (e.g., a read-only memory, a random access memory, etc.) within the electronic device that is external to the chip.
Illustratively, the display unit is configured to display a desktop, where the desktop includes an icon of at least one application. The processing unit is configured to receive, on the desktop, a double-finger operation for enlarging the desktop. While the operation for enlarging the desktop continues to be received: in a first stage, the processing unit is further configured to enlarge the desktop at a first amplification speed; in a second stage, the processing unit is further configured to enlarge the desktop at a second amplification speed, where the second stage is later than the first stage and the second amplification speed is smaller than the first amplification speed; in a third stage, the processing unit is further configured to stop enlarging the desktop. The operation for enlarging the desktop then ends and switches to an operation for shrinking the desktop. While the operation for shrinking the desktop continues to be received: in a fourth stage, the processing unit is further configured to shrink the desktop at a first reduction speed; in a fifth stage, the processing unit is further configured to shrink the desktop at a second reduction speed, where the fifth stage is later than the fourth stage and the second reduction speed is smaller than the first reduction speed; and in a sixth stage, the processing unit is further configured to stop shrinking the desktop.
In a possible implementation, the processing unit is further configured to zoom in on the desktop at a third zoom-in speed in a seventh stage before stopping zooming in on the desktop. The seventh stage is earlier than the third stage, and the third amplification speed is smaller than the second amplification speed.
In a possible implementation, before stopping zooming out the desktop, the processing unit is further configured to zoom out the desktop at a third zooming-out speed in an eighth stage. The eighth stage is earlier than the sixth stage, and the third reduction speed is smaller than the second reduction speed.
In one possible implementation, the processing unit is further configured to obtain, according to the scaling ratio at the (n+1)-th moment, the first amplification speed corresponding to that scaling ratio from a preset correspondence between scaling ratio and scaling speed. Here, the (n+1)-th moment is any moment in the first stage, n is an integer, and the amplification ratio at the (n+1)-th moment is related to: the distance between the two fingers at the (n+1)-th moment, the distance between the two fingers at the n-th moment, the scaling speed at the n-th moment, and the distance between the two fingers when the two fingers are first pressed down in the operation for enlarging the desktop.
In one possible implementation, the magnification ratio at time n+1 is positively correlated with: distance between double fingers at time n+1, scaling at time n, and scaling speed at time n. The magnification ratio at time n+1 is inversely related to: distance between the double fingers at the nth time, and distance between the double fingers when the double fingers are first pressed in an operation for enlarging the desktop.
In one possible implementation, the scaling ratio at the (n+1)-th moment satisfies the formula:

R_{n+1} = R_n + (D_{n+1} − D_n) × v_n ÷ D_{F,0}

where R_{n+1} is the scaling ratio at the (n+1)-th moment, R_n is the scaling ratio at the n-th moment, D_{n+1} is the distance between the two fingers at the (n+1)-th moment, D_n is the distance between the two fingers at the n-th moment, v_n is the scaling speed at the n-th moment, and D_{F,0} is the distance between the two fingers when the two fingers are first pressed down in the operation for enlarging the desktop.
In a possible implementation manner, in the first stage, the processing unit is further configured to scale the desktop according to the scaling ratio at the n+1 time, with the coordinates of the center of the two fingers at the n+1 time as the scaling center.
In one possible implementation, the processing unit is further configured to obtain a double-finger event while the operation for enlarging the desktop continues to be received. The processing unit is further configured to obtain an event in the double-finger scaling process when the target identifier is determined to be a first value; this event characterizes that the double-finger scaling has started and has not yet ended. The processing unit is further configured to calculate the scaling ratio and keep the target identifier as the first value.
In one possible implementation, the processing unit is further configured to obtain a two-finger event when receiving an operation of the two-finger for zooming in on the desktop. And the processing unit is also used for obtaining a double-finger scaling start event when the target mark is judged to be the second value. The processing unit is further configured to indicate a two-finger zoom start event to the desktop application and modify the target identifier to a first value.
In a possible implementation manner, the processing unit is further configured to obtain a view corresponding to each of the two fingers. And the processing unit is also used for obtaining the double-finger zoom starting event when the views corresponding to the double fingers are the same.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The foregoing detailed description of the application has been presented for purposes of illustration and description, and it should be understood that the foregoing is by way of illustration and description only, and is not intended to limit the scope of the application.
Claims (13)
1. A method of scaling, for application to an electronic device, the method comprising:
displaying a desktop, the desktop including icons of at least one application;
receiving double-finger operation for enlarging the desktop on the desktop;
in the process of continuously receiving the operation for zooming in the desktop, the method comprises the following stages:
the first stage, enlarge the said tabletop at the first magnification speed;
a second stage of magnifying the desktop at a second magnification speed; the second stage is later than the first stage, and the second magnification speed is less than the first magnification speed;
a third stage, stopping enlarging the desktop;
the operation for enlarging the desktop is ended and switched to the operation for reducing the desktop;
in the process of continuously receiving the operation for shrinking the desktop, the method comprises the following stages:
a fourth stage of shrinking the desktop at a first reduction speed;
a fifth stage of shrinking the desktop at a second reduction speed; the fifth stage is later than the fourth stage, and the second reduction speed is smaller than the first reduction speed;
and a sixth stage of stopping shrinking the desktop.
2. The method of claim 1, further comprising, prior to said ceasing to zoom in on said desktop:
a seventh stage of magnifying the desktop at a third magnification speed; the seventh stage is earlier than the third stage, and the third magnification speed is less than the second magnification speed.
3. The method of claim 1 or 2, further comprising, prior to the stopping the zooming out of the desktop:
an eighth stage of shrinking the desktop at a third reduction speed; the eighth stage is earlier than the sixth stage, and the third reduction speed is smaller than the second reduction speed.
4. A method according to any one of claims 1-3, wherein the first magnification speed is obtained from a preset correspondence between a scaling factor and a scaling speed according to a scaling factor at time n+1;
wherein the n+1th time is any time in the first stage, n is an integer, and the amplification ratio of the n+1th time is related to the following matters: the distance between the double fingers at the n+1 time, the distance between the double fingers at the n time, the scaling speed at the n time, and the distance between the double fingers when the double fingers are pressed for the first time in the operation for enlarging the desktop.
5. The method of claim 4, wherein the magnification ratio at time n+1 is positively correlated with: a distance between the double fingers at the n+1 time, a scaling ratio at the n time and a scaling speed at the n time;
The magnification ratio at the n+1 time is inversely related to the following content: the distance between the double fingers at the nth moment and the distance between the double fingers when the double fingers are pressed for the first time in the operation for amplifying the desktop.
6. The method according to claim 4 or 5, characterized in that the scaling ratio at the (n+1)-th moment satisfies the formula:

R_{n+1} = R_n + (D_{n+1} − D_n) × v_n ÷ D_{F,0}

where R_{n+1} is the scaling ratio at the (n+1)-th moment, R_n is the scaling ratio at the n-th moment, D_{n+1} is the distance between the two fingers at the (n+1)-th moment, D_n is the distance between the two fingers at the n-th moment, v_n is the scaling speed at the n-th moment, and D_{F,0} is the distance between the two fingers when the two fingers are first pressed down in the operation for enlarging the desktop.
7. The method according to any one of claims 4 to 6, wherein in the first stage, the desktop is scaled according to the scaling ratio at the (n+1)-th moment, with the coordinates of the two-finger center at the (n+1)-th moment as the scaling center.
8. The method of any of claims 1-7, wherein during continued receipt of the operation for zooming in on a desktop, the method further comprises:
acquiring a double-finger event;
when the target identifier is determined to be a first value, obtaining an event in the double-finger scaling process; the event characterizes that the double-finger scaling has started and has not ended;
calculating a scaling ratio and keeping the target identifier as the first value.
9. The method of any of claims 1-8, wherein upon receiving the operation for magnifying the desktop with two fingers, further comprising:
acquiring a double-finger event;
when the target identifier is determined to be a second value, obtaining a double-finger zoom start event;
indicating the double-finger zoom start event to a desktop application and modifying the target identifier to a first value.
10. The method of claim 9, wherein the deriving a double-finger zoom start event comprises:
acquiring respective corresponding views of the double fingers;
and when the views corresponding to the two fingers are the same, obtaining a two-finger zoom starting event.
11. An electronic device, comprising: a processor and a memory;
the memory stores computer-executable instructions;
the processor executing computer-executable instructions stored in the memory to cause the electronic device to perform the method of any one of claims 1-10.
12. A computer readable storage medium storing a computer program, which when executed by a processor implements the method according to any one of claims 1-10.
13. A computer program product comprising a computer program which, when run, causes a computer to perform the method of any of claims 1-10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310863199.XA CN117111823A (en) | 2023-07-12 | 2023-07-12 | Scaling method and related device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117111823A true CN117111823A (en) | 2023-11-24 |
Family
ID=88809976
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310863199.XA Pending CN117111823A (en) | 2023-07-12 | 2023-07-12 | Scaling method and related device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117111823A (en) |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106527862A (en) * | 2016-11-11 | 2017-03-22 | 北京奇虎科技有限公司 | Method and device for displaying application icons |
CN108460725A (en) * | 2018-03-22 | 2018-08-28 | 腾讯科技(深圳)有限公司 | Map-indication method, device, equipment and storage medium |
CN109960443A (en) * | 2017-12-22 | 2019-07-02 | 华为终端有限公司 | A kind of display application drawing calibration method and terminal device |
CN110362369A (en) * | 2019-06-06 | 2019-10-22 | 努比亚技术有限公司 | Wearable device control method, wearable device and computer readable storage medium |
US10459887B1 (en) * | 2015-05-12 | 2019-10-29 | Apple Inc. | Predictive application pre-launch |
CN110442277A (en) * | 2019-07-03 | 2019-11-12 | 华为技术有限公司 | Show the method and electronic equipment of the preview window information |
US20200017026A1 (en) * | 2018-07-10 | 2020-01-16 | Samsung Electronics Co., Ltd. | Method and system of displaying multimedia content on glass window of vehicle |
CN110908555A (en) * | 2019-10-21 | 2020-03-24 | 维沃移动通信有限公司 | Icon display method and electronic equipment |
CN113093970A (en) * | 2015-03-02 | 2021-07-09 | 苹果公司 | Controlling system zoom magnification using a rotatable input mechanism |
CN114430492A (en) * | 2020-10-29 | 2022-05-03 | 海信视像科技股份有限公司 | Display device, mobile terminal and picture synchronous zooming method |
US20220365661A1 (en) * | 2019-08-09 | 2022-11-17 | Honor Device Co., Ltd. | Method for displaying page elements and electronic device |
CN115904160A (en) * | 2021-09-30 | 2023-04-04 | 华为技术有限公司 | Method for moving icons, related graphic interface and electronic equipment |
CN116048311A (en) * | 2022-06-24 | 2023-05-02 | 荣耀终端有限公司 | Window display method, electronic device, and computer-readable storage medium |
CN116185263A (en) * | 2021-11-26 | 2023-05-30 | Oppo广东移动通信有限公司 | Screen interaction method, interaction device, electronic device and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||