CN113849104A - Graphic control method, apparatus, terminal device and readable storage medium - Google Patents
- Publication number
- CN113849104A (application CN202010600422.8A)
- Authority
- CN
- China
- Prior art keywords
- magnification
- graph
- display control
- control ratio
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Controls And Circuits For Display Device (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application is applicable to the technical field of terminal control and provides a graphic control method, a graphic control apparatus, a terminal device and a readable storage medium. The graphic control method comprises the following steps: in response to an enlargement operation, enlarging a graphic according to a preset first display control ratio; when the magnification of the graphic is greater than a target magnification, determining a second display control ratio according to the magnification of the graphic and the target magnification; and continuing to enlarge the graphic according to the second display control ratio. The second display control ratio is determined according to the magnification of the graphic and the target magnification, and the target magnification is related to the display resolution of the screen and the graphic resolution of the graphic. The second display control ratio therefore adapts to graphics of different resolutions, so that when graphics of different resolutions are enlarged beyond the target magnification, the display control ratio decays at a uniform speed. The effect of prompting the user that the magnification has exceeded the target magnification is more obvious, giving the user clear and definite feedback and prompting.
Description
Technical Field
The present application relates to the field of terminal control technologies, and in particular, to a method and an apparatus for controlling a graphic, a terminal device, and a readable storage medium.
Background
Controlling the enlargement and reduction of graphics through a two-finger pinch operation is widely used on various terminal devices, for example enlarging an image on a mobile phone by moving the two fingers apart, or reducing it by moving the two fingers closer together. After a graphic has been enlarged by the two-finger pinch operation to the maximum magnification supported by the system (i.e., the maximum magnification that is retained after the fingers leave the screen), the graphic can continue to be enlarged, up to the maximum magnification of the two-finger pinch operation, as long as the operation is maintained. Once the two-finger pinch operation ends and the fingers leave the screen, the graphic shrinks from its size at that moment back to the maximum magnification supported by the system.
In the prior art, after the magnification of the two-finger pinch operation exceeds the maximum magnification supported by the system, the user is prompted that the current magnification exceeds the system-supported maximum by continuously attenuating the display control ratio of the two-finger pinch operation, until further enlargement is no longer possible (that is, the display control ratio has decayed to 0).
However, when the two-finger pinch operation is applied to different graphics, the maximum magnification of the two-finger pinch operation and the maximum magnification supported by the system differ because the resolutions of the graphics differ. When the display control ratio is attenuated, the attenuation speed of the display control ratio is therefore not uniform across graphics, and the prompting effect is poor.
Disclosure of Invention
The embodiments of the application provide a graphic control method, a graphic control apparatus, a terminal device and a readable storage medium, which can solve the problem that, when different graphics are manipulated by a two-finger pinch operation, the attenuation speed of the display control ratio differs from graphic to graphic because their resolutions differ, so the prompting effect is poor.
In a first aspect, an embodiment of the present application provides a graphic control method, applied to a terminal device on whose screen a graphic is displayed. The terminal device can be any device able to display a graphic and control its zooming, such as a smartphone, a smart watch, a tablet computer, a notebook computer or a desktop computer. The method comprises the following steps:
and responding to the amplification operation, and amplifying the graph according to a preset first display control ratio. The display control ratio is used for indicating the proportion of the magnification of the graph to the control parameter of the magnification operation. And when the magnification of the graph is larger than the target magnification, determining a second display control ratio according to the magnification of the graph and the target magnification, wherein the target magnification is determined according to the display resolution of the screen and the graph resolution of the graph. And continuing to amplify the graph according to the second display control ratio.
The graphic may be a picture, a web page, the interface of a zoomable application (for example, a map in a mapping application), or any other image object whose size can change. The display control ratio is the ratio of the graphic magnification to the control parameter of the enlargement operation during a touch operation. When the image is enlarged by a two-finger pinch operation with the fingers moving apart, the display control ratio is the ratio of the magnification of the image to the distance between the two fingers contacting the touch screen. The target magnification is the maximum magnification that the graphic retains after the enlargement operation ends and the user's fingers leave the screen. For example, when the graphic is displayed at the screen resolution, its magnification is 1. If the graphic can stay at 5 times its size after the enlargement operation ends and the fingers leave the screen, the target magnification is 5.
In this embodiment, the graphic is enlarged at the preset display control ratio in response to the enlargement operation; when the magnification of the graphic becomes greater than the target magnification, the second display control ratio is determined according to the magnification of the graphic and the target magnification, and the graphic is further enlarged according to the second display control ratio. Because the second display control ratio is determined from the magnification of the graphic and the target magnification, which in turn is related to the display resolution of the screen and the graphic resolution of the graphic, enlarging the graphic according to the second display control ratio adapts to graphics of different resolutions. When graphics of different resolutions are enlarged, the attenuation functions are unified once the magnification exceeds the target magnification, so the display control ratio decays at a uniform speed. The effect of prompting the user that the magnification of the graphic has exceeded the target magnification is therefore more obvious, giving the user clear and definite feedback and prompting on the progress of the control operation.
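As an illustration only (the function names, the state layout and the numeric values below are assumptions, not taken from the patent), the three steps of the method might be organized as follows:

```python
import math

# Minimal sketch of steps S11-S13; all identifiers and parameter values are
# illustrative assumptions.

def second_display_control_ratio(magnification, target, maximum, lam):
    """Attenuated ratio used once the magnification exceeds the target
    (the exponential attenuation described in the embodiments below)."""
    return math.exp(-lam * (magnification - target) / (maximum - target))

def on_zoom_update(control_parameter, state):
    """One update of the graphic magnification during an enlargement operation."""
    if state["magnification"] <= state["target"]:
        # S11: enlarge at the preset first display control ratio.
        gain = state["first_ratio"]
    else:
        # S12: derive the second display control ratio from the current
        # magnification and the target magnification.
        gain = second_display_control_ratio(
            state["magnification"], state["target"], state["maximum"], state["lam"])
    # S13: continue enlarging at the chosen ratio; the control parameter is,
    # for example, the increase in the distance between the two touch points.
    state["magnification"] += control_parameter * gain
    return state["magnification"]

state = {"magnification": 1.0, "first_ratio": 1.0, "target": 5.0, "maximum": 15.0, "lam": 3.0}
for _ in range(20):
    on_zoom_update(0.5, state)
print(round(state["magnification"], 2))  # growth slows markedly once past the 5x target
```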
In some embodiments, the target magnification is related to a display resolution of the screen and a graphics resolution of the graphics, including: the target magnification is linearly related to the ratio of the display resolution and the graphics resolution.
In some embodiments, the target magnification being linearly related to the ratio of the display resolution to the graphic resolution includes: multiplying the ratio of the display resolution to the graphic resolution by a first preset coefficient, and then adding a second preset coefficient, to obtain the target magnification.
In some embodiments, determining the second display control ratio according to the magnification of the graphic and the target magnification comprises: calculating the second display control ratio according to the magnification of the graphic, the target magnification and a maximum magnification, where the maximum magnification is the maximum magnification that can be reached when the graphic is enlarged by the enlargement operation.
In some embodiments, calculating the second display control ratio according to the magnification of the graphic, the target magnification and the maximum magnification includes: calculating the second display control ratio according to the formula gain(s) = e^(-λ(s - Scale_off)/(Scale_pinch - Scale_off)), where gain(s) is the second display control ratio, s is the magnification of the graphic, Scale_off is the target magnification, Scale_pinch is the maximum magnification, and λ is a third preset coefficient that is linearly related to the target magnification.
In some embodiments, enlarging the graphic in response to the enlargement operation further includes: prompting when the magnification of the graphic is equal to the target magnification and/or when the magnification of the graphic is equal to the maximum magnification.
In a second aspect, an embodiment of the present application provides a graphic control apparatus, applied to a terminal device on whose screen a graphic is displayed. The apparatus includes: a response module, configured to respond to the enlargement operation and enlarge the graphic according to a preset first display control ratio, where the display control ratio indicates the ratio of the magnification of the graphic to the control parameter of the enlargement operation; and a determining module, configured to determine a second display control ratio according to the magnification of the graphic and a target magnification when the magnification of the graphic is greater than the target magnification, where the target magnification is determined according to the display resolution of the screen and the graphic resolution of the graphic. The response module is further configured to continue enlarging the graphic according to the second display control ratio.
In some embodiments, the target magnification is related to a display resolution of the screen and a graphics resolution of the graphics, including: the target magnification is linearly related to the ratio of the display resolution and the graphics resolution.
In some embodiments, the target magnification being linearly related to the ratio of the display resolution to the graphic resolution includes: multiplying the ratio of the display resolution to the graphic resolution by a first preset coefficient, and then adding a second preset coefficient, to obtain the target magnification.
In some embodiments, the determining module is specifically configured to calculate and obtain the second display control ratio according to a magnification factor of the graph, a target magnification factor, and a maximum magnification factor, where the maximum magnification factor is a maximum magnification factor that can be achieved when the graph is magnified by the magnifying operation.
In some embodiments, the determining module is specifically configured to calculate the second display control ratio according to the formula gain(s) = e^(-λ(s - Scale_off)/(Scale_pinch - Scale_off)), where gain(s) is the second display control ratio, s is the magnification of the graphic, Scale_off is the target magnification, Scale_pinch is the maximum magnification, and λ is a third preset coefficient that is linearly related to the target magnification.
In some embodiments, the apparatus further comprises a prompting module for prompting when the magnification of the graphic is equal to the target magnification, and/or the magnification of the graphic is equal to the maximum magnification.
In a third aspect, an embodiment of the present application provides a terminal device, including: a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the method as provided by the first aspect when executing the computer program.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, where a computer program is stored, and when executed by a processor, the computer program implements the method provided in the first aspect.
In a fifth aspect, the present application provides a computer program product, which when run on a terminal device, causes the terminal device to execute the method provided in the first aspect.
In a sixth aspect, an embodiment of the present application provides a chip system, where the chip system includes a memory and a processor, and the processor executes a computer program stored in the memory to implement the method provided in the first aspect.
In a seventh aspect, an embodiment of the present application provides a chip system, where the chip system includes a processor, the processor is coupled with the computer-readable storage medium provided in the fourth aspect, and the processor executes a computer program stored in the computer-readable storage medium to implement the method provided in the first aspect.
It is to be understood that, for the beneficial effects of the second to seventh aspects, reference may be made to the relevant description of the first aspect; they are not repeated herein.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application;
fig. 2 is a schematic view of an application scenario of a graphics control method according to an embodiment of the present application;
fig. 3 is a schematic view of an application scenario of a graphics control method according to another embodiment of the present application;
fig. 4 is a schematic flowchart of a graphics control method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a display control ratio decay function in a graphic control method according to an embodiment of the present application;
Fig. 6 is a schematic diagram illustrating a graph being enlarged to a target magnification in a graph control method according to an embodiment of the present application;
FIG. 7 is a diagram illustrating a process of continuing to enlarge a graphic after the magnification of the graphic is greater than the target magnification in a graphic control method according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a graphic control apparatus according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of a graphic control apparatus according to another embodiment of the present application;
fig. 10 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining" or "in response to detecting".
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise.
The graph control method provided by the embodiment of the application is applied to the terminal equipment which can display the graph and control the scaling of the graph. Such as smart phones, smart watches, tablet computers, notebook computers, desktop computers, and the like.
Fig. 1 shows a hardware configuration diagram of a terminal device.
As shown in fig. 1, the terminal device includes: a processor 110, a vibration motor 111, an input module 112, an audio module 120, a screen 130, a camera module 140, a storage module 150, an interface 160, a power module 170, a sensor module 180, a communication module 190, and so on. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 1 is not intended to be limiting of the terminal device; it may include more or fewer components than those shown, some components may be combined, or a different arrangement of components may be used.
The following describes each constituent component and module of the terminal device in detail with reference to fig. 1:
the processor 110 is a control center of the terminal device, and may connect various parts of the terminal device by using various interfaces and lines, and perform various functions of the terminal device and process data by operating or executing software programs and/or modules stored in the storage module 150 and calling data stored in the storage module 150, thereby alerting a call.
In some embodiments, processor 110 may include one or more processing units. For example, it may be a Central Processing Unit (CPU), or it may also be a general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), an off-the-shelf Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc.
In still other embodiments, processor 110 may integrate an application processor and a modem processor. The application processor mainly processes an operating system, a user interface, application programs and the like, and the modem processor mainly processes wireless communication. It is to be understood that the modem processor may not be integrated into the processor 110, and the application is not limited thereto.
The vibration motor 111 can vibrate when receiving the signal, and the warning is performed in the form of vibration.
The input module 112 may be configured to receive input information and key signals, where the input information includes number or character information, touch information, and the like, and the key signals include pressing signals of physical keys, pressing signals of virtual keys, and the like.
In one embodiment, the input module 112 may include a touch panel and other input devices. The touch panel and the screen 130 may form a touch screen, and the touch panel may collect a touch operation performed by a user on or near the touch panel (e.g., an operation performed with a finger, a stylus, or any object or accessory capable of generating a touch signal on the touch panel), and execute a corresponding function according to a preset program driver. Optionally, the touch panel may include two parts, a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller. The touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the touch point coordinates to the processor 110; the processor 110 receives the touch point coordinates, converts them into touch commands, and executes the touch commands. Touch panels can be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Other input devices may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick.
An audio module 120 for processing the audio signal. For example, the audio module 120 may convert an analog audio signal received by the microphone 123 into digital audio data and send the digital audio data to the processor 110. Alternatively, the digital audio data transmitted from the processor 110 is converted into analog signals that can be played by the speaker 121 and the receiver 122 and transmitted to the speaker 121 or the receiver 122.
The screen 130 is used to display the content output by the terminal device as visual output. For example, it may display information input by a user, present information provided to the user, display the system interface of the terminal device, display the interface of an application running on the terminal device, and so on. The display panel of the screen 130 may be a Liquid Crystal Display (LCD), a Thin Film Transistor (TFT) display, a Light-Emitting Diode (LED) display, an Organic Light-Emitting Diode (OLED) display, or the like, which is not limited herein.
In some embodiments, the touch panel may be overlaid on the display panel of the screen, and when the touch panel detects a touch operation thereon or nearby, the touch panel transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel according to the type of the touch event. Although the screen and the touch panel (not shown) are two parts independent of each other in fig. 1 to implement the input and output functions of the mobile phone, in some embodiments, the touch panel may be integrated with the display panel to implement the input and output functions of the mobile phone. After the touch panel is integrated with the screen, the touch panel can be used as a touch screen, and the touch screen can receive a touch signal on the touch panel and feed the touch signal back to the processor 110. The processor 110 responds according to the touch signal, and displays the response result through a display panel of the screen.
The camera module 140 includes at least 1 camera, which may be a front camera 141 or a rear camera 142.
For example only, the terminal device may have a single camera, dual cameras, triple cameras, or quad cameras. In the case of four cameras, for example, one camera is the front camera 141 and three are rear cameras 142. The three rear cameras 142 may be cameras of different focal lengths, such as a main camera with an equivalent focal length of 35 mm, a wide-angle camera with an equivalent focal length of 20 mm and a telephoto camera with an equivalent focal length of 105 mm. The embodiments of the present application do not limit this.
It should be noted that, when the terminal device includes a plurality of cameras, all the cameras may be arranged in front of the terminal device, or all the cameras may be arranged in back of the terminal device, or a part of the cameras may be arranged in front of the terminal device, and another part of the cameras may be arranged in back of the terminal device, which is not limited in this embodiment of the present application.
The storage module 150 includes an internal storage 151 and an external storage interface 152, and the internal storage 151 may be a flash memory, a hard disk, an operation memory, or the like. For example, the internal memory may include at least one hard disk or flash memory, an operating memory. The external storage interface 152 is used to connect external storage, which may include memory cards, removable hard disks, U-disks, optical disks, and the like.
The storage module 150 may be used to store software programs and modules, and the processor 110 executes various functional applications and data processing of the terminal device by running the software programs and modules stored in the storage module 150. The storage module 150 may mainly include a program storage area and a data storage area. The storage program area is usually located on the internal memory 151, and may store an operating system and application programs (e.g., a sound playing function and a touch response function) required by at least one function. The storage data area may be located on the internal memory 151, or on an external memory connected to the external memory interface 152, or on both the internal memory and the external memory. The storage data area may store data (e.g., audio data, image data, video data) created according to the use of the cellular phone.
The interface 160 includes, but is not limited to, a Subscriber Identity Module (SIM) card interface 161, a USB interface 162, and a headset interface 163. The SIM card interface is used for inserting an SIM card provided by an operator, so that when the terminal device is in communication connection with the base station through the mobile communication module 191, the identity of a user is identified and verified, and after the identity is verified, a call request and a data request are sent to the base station, and a call, data, short messages and the like forwarded by the base station are received.
The USB interface 162 may connect the terminal device with a computer through a USB data line for data exchange. Meanwhile, the USB interface 162 is further connected to the power module 170, and when the USB data cable is connected to a computer or a charging socket, the USB data cable may transmit the input electric energy to the power module 170 to charge the terminal device. The USB interface 162 may be micro-USB, mini-USB, USB Type-c, etc., which is not limited herein.
The headphone interface 163 is used to access headphones. The headphone interface 163 may be a stand-alone interface, for example, the headphone interface 163 may be a 3.5mm headphone jack. Alternatively, the headset interface 163 may also be integrated in the USB interface 162, for example, the headset interface may be integrated in USB Type-c. When a headphone is inserted into the headphone interface 163, the audio module 120 may no longer transmit the output analog audio signal to the speaker 121 or the receiver 122, but transmit the output analog audio signal to the headphone through the headphone interface 163, and play the audio through the headphone. When the earphone is plugged, if it is detected that the earphone does not include a microphone, the audio module still receives the analog audio signal sent by the microphone 123. If the earphone is detected to include a microphone, the audio module receives an analog audio signal sent by the earphone microphone, processes the analog audio signal, and sends the processed analog audio signal to the processor 110.
The terminal device also includes a power module 170 that provides power to the various components. The power module may include a battery, a power management module, and the like. The power management module may be logically connected to the processor 110, so as to manage the charging, discharging, and power consumption management of the battery through the power management module.
The terminal device may also include a sensor module 180. The sensor module includes at least one sensor, such as a pressure sensor 181, a gyro sensor 182, a distance sensor 183, a proximity light sensor 184, a touch sensor 185, or an acceleration sensor 186. The proximity light sensor 184 may include an ambient light sensor and a proximity sensor; the ambient light sensor may adjust the brightness of the display panel of the screen 130 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel and/or the backlight of the screen 130 when the mobile phone is moved to the ear. As one kind of motion sensor, the acceleration sensor 186 can detect the magnitude of acceleration in various directions (generally three axes), detect the magnitude and direction of gravity when the terminal device is stationary, and can be used to identify the posture of the terminal device (e.g., picking up or putting down the terminal device, placing it horizontally or vertically, etc.). It can also be used to identify related actions (such as step counting and tapping) through vibration. The other sensors of the terminal device are not described herein.
The communication module 190 includes a mobile communication module 191 and a wireless communication module 192. The mobile communication module 191 may support any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), Time Division WCDMA (TD-WCDMA), Time Division LTE (TD-LTE), and fifth generation New Radio (5G NR), etc. The wireless communication module 192 may support Bluetooth, Wireless Fidelity (Wi-Fi), Near Field Communication (NFC), and the like.
The mobile communication module 191 may be configured to connect with a communication base station through an antenna to establish a call link between the terminal device and another terminal device, and receive call audio. The wireless communication module 192 is used for wirelessly connecting with an external communication device, wherein the external communication device may be an external device with sound playing and receiving functions, such as a bluetooth headset and a bluetooth sound box. When the external communication device is connected to the terminal device through the wireless communication module 192, the external communication device may play audio, receive analog audio signals collected by the external communication device, process the analog audio signals, and send the processed analog audio signals to the processor 110.
In the operation process of the terminal device, the screen 130 may display a graphic, the touch panel is configured to receive a zoom-in operation (e.g., a pinch-in operation with two fingers away) of a user and send the zoom-in operation to the processor 110, and the processor 110 may implement the graphic control method provided by the present application by operating a software program stored in the storage module 150.
Fig. 2 and fig. 3 respectively show an application scenario of the graphics control method provided by the present application.
Fig. 2 shows a scenario in which the terminal device is a smartphone. In this scenario, a graphic is displayed on the touchscreen of the smartphone. The graphics may be a picture, a web page, an interface to a scalable application (e.g., a map in a mapping application), or other image object of varying size, and the specific type is not limited herein. The graph in fig. 2 is illustrated by taking a picture as an example.
Referring to fig. 2, the smartphone may zoom the picture displayed on the touch screen in response to a two-finger pinch operation by the user. The two-finger pinch operation is a touch gesture in which two fingers contact a touch medium (such as a touch screen) and control is exercised through the change of the distance between the two fingers; for example, a zoom-in operation can be performed by increasing the distance between the two fingers, and a zoom-out operation by decreasing it. In fig. 2, the user's thumb and forefinger are in contact with the touch screen, and the smartphone detects two touch points and the distance between them. When the thumb and the index finger move apart, the distance between them increases, the distance between the two detected touch points increases correspondingly, and the smartphone enlarges the picture according to the increased distance and the preset display control ratio.
The display control ratio refers to the ratio of the graphic magnification to the control parameter of the enlargement operation during a touch operation. For example, when the graphic is enlarged by the two-finger pinch operation in fig. 2, the display control ratio is the ratio of the magnification of the graphic to the increase of the distance between the two touch points.
Fig. 3 shows a scenario in which the terminal device is a laptop. In this scenario, the graphic is displayed on the screen, and the laptop zooms the graphic on the screen in response to a two-finger pinch operation applied to the touch pad. The graphic type and the zooming principle are similar to those in fig. 2 and are not described again here.
In both fig. 2 and fig. 3, the two-finger pinch operation is used as the control means for zooming the graphic, but the control means is not limited to this. For example, when the method is applied to a notebook computer with an external mouse, the zooming of the graphic can be controlled through the mouse wheel. In this case, the magnitude of the zoom may be determined based on the scrolling direction and angle of the mouse wheel: the graphic can be enlarged when the wheel rolls forward and reduced when it rolls backward. The display control ratio then uses the rolling angle of the wheel as the control parameter; for example, when the mouse wheel rolls forward to enlarge the graphic, the display control ratio is the ratio of the magnification of the graphic to the angle through which the wheel rolls forward.
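As a sketch of the mouse-wheel case (the helper name, the wheel angle and the ratio value are assumptions, not from the patent), the rolling angle could be mapped to a magnification change as follows:

```python
# Illustrative sketch: for a mouse wheel, the display control ratio uses the
# rolling angle as the control parameter. All names and numbers are assumptions.

def wheel_zoom(magnification, wheel_angle_degrees, display_control_ratio):
    """Positive angles (forward rolling) enlarge the graphic, negative angles reduce it."""
    return magnification + wheel_angle_degrees * display_control_ratio

m = 1.0
m = wheel_zoom(m, 15.0, 0.05)    # one forward notch of 15 degrees -> 1.75x
m = wheel_zoom(m, -15.0, 0.05)   # one backward notch -> back to 1.0x
print(m)
```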
When the graphic is enlarged, the magnification has an upper limit due to the limits of the screen resolution of the terminal device and of the graphic resolution. After the graphic is enlarged and the fingers leave the screen, the maximum magnification that can be retained is the target magnification. For example, suppose the target magnification is 5. When the graphic is enlarged, its magnification gradually increases from 1. If the magnification of the graphic has increased to 10 when the enlargement operation is terminated, the magnification of the graphic is adjusted to 5 (i.e., the target magnification) after the enlargement operation ends.
In the process of enlarging the graphic in response to the user's operation, if the magnification of the graphic exceeds the target magnification, the user needs to be prompted that the enlargement operation is approaching its limit. In the prior art, the display control ratio can be attenuated through an exponential function, reducing how much the enlargement operation enlarges the graphic, so as to prompt the user that the enlargement limit is being reached. For example, suppose the enlargement operation is a two-finger pinch operation with the fingers moving apart. At the normal display control ratio, the magnification of the graphic increases by 1 for every 0.5 cm the fingers move apart. After the display control ratio is attenuated, the same 0.5 cm of finger movement may only increase the magnification by 0.5, and if the attenuation continues, every 0.5 cm may only increase the magnification by 0.1. In this way, when the user exceeds the limit of the enlargement operation, the user can perceive that the ongoing enlargement operation is beyond the limit. This makes it convenient for the user to perform the enlargement operation as needed and improves the operating experience.
However, the target magnification is related to the screen resolution of the terminal device and to the resolution of the graphic. When different graphics are controlled on the same terminal device, their resolutions differ, so the target magnification of each graphic differs. After the magnification of a graphic exceeds the target magnification, if the display control ratio is attenuated with a fixed exponential decay, the function by which the display control ratio changes with the magnification differs from graphic to graphic. The speed at which the display control ratio of each graphic decays from the target magnification to the maximum magnification (that is, the functional relation between the display control ratio and the magnification) is therefore not uniform, the limit of the enlargement operation is not clear enough, and no clear and accurate prompting effect can be provided.
Therefore, the graphic control method provided by the application dynamically adjusts the display control ratio, so that the attenuation function of the display control ratio matches graphics of different resolutions and the user is given clear and definite feedback and prompting on the progress of the control operation.
Fig. 4 shows a schematic flow chart of the graphic control method provided by the present application, which may be applied, by way of example and not limitation, to the terminal device described above. When the graphic control method provided by the application is applied, a graphic whose size can change is displayed on the screen of the terminal device in response to a series of operations by the user. For example, an album application may be opened in response to a user's tap so as to display a graphical interface including a picture, or a map application may be opened so as to display a graphical interface including a map. The graphic can then be controlled according to the following steps.
S11: in response to the enlargement operation, enlarge the graphic according to the preset first display control ratio.
S12: when the magnification of the graphic is greater than the target magnification, determine a second display control ratio according to the magnification of the graphic and the target magnification.
In some embodiments, the zoom-in operation may be a two-finger pinch operation by the user. When the user places two fingers on the touch medium of the terminal device, the terminal device detects two touch points and the distance d between them. As the user performs the two-finger pinch operation, the distance between the two touch points changes; let D be the distance between the two touch points detected by the terminal device after the change. If the display control ratio is gain(s), the magnification s of the graphic is as follows:

s = (D - d) × gain(s)  (formula one)

When s is greater than 0, the two-finger pinch operation is an enlargement operation; when s is less than 0, it is a reduction operation; and when s equals 0, the distance between the touch points has not changed, and the graphic is neither enlarged nor reduced.
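A short numeric illustration of formula one (the distances and the gain value are illustrative assumptions):

```python
# Formula one: s = (D - d) * gain(s), where d is the distance between the two
# touch points when the fingers first touch and D is the distance after the
# change. The distances (in centimetres) and the gain value are illustrative.

def scale_change(d, D, gain):
    return (D - d) * gain

print(scale_change(2.0, 3.5, 1.0))   # 1.5  -> fingers moved apart: enlargement
print(scale_change(2.0, 1.0, 1.0))   # -1.0 -> fingers moved closer: reduction
print(scale_change(2.0, 2.0, 1.0))   # 0.0  -> no change, neither enlargement nor reduction
```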
Further, gain(s) is a piecewise function, which can be expressed by formula two:

gain(s) = 1, when s ≤ Scale_off;
gain(s) = e^(-λ(s - Scale_off)/(Scale_pinch - Scale_off)), when s > Scale_off.  (formula two)

When s ≤ Scale_off, gain(s) takes the value of the first display control ratio. When s > Scale_off, the second display control ratio is calculated through this attenuation function of the display control ratio; λ(s - Scale_off)/(Scale_pinch - Scale_off) is the first parameter, and λ is the third preset coefficient. Scale_off is the target magnification, and Scale_pinch is the maximum magnification that can be reached when the graphic is enlarged. Generally speaking, Scale_pinch is the magnification at which the display control ratio has decayed to 0.

Referring to formula two, when the magnification of the graphic is less than or equal to the target magnification, the display control ratio is the preset display control ratio, that is, 1. When the magnification of the graphic is greater than the target magnification, the ratio of the difference between the magnification of the graphic and the target magnification (s - Scale_off) to the difference between the maximum magnification and the target magnification (Scale_pinch - Scale_off) is multiplied by the third preset coefficient λ to obtain the first parameter, and the display control ratio of the enlargement operation is the natural constant e raised to the negative of the first parameter, that is, gain(s) = e^(-λ(s - Scale_off)/(Scale_pinch - Scale_off)).
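A minimal sketch of the piecewise function of formula two (the parameter values are illustrative assumptions; the negative sign in the exponent follows the attenuation behaviour described above):

```python
import math

# Piecewise display control ratio of formula two; Scale_off, Scale_pinch and
# lambda below are illustrative values, not taken from the patent.

def gain(s, scale_off, scale_pinch, lam, first_ratio=1.0):
    """Display control ratio as a function of the current magnification s."""
    if s <= scale_off:
        return first_ratio                      # first display control ratio
    # first parameter: lam * (s - Scale_off) / (Scale_pinch - Scale_off)
    return math.exp(-lam * (s - scale_off) / (scale_pinch - scale_off))

scale_off, scale_pinch, lam = 5.0, 15.0, 4.0
print(gain(3.0, scale_off, scale_pinch, lam))    # 1.0: below the target magnification
print(gain(10.0, scale_off, scale_pinch, lam))   # ~0.135: halfway to the maximum
print(gain(15.0, scale_off, scale_pinch, lam))   # ~0.018: close to 0 at the maximum
```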
Further, the target magnification is linearly related to the ratio of the display resolution to the graphic resolution. When calculating Scale_off, the ratio of the display resolution to the graphic resolution may be multiplied by a first preset coefficient, and a second preset coefficient is then added, to obtain the target magnification. That is, it can be calculated by formula three:

Scale_off = k × (Resolution_display / Resolution_graph) + b  (formula three)

where Resolution_display is the resolution of the screen of the terminal device and Resolution_graph is the resolution of the graphic. k is the first preset coefficient and b is the second preset coefficient, both of which are self-defined preset parameters. When the graphic resolution and the screen resolution are equal, the target magnification is Scale_0 = k + b. For example, if k is 5 and b is 3, the target magnification is 8 when the graphic resolution equals the screen resolution.
It should be noted that, when calculating Resolution_display / Resolution_graph, it is necessary to determine whether the graphic and the screen are aligned in height or in width, and then compute the ratio using the number of pixels along the aligned dimension. For example, in the scene shown in fig. 2, the picture is aligned with the screen in width. If the resolution of the picture is 2160 × 3840 and the screen resolution is 1200 × 2640, the picture width is aligned with the screen width when the picture is displayed, so Resolution_display / Resolution_graph = 1200 / 2160, and with k = 5 and b = 3, Scale_off is about 5.78.

For another example, in the scene shown in fig. 3, the picture is aligned with the screen in height. If the picture resolution is 6880 × 4584 and the screen resolution is 2560 × 1440, then Resolution_display / Resolution_graph = 1440 / 4584, and with k = 5 and b = 3, Scale_off is about 4.57.
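The two worked examples above can be reproduced with a short calculation (a sketch; only the pixel counts along the aligned dimension from the examples are used):

```python
# Formula three: Scale_off = k * (Resolution_display / Resolution_graph) + b,
# with the resolutions counted in pixels along the aligned dimension.

def target_magnification(display_pixels, graph_pixels, k=5.0, b=3.0):
    return k * (display_pixels / graph_pixels) + b

# Width-aligned example: screen width 1200 px, picture width 2160 px.
print(round(target_magnification(1200, 2160), 2))   # 5.78
# Height-aligned example: screen height 1440 px, picture height 4584 px.
print(round(target_magnification(1440, 4584), 2))   # 4.57
```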
Further, the maximum magnification is the maximum magnification that can be reached when the graphic is enlarged by the enlargement operation. For example, Scale_pinch may be a constant multiple of Scale_off, that is, Scale_pinch can be expressed by formula four:

Scale_pinch = m × Scale_off  (formula four)
Wherein m is also a self-defined preset parameter, and m is greater than 1, for example, 2, 3, 3.2, etc., subject to the setting in actual application.
Further, the third preset coefficient λ can be expressed by formula five. As noted above, λ is linearly related to the target magnification Scale_off, and λ_0 is a self-defined preset parameter equal to the value of λ when the resolution of the graphic equals the resolution of the screen (that is, when Scale_off = Scale_0). For example, λ_0 may be set to 2, 3, 4.5, etc.
In this embodiment, according to formula two, after the magnification of the graphic exceeds the target magnification, the display control ratio (the second display control ratio) used during the two-finger pinch operation is determined and updated according to the display resolution, the graphic resolution, the target magnification and the maximum magnification. This realizes dynamic adjustment of the display control ratio, unifies its attenuation speed, and provides a uniform operating experience.
S13: continue to enlarge the graphic according to the second display control ratio.
In some embodiments, when the magnification of the graphic is greater than the target magnification, the branch of formula two for s > Scale_off may be used to determine the second display control ratio, and the graphic then continues to be enlarged according to the second display control ratio.
Referring to fig. 5, function 1 and function 2 are attenuation functions of the display control ratio provided in the prior art, and function 3 is gain(s). It can be seen from fig. 5 that, without the graphic control method provided by the present application, the speeds at which the display control ratio decays are obviously different for different graphics, and a uniform operating experience cannot be provided. The attenuation function gain(s) of the display control ratio, in contrast, dynamically adjusts the attenuation according to the particular graphic and the parameters of the terminal device, so the speed of decay of the display control ratio is uniform, the operating experience is uniform, and a clear and accurate prompting effect is provided.
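A short sketch of this comparison (the Scale_off values, m and the single λ shared by both graphics are illustrative assumptions): because the exponent of formula two depends only on how far s has progressed from the target magnification towards the maximum magnification, two graphics with different resolutions decay in the same way over that range.

```python
import math

# Sketch: the decay of gain(s) depends only on the normalised position of s
# between Scale_off and Scale_pinch, so graphics with different resolutions
# share the same decay curve. Values below are illustrative assumptions.

def gain(s, scale_off, scale_pinch, lam=3.0):
    return math.exp(-lam * (s - scale_off) / (scale_pinch - scale_off))

for scale_off in (5.78, 4.57):                   # two graphics of different resolution
    scale_pinch = 3.0 * scale_off                # same m for both (formula four)
    curve = []
    for progress in (0.0, 0.5, 1.0):             # 0%, 50%, 100% of the decay range
        s = scale_off + progress * (scale_pinch - scale_off)
        curve.append(round(gain(s, scale_off, scale_pinch), 3))
    print(curve)                                 # both graphics print [1.0, 0.223, 0.05]
```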
In some embodiments, when the graphic is enlarged in response to the enlargement operation, a prompt may also be given when the magnification of the graphic is equal to the target magnification and/or equal to the maximum magnification, so as to further provide a clear and accurate prompting effect. The prompt may take the form of vibration, a pop-up message, a sound, and so on. For example, the device may vibrate when the magnification of the graphic is equal to the target magnification and/or equal to the maximum magnification, while popping up the prompt "maximum magnification reached".
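A minimal sketch of such a prompt (function names, thresholds and the second message text are assumptions, not taken from the patent):

```python
# Illustrative prompt trigger: vibrate and pop up a message when the
# magnification reaches the target or the maximum. Names and messages are assumptions.

def check_prompt(magnification, target_magnification, maximum_magnification,
                 vibrate, show_message):
    if magnification >= maximum_magnification:
        vibrate()
        show_message("maximum magnification reached")
    elif magnification >= target_magnification:
        vibrate()
        show_message("target magnification exceeded")

# Example: magnification has just reached the target of 10x.
check_prompt(10.0, 10.0, 25.0, vibrate=lambda: print("vibrate"),
             show_message=print)
```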
The flow of the graphic control method provided by the present application when applied will be described with reference to the scenarios shown in fig. 2, fig. 6, and fig. 7.
The terminal device first responds to a tap by the user and opens an application program that displays graphics, such as an album, a map or a browser. In fig. 2, fig. 6 and fig. 7, the album is opened, and the graphic displayed in the figures is a picture pre-stored in the album.
After the album is opened, the picture is displayed at a size that fits the screen, and the magnification s of the graphic is 1. The terminal device waits to receive an operation from the user. If it detects that the user performs a two-finger pinch operation and that the distance between the two touch points increases, it determines that the two-finger pinch operation is an enlargement operation, and then enlarges the graphic in response to the enlargement operation according to the preset display control ratio of 1. Once the graphic is enlarged to the magnification shown in fig. 6, that is, the magnification of the graphic equals the target magnification (for example, 10), the display control ratio is attenuated with the attenuation function gain(s) for s > 10. Referring to fig. 7, after the magnification of the graphic exceeds the target magnification, the enlargement operation continues, the distance between the two touch points keeps increasing, and the display control ratio is continuously adjusted according to gain(s) until the magnification of the graphic reaches Scale_pinch (for example, 25), at which point gain(s) approaches 0, i.e., the picture can no longer be enlarged as the distance between the two touch points increases (the magnification shown in fig. 7). If the user releases the fingers when the magnification of the graphic is less than or equal to 10, the terminal device detects that the two touch points have disappeared, and the magnification of the graphic is kept at the magnification at the moment of release. If the magnification of the graphic is greater than 10 when the user releases, the magnification of the graphic is adjusted back to the target magnification, that is, restored to the magnification shown in fig. 6.
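A sketch of the release behaviour in this example (the function name is an assumption; the 10x target follows the example above): when the fingers leave the screen above the target magnification, the graphic settles back to the target.

```python
# Illustrative release ("leaving the hand") behaviour from the example above:
# 10x target magnification; the function name is an assumption.

def on_release(magnification, target_magnification=10.0):
    """Magnification at which the graphic settles after the fingers lift."""
    return min(magnification, target_magnification)

print(on_release(8.0))    # released at 8x (<= target): stays at 8x
print(on_release(17.0))   # released at 17x (> target): snaps back to the 10x target
```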
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 8 shows a block diagram of the graphic control apparatus provided in an embodiment of the present application, corresponding to the graphic control method described in the above embodiments; for convenience of description, only the portions related to the embodiment of the present application are shown.
Referring to fig. 8, the apparatus includes: a response module 21, configured to respond to the enlargement operation and enlarge the graphic according to a preset first display control ratio; and a determining module 22, configured to determine a second display control ratio according to the magnification of the graphic and a target magnification when the magnification of the graphic is greater than the target magnification, where the target magnification is determined according to the display resolution of the screen and the graphic resolution of the graphic, and the second display control ratio is smaller than the first display control ratio. The response module 21 is further configured to continue enlarging the graphic according to the second display control ratio.
In some embodiments, the target magnification is related to a display resolution of the screen and a graphics resolution of the graphics, including: the target magnification is linearly related to the ratio of the display resolution and the graphics resolution.
In some embodiments, the target magnification is linearly related to a ratio of the display resolution to the graphics resolution, including: and multiplying the ratio of the display resolution to the graph resolution by a first preset coefficient, and adding a second preset coefficient to obtain the target magnification.
In some embodiments, the determining module 22 is specifically configured to calculate the second display control ratio according to a magnification factor of the graphic, a target magnification factor, and a maximum magnification factor, where the maximum magnification factor is a maximum magnification factor that can be achieved when the graphic is magnified by the magnifying operation.
In some embodiments, the determining module 22 is specifically configured to calculate the second display control ratio according to the formula gain(s) = e^(-λ(s - Scale_off)/(Scale_pinch - Scale_off)), where gain(s) is the second display control ratio, s is the magnification of the graphic, Scale_off is the target magnification, Scale_pinch is the maximum magnification, and λ is a third preset coefficient that is linearly related to the target magnification.
In some embodiments, as shown in fig. 9, the apparatus further comprises a prompting module 23 for prompting when the magnification of the graphic is equal to the target magnification, and/or the magnification of the graphic is equal to the maximum magnification.
It should be noted that, because the information interaction and execution processes between the above modules are based on the same concept as the method embodiments of the present application, reference may be made to the method embodiments for their specific functions and technical effects, which are not described again here.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
An embodiment of the present application further provides a terminal device, where the terminal device includes: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, the processor implementing the steps of any of the various method embodiments described above when executing the computer program.
Fig. 10 is a schematic structural diagram of a terminal device. As shown in fig. 10, the terminal device 3 includes: at least one processor 31 (only one processor is shown in fig. 10), a memory 32, and a computer program 33 stored in the memory 32 and executable on the at least one processor 31, the steps in any of the various graphical control method embodiments described above being implemented when the computer program 33 is executed by the processor 31.
The terminal device 3 may be a desktop computer, a notebook computer, a palmtop computer, a smart phone, or the like. The terminal device 3 may include, but is not limited to, the processor 31 and the memory 32. Those skilled in the art will appreciate that fig. 10 is only an example of the terminal device 3 and does not constitute a limitation on the terminal device 3; it may include more or fewer components than those shown, combine some components, or use different components, and may further include, for example, an input/output device and a network access device.
The processor 31 may be a Central Processing Unit (CPU), or may be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 32 may in some embodiments be an internal storage unit of the terminal device 3, such as a hard disk or a memory of the terminal device 3. The memory 32 may also be an external storage device of the terminal device 3 in other embodiments, such as a plug-in hard disk provided on the terminal device 3, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the memory 32 may also include both an internal storage unit of the terminal device 3 and an external storage device. The memory 32 is used for storing an operating system, an application program, a BootLoader (BootLoader), data, and other programs, such as program codes of a computer program. The memory 32 may also be used to temporarily store data that has been output or is to be output.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
The embodiments of the present application further provide a computer program product which, when run on a mobile terminal, causes the mobile terminal to implement the steps in the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or apparatus capable of carrying the computer program code to the terminal device, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random-Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, the computer-readable medium may not be an electrical carrier signal or a telecommunications signal.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above described network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
Finally, it should be noted that: the above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (10)
1. A graphic control method is applied to a terminal device, and a graphic is displayed on a screen of the terminal device, and is characterized by comprising the following steps:
responding to a magnification operation, and magnifying the graph according to a preset first display control ratio, wherein a display control ratio is used for indicating the ratio of the magnification of the graph to a control parameter of the magnification operation;
when the magnification of the graph is larger than a target magnification, determining a second display control ratio according to the magnification of the graph and the target magnification, wherein the target magnification is determined according to the display resolution of the screen and the graph resolution of the graph, and the second display control ratio is smaller than the first display control ratio;
and continuing to amplify the graph according to the second display control ratio.
2. The method of claim 1, wherein determining the target magnification according to a display resolution of the screen and a graphics resolution of the graphics comprises:
and multiplying the ratio of the display resolution to the graph resolution by a first preset coefficient, and adding a second preset coefficient to obtain the target magnification.
3. The method of claim 1 or 2, wherein determining the second display control ratio based on the magnification of the graphic and the target magnification comprises:
and calculating to obtain the second display control ratio according to the magnification factor of the graph, the target magnification factor and the maximum magnification factor, wherein the maximum magnification factor is the maximum magnification factor which can be reached when the graph is magnified by the magnification operation.
4. The method of claim 3, wherein calculating the second display control ratio according to the magnification of the graph, the target magnification and the maximum magnification comprises:
wherein gain(s) is the second display control ratio, s is the magnification of the graph, Scale_off is the target magnification, Scale_pinch is the maximum magnification, and λ is a third preset coefficient, the third preset coefficient being linearly related to the target magnification.
5. The method of claim 4, wherein magnifying the graphic in response to the magnifying operation comprises:
prompting when the magnification of the graph is equal to the target magnification and/or the magnification of the graph is equal to the maximum magnification.
6. A graphic control device is applied to a terminal device, and a graphic is displayed on a screen of the terminal device, and the graphic control device is characterized by comprising:
the response module is used for responding to the amplification operation and amplifying the graph according to a preset first display control ratio, wherein the display control ratio is used for indicating the proportion of the amplification factor of the graph to the control parameter of the amplification operation;
the determining module is used for determining a second display control ratio according to the magnification of the graph and a target magnification when the magnification of the graph is larger than the target magnification, wherein the target magnification is determined according to the display resolution of the screen and the graph resolution of the graph, and the second display control ratio is smaller than the first display control ratio;
and the response module is also used for continuously amplifying the graph according to the second display control ratio.
7. The apparatus according to claim 6, wherein the determining module is specifically configured to calculate the second display control ratio according to the magnification of the graphic, the target magnification and a maximum magnification, wherein the maximum magnification is the maximum magnification that can be achieved when the graphic is magnified by the magnifying operation.
8. The apparatus according to claim 6, wherein the determining module is specifically configured to calculate the second display control ratio according to a formula, wherein gain(s) is the second display control ratio, s is the magnification of the graph, Scale_off is the target magnification, Scale_pinch is the maximum magnification, and λ is a third preset coefficient, the third preset coefficient being linearly related to the target magnification.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 5 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 5.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010600422.8A CN113849104A (en) | 2020-06-28 | 2020-06-28 | Graphic control method, apparatus, terminal device and readable storage medium |
PCT/CN2021/082270 WO2022001217A1 (en) | 2020-06-28 | 2021-03-23 | Graph control method and apparatus, terminal device, and readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010600422.8A CN113849104A (en) | 2020-06-28 | 2020-06-28 | Graphic control method, apparatus, terminal device and readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113849104A true CN113849104A (en) | 2021-12-28 |
Family
ID=78972735
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010600422.8A Pending CN113849104A (en) | 2020-06-28 | 2020-06-28 | Graphic control method, apparatus, terminal device and readable storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113849104A (en) |
WO (1) | WO2022001217A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140028729A1 (en) * | 2012-07-30 | 2014-01-30 | Sap Ag | Scalable zoom calendars |
WO2018058986A1 (en) * | 2016-09-29 | 2018-04-05 | 中兴通讯股份有限公司 | Method and device for changing screen display scale, and mobile terminal |
CN110830713A (en) * | 2019-10-30 | 2020-02-21 | 维沃移动通信有限公司 | Zooming method and electronic equipment |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3748482A1 (en) * | 2012-09-29 | 2020-12-09 | Huawei Device Co., Ltd. | Electronic device and method for controlling zooming of displayed object |
JP2015032261A (en) * | 2013-08-06 | 2015-02-16 | シャープ株式会社 | Display device and control method |
CN105389072A (en) * | 2014-09-09 | 2016-03-09 | 富泰华工业(深圳)有限公司 | Automatic zoom-in/zoom-out system and method for user interface |
CN110286840B (en) * | 2019-06-25 | 2022-11-11 | 广州视源电子科技股份有限公司 | Gesture zooming control method and device of touch equipment and related equipment |
- 2020-06-28: CN application CN202010600422.8A filed (CN113849104A, status: active Pending)
- 2021-03-23: PCT application PCT/CN2021/082270 filed (WO2022001217A1, Application Filing)
Also Published As
Publication number | Publication date |
---|---|
WO2022001217A1 (en) | 2022-01-06 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||