CN105975166B - Application control method and device

Publication number: CN105975166B
Authority: CN (China)
Prior art keywords: control, menu, application, display, control menu
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Application number: CN201610285182.0A
Other languages: Chinese (zh)
Other versions: CN105975166A (en)
Inventor: 吴健成
Current Assignee: Guangzhou Huaduo Network Technology Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Original Assignee: Guangzhou Huaduo Network Technology Co Ltd
Priority date (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Guangzhou Huaduo Network Technology Co Ltd
Priority to CN201610285182.0A
Publication of CN105975166A
Application granted
Publication of CN105975166B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction techniques based on GUIs involving interaction with lists of selectable items, e.g. menus
    • G06F3/0487 Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The present application relates to an application control method and device. The method comprises: detecting a menu display touch event occurring in any area of an application interface; displaying a preset control menu in a floating manner in the application interface according to the touch position of the menu display touch event, where the control menu provides one or more control options for correspondingly controlling the application; and, when a control option is detected to be triggered, executing the control operation corresponding to that option on the application. According to the embodiments of the present application, the control menu is displayed floating in the application interface at the touch position of the user-triggered menu display touch event, so the menu appears where the user's finger already is; this makes it convenient for the user to operate the control menu, to trigger the control options in it, and thus to control the application.

Description

Application control method and device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an application control method and apparatus.
Background
With the development of terminal technology, a terminal can implement more and more functions through applications (APPs). In the conventional technology, control options such as buttons are provided at fixed positions in the application interface, and the user triggers them to correspondingly control the application. However, this application control method is inconvenient for flexible one-handed operation.
Disclosure of Invention
To overcome the problems in the related art, the present application provides an application control method and device.
According to a first aspect of embodiments of the present application, there is provided an application control method, the method including:
detecting a menu display touch event occurring in any area on an application interface;
displaying a preset control menu in a floating manner in the application interface according to the touch position of the menu display touch event; one or more control options for correspondingly controlling the application are provided in the control menu;
and when the control option is detected to be triggered, executing control operation corresponding to the control option on the application.
According to a second aspect of embodiments of the present application, there is provided an application control apparatus including:
the detection module is used for detecting a menu display touch event occurring in any area on the application interface;
the display module is used for displaying a preset control menu in a floating manner in the application interface according to the touch position of the menu display touch event; one or more control options for correspondingly controlling the application are provided in the control menu;
and the control module is used for executing control operation corresponding to the control option on the application when the control option is detected to be triggered.
The technical scheme provided by the embodiment of the application can have the following beneficial effects:
according to the method and the device, the control option is not required to be set at a fixed position in the application interface, the touch position of the touch event is displayed according to the menu triggered by the user, and the control menu is displayed in a suspended mode in the application interface, so that the control menu can be conveniently operated by the user, the user can conveniently trigger the control option in the control menu, and the user can conveniently control the application.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic diagram of an application interface of a terminal screen in the conventional art.
Fig. 2A is a flow chart illustrating an application control method according to an exemplary embodiment of the present application.
FIG. 2B is a schematic diagram of three control menus shown in the present application according to an exemplary embodiment.
FIG. 2C is a schematic diagram of an application interface shown in the present application according to an exemplary embodiment.
FIG. 2D is a schematic diagram of an application interface shown in the present application according to an exemplary embodiment.
FIG. 2E is a schematic diagram of an application interface shown in the present application according to an exemplary embodiment.
FIG. 2F is a schematic diagram of an application interface shown in the present application according to an exemplary embodiment.
FIG. 2G is a schematic diagram illustrating application control according to an exemplary embodiment of the present application.
Fig. 3 is a block diagram of an application control device shown in the present application according to an exemplary embodiment.
FIG. 4 is a block diagram illustrating an apparatus for application control according to an exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "upon", "when", or "in response to a determination", depending on the context.
With the continuous development of terminal technology, terminal screens have grown larger and larger; for example, mainstream terminals now adopt 5-inch, 5.5-inch, or even 6-inch screens. While a larger screen brings a better visual experience, the user has to hold and operate the terminal with two hands, or hold it with one hand and operate it with the other, in order to complete application control. A user sometimes wants to, or has to, hold and operate the terminal with a single hand, but a finger of limited length can reach only a limited operation area and cannot conveniently control the display content across the whole application interface.
Fig. 1 is a schematic diagram of an application interface on a terminal screen in the conventional technology; the terminal is illustrated as a smartphone and the application as a music playing application. The application interface in fig. 1 provides three fixed-position control options: "previous", "play", and "next". When a user holds the terminal with one hand, for example holding it with the right hand and operating with the thumb, a one-handed operation region 102 as shown in fig. 1 is formed on the terminal screen. This region covers only some of the control options; options located to the left and above cannot be reached or operated, so the user cannot, for example, directly tap the "previous" control option with the right thumb. This is inconvenient for flexible one-handed application control.
The scheme provided by the embodiment of the application can be applied to intelligent terminals such as smart phones, tablet computers, music players, e-book readers or personal digital assistants, and the like, is different from the conventional technical means of setting control options at fixed positions of application interfaces, and can display the control menu in a floating manner according to the touch position of a user in the application interfaces, so that the display position of the control menu can be close to the touch position of fingers of the user, the user can conveniently trigger the control options in the control menu, and the user can conveniently control the application. The present application will be described in detail below.
As shown in fig. 2A, fig. 2A is a flowchart of an application control method shown in the present application according to an exemplary embodiment, which can be applied in a terminal, and includes the following steps 201 to 203:
in step 201, a menu display touch event occurring in any area on an application interface is detected.
In step 202, a preset control menu is displayed in a floating manner in the application interface according to the touch position of the menu display touch event.
Wherein, one or more control options for controlling the application correspondingly are provided in the control menu.
In step 203, when it is detected that the control option is triggered, a control operation corresponding to the control option is executed on the application.
The terminal can be an intelligent terminal such as a smart phone, a tablet computer, a music player, an e-book reader or a personal digital assistant.
In this embodiment, the menu display touch event may be understood as a touch event predefined within the terminal for detecting that the user wants to invoke the application control scheme of the present application. Those skilled in the art will understand that any touch event that can be distinguished from other user operations may serve as the menu display touch event in the present application.
As an exemplary embodiment, the menu display touch event may include: a click event, a double-click event, or a long-press event. That is, when such an event is detected in any area of the application interface displayed on the terminal screen, the technical solution of the present application is applied; otherwise, the event may be ignored or handled as in the related art. "Any area" of the application interface may mean the entire application interface or a partial area of it, for example all areas except the status bar or the title bar. Using these three events as menu display touch events makes it convenient for the user to instruct the terminal to display the control menu in the application interface.
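The patent does not prescribe how a terminal tells these three event types apart; in practice the distinction is usually made from touch timing. The sketch below illustrates one such classification; the function name and both millisecond thresholds are assumptions for illustration, not values from the patent.

```python
# Illustrative sketch: classifying a press/release pair as a click,
# double-click, or long-press. Both thresholds are assumed values.
DOUBLE_CLICK_INTERVAL_MS = 300   # max gap between two taps of a double click
LONG_PRESS_THRESHOLD_MS = 500    # min hold time for a long press

def classify_touch(press_ms, release_ms, prev_release_ms=None):
    """Classify one touch, given the previous release time (if any)."""
    held = release_ms - press_ms
    if held >= LONG_PRESS_THRESHOLD_MS:
        return "long_press"
    if prev_release_ms is not None and press_ms - prev_release_ms <= DOUBLE_CLICK_INTERVAL_MS:
        return "double_click"
    return "click"
```

For example, a touch held for 600 ms classifies as a long press, while a quick tap 200 ms after a previous release classifies as a double click.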
In the embodiment of the present application, the position at which the control menu is displayed, when the user wishes to perform application control, is determined from the touch details of the menu display touch event. For example, when the menu display touch event is a click, double-click, or long-press event occurring in any region of the application interface, a display center may be determined from the event's touch position and the control menu displayed floating around that center. It can be understood that when the user's finger triggers the menu display touch event at different positions in the application interface, the control menu is displayed at correspondingly different positions.
Fig. 2B is a schematic diagram of three control menus. In a specific implementation, a person skilled in the art can flexibly choose the display form of the control menu according to the terminal's operating system. Taking the Android operating system as an example, the control menu may be built as a View; different borders, separator lines, background fills, or contents can be designed, and control menus with different visual effects can be achieved by varying the fill modes, paint styles, and so on.
In an alternative implementation, to display the control menu more quickly, it may be loaded into memory when the application interface is initialized, with its display state preset to invisible. Since the application interface is laid out and drawn by the corresponding APP and then placed in the terminal's memory, from which the touch-sensing driver reads it for display on the terminal screen, the control menu can be loaded at the same time the application interface is initialized, with its display state preset to invisible so that it stays hidden in the application interface. When the menu display touch event indicates that the control menu should be shown, the already-loaded menu can be read directly from memory and displayed on the terminal screen.
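The preload-then-toggle idea can be sketched as follows. The class and method names here are assumptions for illustration; the point is that showing the menu only flips a visibility flag on an object built during interface initialization, rather than constructing the menu on demand.

```python
# Illustrative sketch of preloading the control menu at interface
# initialization with its display state preset to invisible.
class ControlMenu:
    def __init__(self, options):
        self.options = options        # e.g. ["previous", "pause", "next"]
        self.visible = False          # preset to the invisible state
        self.position = None

class AppInterface:
    def __init__(self):
        # Menu is built (loaded into memory) during initialization,
        # so showing it later requires no construction work.
        self.menu = ControlMenu(["previous", "pause", "next"])

    def on_menu_display_touch_event(self, x, y):
        self.menu.position = (x, y)
        self.menu.visible = True      # just flip visibility; already loaded

    def on_hide_trigger_event(self):
        self.menu.visible = False     # hide rather than destroy
```

This mirrors the pattern of keeping an Android View in the layout with its visibility set to invisible and toggling it on demand.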
One or more control options can be provided in the control menu, and in practical application, the control operation corresponding to the control option can be determined according to the functions of different applications and the control operation required by the application. In the embodiment of the application, the display position of the control menu is determined based on the touch position, so that the display position of the control menu is closer to the finger operation range of a user, the user can trigger the control option more conveniently, and the control operation on the application is executed. When the user triggers the control option, the terminal may detect that the control option is triggered, so as to perform a control operation corresponding to the control option on the application.
When the control menu is displayed, it can be shown near the touch position so that it falls within the operating range of the user's finger; for example, the touch position may be used as the center of the control menu. In this display mode, when the user's touch position is close to the edge of the terminal screen, a suitable display center must be derived from the touch position. In an alternative implementation: when the touch position falls within a preset screen-boundary range, the control menu is displayed around a display center derived from the touch position, so that the menu is displayed completely within the terminal screen; when the touch position is not within the preset screen-boundary range, the control menu is displayed with the touch position itself as the display center.
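One way to derive such a display center, sketched below under assumptions not stated in the patent, is to treat "within the screen-boundary range" as the touch lying closer to an edge than the menu's radius, and to clamp the center inward just far enough that the menu fits on screen; otherwise the touch position passes through unchanged.

```python
# Illustrative sketch of choosing the control menu's display center.
# A touch closer to an edge than menu_radius is "within the boundary
# range" and gets clamped inward; any other touch is used as-is.
def display_center(touch, screen_w, screen_h, menu_radius):
    x, y = touch
    cx = min(max(x, menu_radius), screen_w - menu_radius)
    cy = min(max(y, menu_radius), screen_h - menu_radius)
    return (cx, cy)
```

For example, on an assumed 1080x1920 screen with a radius-150 menu, a touch at (20, 500) near the left edge yields a center of (150, 500), while a touch at (540, 960) in the middle is used directly.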
The following description takes a music playing application as an example to illustrate the application control solution of the present application; however, those skilled in the art will understand that the solution applies equally to any application page of any other APP, such as a video playing application, a live-streaming application, a game application, or an instant messaging application, and can likewise achieve convenient application control.
In this embodiment of the application, the menu display touch event may be predefined as a long-press event, and when the user presses the application interface shown in fig. 2C for a long time, the terminal detects the menu display touch event, so that the control menu may be displayed based on the touch position of the user. Wherein, the control menu comprises three control options of 'previous', 'pause' and 'next'. The control menu in fig. 2C is shown in a ring-shaped pattern.
It can be understood that different touch positions may cause different display positions of the control menu, for example, in the application interface shown in fig. 2D, if the user holds the terminal with the left hand, and the position of the long-pressing application interface is different from that shown in fig. 2C, the position of the control menu displayed based on the touch position is also different from that shown in fig. 2C.
Compared with the control options at each fixed position in the prior art, the control menu is flexibly displayed according to the touch position, so that the control menu is displayed at the position which is convenient for the user to control, and the user can conveniently hold and control with one hand.
The control menus shown in fig. 2C and 2D are both displayed with the touch position of the menu display touch event as the display center. If the user's touch position is as shown in fig. 2E, i.e., close to the screen boundary and within the screen-boundary range marked in fig. 2E, then, to allow the control menu to be displayed completely inside the application interface, the display center is derived from the touch position: for example, the touch position shifted by a certain offset may be taken as the display center, and the control menu is then displayed around it.
Because one or more control options are provided on the control menu, the user can move the finger to trigger the control options, the terminal can determine that the control options are triggered according to the moving direction of the finger or the position triggered by the finger, and further can execute control operation corresponding to the control options on the application.
Still taking the control menu shown in fig. 2C as an example, in fig. 2F, when the user wishes to select the "next" control option, the finger can be moved onto and pressed against that option. The terminal then determines the selected option by comparing the pressing position of the finger with the position of each control option in the menu. Alternatively, while the finger is moving, the terminal can infer the option the user intends to select from the distance between the finger's current touch position and the center of the control menu, or from the angle formed between the line from the menu center to the finger and the screen baseline. Once the selected option is determined, it can be highlighted.
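The angle-based selection can be sketched as follows. The ring layout ("previous" left, "pause" top, "next" right), the sector boundaries, and the dead-zone radius are all assumptions for illustration; the patent only says the angle to the screen baseline and the distance from the menu center may be used.

```python
import math

# Illustrative sketch of selecting a control option from the angle
# between the menu center and the finger, relative to the screen
# baseline (0 degrees = rightward along the baseline).
def selected_option(finger, center, min_distance=40.0):
    """Return the option under the finger, or None if none is selected."""
    dx = finger[0] - center[0]
    dy = center[1] - finger[1]           # screen y grows downward; flip it
    if math.hypot(dx, dy) < min_distance:
        return None                      # still inside the central dead zone
    angle = math.degrees(math.atan2(dy, dx)) % 360
    if 60 <= angle < 120:
        return "pause"                   # top sector
    if 120 <= angle < 240:
        return "previous"                # left sector
    if angle < 60 or angle >= 300:
        return "next"                    # right sector
    return None                          # bottom gap: no option there
```

A finger directly right of the menu center selects "next", directly above selects "pause", and directly left selects "previous".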
It will be appreciated that after selecting a control option, the user may still move the finger onto another desired option. Once the user has settled on the control operation to execute, lifting the finger causes the terminal to execute, on the application, the operation corresponding to the option pressed just before the lift, and to hide the control menu at the same time. The above shows one way of detecting that a control option is triggered; in practice, those skilled in the art can flexibly design various other triggering manners for the control menu, which this embodiment does not limit.
In addition, in the application control scheme of each of the above embodiments, after the application interface displays the control menu, the method may further include: when a hidden trigger event for the control menu is detected, setting the display state of the control menu to be an invisible state so as to hide the control menu in the application interface. In the embodiment of the application, after the user performs control on the application, the control menu displayed in the application interface can be hidden, so that the application interface is simpler.
In an exemplary embodiment, the hidden trigger event may be the detection of the user's finger being lifted, complementing the click, double-click, or long-press event used as the menu display touch event. For example, as in the foregoing embodiment, the user long-presses the application interface to display the control menu, moves the finger to trigger a control option, and, once the desired control operation has been executed, lifts the finger; the control menu is then hidden in the application interface. This makes the application control mode more intelligent and convenient for the user.
Fig. 2G is a schematic diagram illustrating a complete application control process according to an exemplary embodiment of the present application. When a user presses the application interface for a long time, the terminal detects a menu display touch event, and therefore a preset control menu is displayed in the application interface according to the touch position of the menu display touch event. The terminal judges that the finger of the user is positioned on the next control option according to the moving direction of the finger of the user, highlights the next control option and executes the control operation of playing the next song corresponding to the control option to the application. And after moving to the 'next' control option, the finger of the user is lifted, which indicates that the application control operation is finished, so that the terminal can hide the control menu in the application interface.
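The complete interaction of fig. 2G amounts to a small state machine over the touch-event stream: long-press shows the menu, movement highlights an option, and finger-up executes the highlighted option and hides the menu. The sketch below illustrates this lifecycle; the class name, callbacks, and the pre-resolved `option` argument are assumptions for illustration, not the patent's implementation.

```python
# Illustrative sketch: the long-press / move / lift lifecycle of fig. 2G.
class MenuController:
    def __init__(self, execute):
        self.execute = execute            # callback performing the control operation
        self.visible = False
        self.highlighted = None

    def on_long_press(self, x, y):        # menu display touch event detected
        self.center = (x, y)
        self.visible = True               # show the control menu at the touch position

    def on_move(self, option):            # option under the finger, or None
        if self.visible:
            self.highlighted = option     # highlight the selected option

    def on_finger_up(self):               # hidden trigger event
        if self.visible and self.highlighted is not None:
            self.execute(self.highlighted)  # e.g. play the next song
        self.visible = False              # hide the menu in the interface
        self.highlighted = None
```

Driving it with long-press, a move onto "next", and a lift results in exactly one "next" operation being executed and the menu being hidden again.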
Corresponding to the embodiment of the application control method, the application also provides an embodiment of the application control device and a terminal applied by the application control device.
As shown in fig. 3, fig. 3 is a block diagram of an application control apparatus according to an exemplary embodiment shown in the present application, the apparatus including: a detection module 31, a display module 32 and a control module 33.
The detection module 31 is configured to detect a menu display touch event occurring in any area on the application interface.
And the display module 32 is configured to display a preset control menu in a floating manner in the application interface according to the touch position of the menu display touch event. Wherein, one or more control options for controlling the application correspondingly are provided in the control menu.
And the control module 33 is configured to, when detecting that the control option is triggered, execute a control operation corresponding to the control option on the application.
In one optional implementation, the menu display touch event includes: a click event, a double click event, or a long press event.
In an optional implementation manner, the display module may be specifically configured to:
and when the trigger position is within a preset screen boundary range, displaying the control menu according to the display center determined by the trigger position so as to completely display the control menu in a terminal screen.
And when the trigger position is not in the range of the preset screen boundary, displaying the control menu by taking the trigger position as a display center.
In an optional implementation manner, the control menu is loaded in the memory when the application interface is initialized, and the display state of the control menu is preset to be in an invisible state.
In an optional implementation manner, the apparatus further includes a hiding module operable to:
when a hidden trigger event for the control menu is detected, setting the display state of the control menu to be an invisible state so as to hide the control menu in the application interface.
Correspondingly, the application also provides a terminal, which comprises a processor; a memory for storing processor-executable instructions; wherein the processor is configured to:
a menu display touch event occurring in an arbitrary area on the application interface is detected.
Displaying a preset control menu in a floating manner in the application interface according to the touch position of the menu display touch event; wherein, one or more control options for controlling the application correspondingly are provided in the control menu.
And when the control option is detected to be triggered, executing control operation corresponding to the control option on the application.
The implementation process of the functions and actions of each unit in the above device is specifically described in the implementation process of the corresponding step in the above method, and is not described herein again.
As shown in fig. 4, fig. 4 is a schematic structural diagram of an application control device 800 according to an exemplary embodiment of the present application. For example, the apparatus 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.
Referring to fig. 4, the apparatus 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 806 provide power to the various components of device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front-facing camera and/or a rear-facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each of the front and rear cameras may use a fixed optical lens system or have focus and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the device 800. For example, the sensor assembly 814 may detect the open/closed status of the device 800 and the relative positioning of components, such as the display and keypad of the device 800. The sensor assembly 814 may also detect a change in the position of the device 800 or of a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change in the temperature of the device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, a microwave sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communications between the apparatus 800 and other devices in a wired or wireless manner. The device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Wherein the instructions in the storage medium, when executed by the processor, enable the apparatus 800 to perform an application control method comprising:
detecting a menu display touch event on an application interface;
displaying a preset control menu in the application interface according to the touch position of the menu display touch event, wherein the control menu provides one or more control options for correspondingly controlling the application; and
when a control option is detected to be triggered, executing the control operation corresponding to that control option on the application.
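The three steps above can be sketched as a minimal, framework-agnostic model (Python is used for illustration only; the `ControlMenu` class and its method names are hypothetical and do not appear in the patent):

```python
# Minimal model of the claimed control flow: detect a menu-display
# touch event, show a preset control menu at the touch position, and
# execute the operation bound to whichever control option is triggered.

class ControlMenu:
    def __init__(self, options):
        # options maps an option label to the control operation to run
        self.options = options
        self.visible = False
        self.center = None

    def show_at(self, x, y):
        # step 2: display the preset menu centered on the touch position
        self.center = (x, y)
        self.visible = True

    def trigger(self, label):
        # step 3: execute the control operation bound to the triggered
        # option; ignore triggers while the menu is hidden or unknown labels
        if not self.visible or label not in self.options:
            return None
        return self.options[label]()

menu = ControlMenu({"mute": lambda: "muted", "exit": lambda: "exited"})
menu.show_at(120, 480)          # menu display touch event at (120, 480)
result = menu.trigger("mute")   # user triggers the "mute" control option
```

In this sketch `result` is `"muted"`: the operation bound to the triggered option runs only after the menu has been displayed at the touch position.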
For the device embodiments, since they substantially correspond to the method embodiments, the relevant points can be found in the description of the method embodiments. The apparatus embodiments described above are merely illustrative: the units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed across a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present application. One of ordinary skill in the art can understand and implement this without inventive effort.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.

Claims (10)

1. An application control method, characterized in that the method comprises:
detecting a menu display touch event occurring in any area on an application interface;
displaying a preset control menu in a floating manner in the application interface according to the touch position of the menu display touch event; wherein the control menu provides one or more control options for correspondingly controlling the application, and the control menu is displayed with the touch position of the menu display touch event as its display center;
when the user moves a finger on the control menu, determining the selected control option according to the angle formed between a screen baseline and the line from the center position of the control menu to the current touch position of the finger; and
when a control option is detected to be triggered, executing the control operation corresponding to the control option on the application.
2. The method of claim 1, wherein the menu display touch event comprises: a click event, a double click event, or a long press event.
3. The method according to claim 1, characterized in that when the trigger position is within a preset screen boundary range, the control menu is displayed at a display center determined from the trigger position so that the control menu is completely displayed within the terminal screen;
and when the trigger position is not within the preset screen boundary range, the control menu is displayed with the trigger position as the display center.
4. The method according to claim 1, wherein the control menu is loaded into memory when the application interface is initialized, and the display state of the control menu is preset to invisible.
5. The method of claim 1, further comprising:
when a hide trigger event for the control menu is detected, setting the display state of the control menu to invisible so as to hide the control menu in the application interface.
6. An application control apparatus, characterized in that the apparatus comprises:
the detection module is used for detecting a menu display touch event occurring in any area on the application interface;
the display module is used for displaying a preset control menu in a floating manner in the application interface according to the touch position of the menu display touch event; wherein the control menu provides one or more control options for correspondingly controlling the application, and the control menu is displayed with the touch position of the menu display touch event as its display center;
wherein, when the user moves a finger on the control menu, the selected control option is determined according to the angle formed between a screen baseline and the line from the center position of the control menu to the current touch position of the finger; and
and the control module is used for executing control operation corresponding to the control option on the application when the control option is detected to be triggered.
7. The apparatus of claim 6, wherein the menu display touch event comprises: a click event, a double click event, or a long press event.
8. The apparatus of claim 6, wherein the display module is specifically configured to:
when the trigger position is within a preset screen boundary range, display the control menu at a display center determined from the trigger position so that the control menu is completely displayed within the terminal screen;
and when the trigger position is not within the preset screen boundary range, display the control menu with the trigger position as the display center.
9. The apparatus of claim 6, wherein the control menu is loaded into memory when the application interface is initialized, and the display state of the control menu is preset to invisible.
10. The apparatus of claim 6, further comprising a hiding module to:
when a hide trigger event for the control menu is detected, set the display state of the control menu to invisible so as to hide the control menu in the application interface.
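Claims 1 and 3 (mirrored in apparatus claims 6 and 8) state two geometric rules: a control option is selected by the angle between the screen baseline and the line from the menu center to the finger, and the display center is shifted inward near the screen edge so the menu is completely displayed. The following is a rough sketch of both rules, not the patent's implementation; it assumes a horizontal screen baseline, equally sized angular sectors, and a circular menu of radius `menu_radius` (all assumptions, and both function names are hypothetical):

```python
import math

def select_option(center, touch, num_options):
    # Angle between the screen baseline (taken here as the horizontal
    # axis) and the line from the menu center to the current touch
    # position, normalized to [0, 360) degrees.
    dx = touch[0] - center[0]
    dy = touch[1] - center[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360
    # Map the angle onto one of num_options equal angular sectors and
    # return the index of the selected control option.
    sector = 360 / num_options
    return int(angle // sector)

def display_center(trigger, menu_radius, screen_w, screen_h):
    # If the trigger position falls within the boundary band (closer to
    # an edge than the menu radius), shift the display center inward so
    # the menu fits entirely on screen; otherwise the trigger position
    # itself is the display center.
    x = min(max(trigger[0], menu_radius), screen_w - menu_radius)
    y = min(max(trigger[1], menu_radius), screen_h - menu_radius)
    return (x, y)
```

With four options, a touch straight "right" of the center selects option 0 and a touch straight "down" (screen y grows downward) selects option 1; a trigger at (5, 5) on a 1080x1920 screen with a radius-50 menu is re-centered to (50, 50) so the menu stays fully visible.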
CN201610285182.0A 2016-04-29 2016-04-29 Application control method and device Active CN105975166B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610285182.0A CN105975166B (en) 2016-04-29 2016-04-29 Application control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610285182.0A CN105975166B (en) 2016-04-29 2016-04-29 Application control method and device

Publications (2)

Publication Number Publication Date
CN105975166A CN105975166A (en) 2016-09-28
CN105975166B true CN105975166B (en) 2020-05-12

Family

ID=56994092

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610285182.0A Active CN105975166B (en) 2016-04-29 2016-04-29 Application control method and device

Country Status (1)

Country Link
CN (1) CN105975166B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106708374A (en) * 2016-12-08 2017-05-24 北京小米移动软件有限公司 Menu display method and device
CN107484024B (en) * 2017-08-17 2019-12-24 四川长虹电器股份有限公司 Method for realizing global floating menu by using hot key based on WEBOS intelligent television system
CN107704152A (en) * 2017-09-30 2018-02-16 努比亚技术有限公司 A kind of camera applications display methods, equipment and computer-readable recording medium
CN108600544B (en) * 2018-04-27 2021-01-08 维沃移动通信有限公司 Single-hand control method and terminal
CN109101160B (en) * 2018-08-20 2021-07-09 深圳市创凯智能股份有限公司 Terminal control method, terminal, and computer-readable storage medium
CN109992194A (en) * 2019-04-09 2019-07-09 广州视源电子科技股份有限公司 Intelligent interaction plate and its control method and device
CN110471298B (en) * 2019-08-12 2020-10-23 珠海格力电器股份有限公司 Intelligent household appliance control method, equipment and computer readable medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855056A (en) * 2012-07-09 2013-01-02 宇龙计算机通信科技(深圳)有限公司 Terminal and terminal control method
CN103186345A (en) * 2013-02-25 2013-07-03 北京极兴莱博信息科技有限公司 Text segment selecting method and field selecting method, device and terminal
CN103914441A (en) * 2014-03-13 2014-07-09 何峰 Method for editing files on touch screens through gestures
CN104731481A (en) * 2015-03-31 2015-06-24 北京奇艺世纪科技有限公司 Button display method and device

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI431521B (en) * 2011-02-16 2014-03-21 Acer Inc Touch method
US20130219340A1 (en) * 2012-02-21 2013-08-22 Sap Ag Navigation on a Portable Electronic Device
CN102681847B (en) * 2012-04-28 2015-06-03 华为终端有限公司 Touch screen terminal object processing method and touch screen terminal
CN102841746B (en) * 2012-07-11 2015-08-05 广东欧珀移动通信有限公司 A kind of mobile phone webpage interaction method
CN106445322A (en) * 2012-11-26 2017-02-22 中兴通讯股份有限公司 Text processing method and terminal
CN103092508A (en) * 2012-12-07 2013-05-08 北京傲游天下科技有限公司 Touch interface implementation method and device
US20140195943A1 (en) * 2013-01-04 2014-07-10 Patent Category Corp. User interface controls for portable devices
US10599250B2 (en) * 2013-05-06 2020-03-24 Qeexo, Co. Using finger touch types to interact with electronic devices
CN104571829B (en) * 2013-10-15 2018-06-01 联想(北京)有限公司 The display control method and terminal of a kind of terminal
CN103927080A (en) * 2014-03-27 2014-07-16 小米科技有限责任公司 Method and device for controlling control operation


Also Published As

Publication number Publication date
CN105975166A (en) 2016-09-28

Similar Documents

Publication Publication Date Title
CN105975166B (en) Application control method and device
EP3099040B1 (en) Button operation processing method in single-hand mode, apparatus and electronic device
JP6199510B2 (en) Method and apparatus for switching display modes
US20170344192A1 (en) Method and device for playing live videos
CN105893136B (en) Multitask management method and device
US20170300210A1 (en) Method and device for launching a function of an application and computer-readable medium
EP3098701B1 (en) Method and apparatus for managing terminal application
CN107102772B (en) Touch control method and device
EP3109772A1 (en) Text input method and device
US20170153754A1 (en) Method and device for operating object
JP6426755B2 (en) Operation processing method, device, program, and recording medium
US20190235745A1 (en) Method and device for displaying descriptive information
CN104484111A (en) Content display method and device for touch screen
EP3133482A1 (en) Method and device for displaying a target object
KR101763270B1 (en) Method, apparatus, program and computer-readable recording medium for determining character
EP3239827B1 (en) Method and apparatus for adjusting playing progress of media file
US10705729B2 (en) Touch control method and apparatus for function key, and storage medium
CN106354383B (en) The method and device on hide tools column
CN106708374A (en) Menu display method and device
CN107908351B (en) Application interface display method and device and storage medium
CN106843691B (en) Operation control method and device of mobile terminal
EP3828682A1 (en) Method, apparatus for adding shortcut plug-in, and intelligent device
CN112860140A (en) Application program control method, application program control device and storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20160928

Assignee: GUANGZHOU CUBESILI INFORMATION TECHNOLOGY Co.,Ltd.

Assignor: GUANGZHOU HUADUO NETWORK TECHNOLOGY Co.,Ltd.

Contract record no.: X2021980000151

Denomination of invention: Application control method and device

Granted publication date: 20200512

License type: Common License

Record date: 20210107