WO2016173414A1 - Mobile terminal and quick start method and device for application program thereof - Google Patents


Info

Publication number
WO2016173414A1
Authority
WO
WIPO (PCT)
Prior art keywords
application
quick
contact
gesture
touch
Prior art date
Application number
PCT/CN2016/079481
Other languages
French (fr)
Chinese (zh)
Inventor
吴玲玲
Original Assignee
努比亚技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to CN201510210105.4A priority Critical patent/CN104850342A/en
Priority to CN201510210105.4 priority
Application filed by 努比亚技术有限公司 filed Critical 努比亚技术有限公司
Publication of WO2016173414A1 publication Critical patent/WO2016173414A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

A mobile terminal and a quick start method and device for an application program thereof, which belong to the technical field of mobile terminals. The quick start method for an application program is applied to a mobile terminal. A correlation table between different regions of a side edge and different locations where an application icon is placed is stored in the mobile terminal. The method comprises: receiving a touch control gesture whose touch control point is located in a side edge region (S100); when the received touch control gesture is a preset quick start gesture, querying, in the correlation table, the location where an application icon is placed corresponding to the side edge region where the touch control gesture occurs (S200); and starting the program of the application icon placed at the acquired location (S300). The method enables a user to arrange applications to be quickly started at the icon locations defined in the correlation table, allowing the user to quickly start an application program at the appropriate time and improving the user experience.

Description

Quick start method and device for mobile terminal and application thereof

Technical field

This document relates to the field of startup technologies for mobile terminals, and in particular, to a mobile terminal and a method and apparatus for quickly starting an application thereof.

Background art

In the related art, a program is started by the user first finding the location of the program to be started and then clicking it. However, to meet life and work needs, users download and install many applications on their mobile phones. When looking for the application to be started among the many installed applications, the user often has to search up and down or even page back and forth repeatedly, which is not what the user wants.

With the development of technology, mobile phones with no border or a narrow border are increasingly favored by users because of their large display area and attractive appearance. However, there is currently no technical solution that uses the edge area to help users quickly launch applications.

Summary of the invention

The embodiments of the present invention provide a mobile terminal and a method and device for quickly starting an application thereof, so as to quickly start an application and overcome the above-mentioned defect of the related art, in which the user must search back and forth for the application to be started.

The embodiment of the present invention provides a method for quickly starting an application, where the method is applied to a mobile terminal, where the mobile terminal stores a correspondence table between different areas of the side edges and different positions where the application icons are placed, and the method includes:

Receiving a touch gesture in which the touch point is located in the side edge region;

When the received touch gesture is a preset quick start gesture, querying, from the correspondence table, the location where an application icon is placed corresponding to the side edge region where the touch gesture occurs;

Starting the program of the application icon placed at the acquired location.
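The three steps above can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the edge-region identifiers, the table contents, and the `launch` callback are all hypothetical names introduced for the example.

```python
# Hypothetical correspondence table: edge region id -> icon position (row, col)
CORRESPONDENCE_TABLE = {
    "left_top": (0, 0),
    "left_bottom": (3, 0),
    "right_top": (0, 3),
    "right_bottom": (3, 3),
}

# Hypothetical desktop layout: icon position -> application name
DESKTOP = {
    (0, 0): "camera",
    (3, 3): "messages",
}

def handle_touch_gesture(edge_region, is_quick_start_gesture, launch):
    """Return the launched app name, or None if nothing was launched."""
    if not is_quick_start_gesture:
        return None
    position = CORRESPONDENCE_TABLE.get(edge_region)   # step S200: query table
    if position is None:
        return None
    app = DESKTOP.get(position)                        # icon placed at position
    if app is not None:
        launch(app)                                    # step S300: start program
    return app
```

For example, `handle_touch_gesture("left_top", True, start_app)` would look up position (0, 0) and launch the application whose icon is placed there.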

Optionally, after the step of receiving a touch gesture in which the touch point is located in the side edge region, the method further includes:

Identify the interface displayed by the current desktop;

When the identified interface is not the interface of the launched application, it is determined whether the received touch gesture is a preset quick start gesture.

Optionally, before the step of starting the program of the application icon placed at the location acquired, the method further includes:

When two or more icons are actually placed at the position where the application icon is placed in the correspondence table, the position of the application icon corresponding to the side edge region where the touch gesture occurs is queried;

Prompt for the application icon placed in the queried location.

Optionally, before the step of starting the program of the application icon placed at the location acquired, the method further includes:

When two or more icons are actually placed at the position where the application icon is placed in the correspondence table, the application icon placed at the queried location is prompted.

Optionally, the prompting of the application icon placed at the queried location includes:

When the location of the application icon corresponding to the side edge region where the touch gesture occurs is queried, the preset effect processing is performed on the placed application icon.

Optionally, before the step of, when the received touch gesture is a preset quick start gesture, querying, from the correspondence table, the location where an application icon is placed corresponding to the side edge region where the touch gesture occurs, the method further includes:

Obtaining a coordinate of a first touch point where the touch gesture occurs;

The area where the acquired coordinates are located is determined as the side edge area where the touch gesture occurs.
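The two steps above (obtaining the first contact's coordinates and mapping them to a side edge region) can be sketched as follows. The screen size, edge-strip width, and top/bottom partition of each edge are assumed values chosen for illustration, not figures from the patent.

```python
# Assumed screen geometry (illustrative values only)
SCREEN_WIDTH = 1080          # assumed pixel width
SCREEN_HEIGHT = 1920         # assumed pixel height
EDGE_WIDTH = 40              # assumed width of the touch-sensitive side strip

def edge_region_of(x, y):
    """Return an edge-region id for the first contact at (x, y),
    or None if the contact is not in a side edge region."""
    if x < EDGE_WIDTH:
        side = "left"
    elif x >= SCREEN_WIDTH - EDGE_WIDTH:
        side = "right"
    else:
        return None
    # Split each side edge into top and bottom halves (assumed partition).
    half = "top" if y < SCREEN_HEIGHT / 2 else "bottom"
    return f"{side}_{half}"
```

A contact in the middle of the screen returns None, so only gestures that begin on a side edge proceed to the correspondence-table query.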

Optionally, the quick start gesture is to slide or click the side edge region within a preset direction range.

Optionally, the quick start gesture is sliding within a preset direction range;

Determining that the received touch gesture is a preset quick start gesture includes:

Calculating a distance between the first contact and the second contact of the touch gesture, and determining that the calculated distance is greater than a preset threshold.

Optionally, the calculating the distance between the first contact and the second contact of the touch gesture includes:

According to the formula

distance = √((currentX − downX)² + (currentY − downY)²),

calculating the distance between the first contact and the second contact;

Wherein, downX is the abscissa of the first contact, downY is the ordinate of the first contact, currentX is the abscissa of the second contact, and currentY is the ordinate of the second contact.

Optionally, the calculating the distance between the first contact and the second contact of the touch gesture includes:

Calculating a distance between the first contact and the second contact according to a formula |currentY−downY|;

Wherein, downY is the ordinate of the first contact, and currentY is the ordinate of the second contact.
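The two distance calculations above can be written directly in code, using the patent's variable names (downX, downY, currentX, currentY). The threshold value of 50 pixels in the sketch is an assumed figure; the patent only requires that the distance exceed a preset threshold.

```python
import math

def euclidean_distance(downX, downY, currentX, currentY):
    """Straight-line distance between the first and second contacts,
    per the first formula."""
    return math.sqrt((currentX - downX) ** 2 + (currentY - downY) ** 2)

def vertical_distance(downY, currentY):
    """Vertical-only distance |currentY - downY|, per the second formula
    (sufficient when only movement along the edge matters)."""
    return abs(currentY - downY)

def is_quick_start_slide(downY, currentY, threshold=50):
    """The gesture counts as a preset quick-start slide when the computed
    distance exceeds a preset threshold (50 px is an assumed value)."""
    return vertical_distance(downY, currentY) > threshold
```

Using the vertical-only variant avoids a square root and ignores horizontal jitter of the finger against the edge, which is a plausible reason the patent offers it as an alternative.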

The embodiment of the present invention further provides a quick start device for an application, where the device stores a correspondence table between different regions of the side edges and different positions where application icons are placed. The device includes a gesture receiving module, a location query module, and a program launching module, wherein

The gesture receiving module is configured to: receive a touch gesture in which the touch point is located in a side edge region;

The location query module is configured to: when the received touch gesture is a preset quick start gesture, query, from the correspondence table, the location where an application icon is placed corresponding to the side edge region where the touch gesture occurs;

The program launching module is configured to: launch the program of the application icon placed at the acquired location.

Optionally, the device further includes an interface recognition unit and an interface determination unit, where

The interface recognition unit is configured to: identify an interface displayed by the current desktop;

The interface determining unit is configured to determine whether the received touch gesture is a preset quick start gesture when the identified interface is not the interface of the launched application.

Optionally, the device further includes a number determining unit, a query unit, and a prompting unit, where

The number determining unit is configured to: determine whether the icon actually placed in the position where the application icon is placed in the correspondence table is two or more;

The query unit is configured to: if the number determining unit determines that two or more icons are actually placed at the position where the application icon is placed in the correspondence table, query the location of the application icon corresponding to the side edge region where the touch gesture occurs;

The prompting unit is configured to: prompt an application icon placed at the queried location.

Optionally, the device further includes a number determining unit and a prompting unit, where

The number determining unit is configured to: determine whether the icon actually placed in the position where the application icon is placed in the correspondence table is two or more;

The prompting unit is configured to: if the number determining unit determines that the icon actually placed in the position where the application icon is placed in the correspondence table is two or more, prompting the application icon placed at the queried location.

Optionally, the prompting unit is further configured to:

When the location of the application icon corresponding to the side edge region where the touch gesture occurs is queried, the preset effect processing is performed on the placed application icon.

Optionally, the gesture receiving module includes a coordinate acquiring unit and an area determining unit, where

The coordinate acquiring unit is configured to: acquire coordinates of the first touch point where the touch gesture occurs, and send the acquired coordinates to the area determining unit, where the output end of the coordinate acquiring unit is connected to the input end of the area determining unit;

The area determining unit is configured to determine the area where the acquired coordinates are located as the side edge area where the touch gesture occurs.

Optionally, the quick start gesture is to slide or click the side edge region within a preset direction range.

Optionally, the quick start gesture is sliding within a preset direction range;

The location query module is configured to perform the following manner to determine that the received touch gesture is a preset quick start gesture:

Calculating a distance between the first contact and the second contact of the touch gesture, and determining that the calculated distance is greater than a preset threshold.

Optionally, the location query module is configured to implement the calculating a distance between the first contact and the second contact of the touch gesture by:

According to the formula

distance = √((currentX − downX)² + (currentY − downY)²),

calculating the distance between the first contact and the second contact; or

Calculating a distance between the first contact and the second contact according to a formula |currentY−downY|;

Wherein, downX is the abscissa of the first contact, downY is the ordinate of the first contact, currentX is the abscissa of the second contact, and currentY is the ordinate of the second contact.

The embodiment of the invention further provides a mobile terminal, comprising the quick start device of the application program according to any one of the preceding claims.

The embodiments of the invention provide a mobile terminal and a quick start method and device for an application program thereof, which can quickly trigger an application by using the edge region, with a novel definition of the trigger gesture. The user can directly arrange the applications to be quickly launched at the icon locations that the correspondence table associates with the side edge regions, and the corresponding application is triggered by identifying the application icon placed at the corresponding location. This allows the user to quickly launch an application at the appropriate time, for example from the desktop or a black screen.

Brief description of the drawings

FIG. 1 is a schematic diagram of an optional hardware structure of a mobile terminal implementing various embodiments of the present invention;

FIG. 2 is a schematic diagram of a wireless communication system of the mobile terminal shown in FIG. 1;

FIG. 3 is a flow chart of a method for quickly starting an application according to an embodiment of the present invention;

FIG. 4 is a schematic view of an interface of different regions of a side edge according to an embodiment of the present invention;

FIG. 5 is a flowchart of a method for quickly starting an application according to another embodiment of the present invention;

FIG. 6 is a flow chart of a method for quickly starting an application according to still another embodiment of the present invention;

FIG. 7 is a block diagram showing an exemplary structure of a quick start device of an application according to an embodiment of the present invention;

FIG. 8 is a block diagram showing an exemplary structure of a quick start device of an application according to another embodiment of the present invention;

FIG. 9 is a block diagram showing an exemplary structure of a mobile terminal according to an embodiment of the present invention;

FIG. 10 is a schematic structural diagram of a mobile terminal according to still another embodiment of the present invention.

Preferred embodiment of the invention

The following is an overview of the topics detailed in this document. This Summary is not intended to limit the scope of the claims.

The principles and features of the present invention are described in the following with reference to the accompanying drawings.

A mobile terminal embodying various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component" or "unit" used for indicating an element are merely for facilitating the description of the present invention and have no specific meaning in themselves. Therefore, "module" and "component" can be used interchangeably.

FIG. 1 is a schematic diagram of an optional hardware structure of a mobile terminal implementing various embodiments of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an A/V (Audio/Video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. Figure 1 illustrates a mobile terminal having various components, but it should be understood that not all of the illustrated components are required to be implemented; more or fewer components may be implemented instead. The elements of the mobile terminal will be described in detail below.

Wireless communication unit 110 typically includes one or more components that permit radio communication between mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information, or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like, and may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast associated information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. The broadcast signal may exist in various forms, for example, in the form of a Digital Multimedia Broadcasting (DMB) Electronic Program Guide (EPG), a Digital Video Broadcasting-Handheld (DVB-H) Electronic Service Guide (ESG), and the like. The broadcast receiving module 111 can receive broadcast signals by using various types of broadcast systems. In particular, the broadcast receiving module 111 can receive digital broadcasts by using digital broadcast systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), the Media Forward Link Only (MediaFLO) data broadcasting system, Integrated Services Digital Broadcasting-Terrestrial (ISDB-T), and the like. The broadcast receiving module 111 can be constructed to be suitable for various broadcast systems that provide broadcast signals as well as the above-described digital broadcast systems.
The broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).

The mobile communication module 112 transmits the radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received in accordance with text and/or multimedia messages.

The wireless internet module 113 supports wireless internet access of the mobile terminal. The module can be internally or externally coupled to the terminal. The wireless internet access technologies involved in the module may include WLAN (Wireless LAN, Wi-Fi), Wibro (Wireless Broadband), Wimax (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and so on.

The short range communication module 114 is a module for supporting short range communication. Some examples of short-range communication technologies include BluetoothTM, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wide Band (UWB), ZigbeeTM, and the like.

The location information module 115 is a module for checking or acquiring location information of the mobile terminal. A typical example of a location information module is GPS (Global Positioning System). According to the current technology, the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information to accurately calculate three-dimensional current position information based on longitude, latitude, and altitude. Currently, the method for calculating position and time information uses three satellites and corrects the calculated position and time information errors by using another satellite. Further, the GPS module 115 is capable of calculating speed information by continuously calculating current position information in real time.

The A/V input unit 120 is for receiving an audio or video signal. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode. The processed image frames can be displayed on the display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the configuration of the mobile terminal. The microphone 122 can receive sound (audio data) in an operation mode such as a telephone call mode, a recording mode, or a voice recognition mode, and can process such sound into audio data. In the case of the telephone call mode, the processed audio (voice) data can be converted into a format transmittable to a mobile communication base station via the mobile communication module 112 for output. The microphone 122 can implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.

The user input unit 130 may generate key input data according to a command input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information, and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. due to contact), a scroll wheel, a rocker, and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen can be formed.

The sensing unit 140 detects the current state of the mobile terminal 100 (e.g., the open or closed state of the mobile terminal 100), the location of the mobile terminal 100, the presence or absence of user contact (i.e., touch input) with the mobile terminal 100, the orientation of the mobile terminal 100, the acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates a command or signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide type mobile phone, the sensing unit 140 can sense whether the slide type phone is opened or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 141, which will be described below in connection with the touch screen.

The interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification module may store various information for authenticating a user of the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, a device having an identification module (hereinafter referred to as an "identification device") may take the form of a smart card; therefore, the identification device can be connected to the mobile terminal 100 via a port or other connection device. The interface unit 170 can be configured to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more components within the mobile terminal 100, or can be used to transfer data between the mobile terminal and an external device.

In addition, when the mobile terminal 100 is connected to an external base, the interface unit 170 may serve as a path through which power is supplied from the base to the mobile terminal 100, or as a path through which various command signals input from the base are transmitted to the mobile terminal 100. Various command signals or power input from the base can serve as signals for recognizing whether the mobile terminal is accurately mounted on the base. The output unit 150 is configured to provide an output signal (e.g., an audio signal, a video signal, an alarm signal, a vibration signal, etc.) in a visual, audio, and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.

The display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 can display a user interface (UI) or a graphical user interface (GUI) related to a call or other communication (eg, text messaging, multimedia file download, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or image and related functions, and the like.

Meanwhile, when the display unit 151 and the touch panel are superposed on each other in the form of a layer to form a touch screen, the display unit 151 can function as an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as a transparent display, and a typical transparent display may be, for example, a TOLED (Transparent Organic Light Emitting Diode) display or the like. According to a particular desired embodiment, the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown) . The touch screen can be used to detect touch input pressure as well as touch input position and touch input area.

The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 when the mobile terminal is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, and the like. The audio signal is output as sound. Moreover, the audio output module 152 can provide audio output (eg, call signal reception sound, message reception sound, etc.) associated with a particular function performed by the mobile terminal 100. The audio output module 152 can include a speaker, a buzzer, and the like.

The alarm unit 153 can provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alert unit 153 can provide an output in a different manner to notify of the occurrence of an event. For example, the alarm unit 153 can provide an output in the form of vibrations, and when a call, message, or some other incoming communication is received, the alarm unit 153 can provide a tactile output (ie, vibration) to notify the user of it. By providing such a tactile output, the user is able to recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide an output of the notification event occurrence via the display unit 151 or the audio output module 152.

The memory 160 may store a software program or the like for processing and control operations performed by the controller 180, or may temporarily store data (for example, a phone book, a message, a still image, a video, etc.) that has been output or is to be output. Moreover, the memory 160 can store data regarding vibrations and audio signals of various manners that are output when a touch is applied to the touch screen.

The memory 160 may include at least one type of storage medium, including a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or XD memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read only memory (ROM), an electrically erasable programmable read only memory (EEPROM), a programmable read only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. Moreover, the mobile terminal 100 can cooperate with a network storage device that performs the storage function of the memory 160 through a network connection.

The controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing associated with voice calls, data communications, video calls, and the like. Additionally, the controller 180 can include a multimedia module 1810 for reproducing (or playing back) multimedia data, which can be constructed within the controller 180 or can be configured to be separate from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.

The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required for operating the respective elements and components.

The various embodiments described herein can be implemented in a computer readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented by using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and electronic units designed to perform the functions described herein; in some cases, such an embodiment may be implemented in the controller 180. For a software implementation, implementations such as procedures or functions may be implemented with separate software modules that perform at least one function or operation. The software code can be implemented by a software application (or program) written in any suitable programming language, and can be stored in the memory 160 and executed by the controller 180.

So far, the mobile terminal has been described in terms of its function. Hereinafter, for the sake of brevity, a slide type mobile terminal among various types of mobile terminals, such as folding type, bar type, swing type, and slide type mobile terminals, will be described as an example. However, the present invention can be applied to any type of mobile terminal and is not limited to the slide type mobile terminal.

The mobile terminal 100 as shown in FIG. 1 may be configured to operate using a communication system such as a wired and wireless communication system and a satellite-based communication system that transmits data via frames or packets.

A communication system in which a mobile terminal according to the present invention can be operated will now be described with reference to FIG.

Such communication systems may use different air interfaces and/or physical layers. For example, air interfaces used by communication systems include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), Global System for Mobile Communications (GSM), and the like. As a non-limiting example, the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.

Referring to FIG. 2, a CDMA wireless communication system can include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, a base station controller (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to interface with a public switched telephone network (PSTN) 290. The MSC 280 is also configured to interface with the BSC 275, which can be coupled to the base stations 270 via backhaul lines. The backhaul lines can be constructed in accordance with any of a number of well known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be understood that the system as shown in FIG. 2 can include multiple BSCs 275.

Each BS 270 can serve one or more partitions (or regions), each of which is covered by a multi-directional antenna or an antenna directed to a particular direction radially away from the BS 270. Alternatively, each partition may be covered by two or more antennas for diversity reception. Each BS 270 can be configured to support multiple frequency allocations, and each frequency allocation has a particular frequency spectrum (eg, 1.25 MHz, 5 MHz, etc.).

The intersection of partitioning and frequency allocation can be referred to as a CDMA channel. BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or other equivalent terminology. In such a case, the term "base station" can be used to generally refer to a single BSC 275 and at least one BS 270. A base station can also be referred to as a "cell station." Alternatively, each partition of a particular BS 270 may be referred to as a plurality of cellular stations.

As shown in FIG. 2, a broadcast transmitter (BT) 295 transmits a broadcast signal to the mobile terminal 100 operating within the system. A broadcast receiving module 111 as shown in FIG. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295. In Figure 2, several Global Positioning System (GPS) satellites 300 are shown. The satellite 300 helps locate at least one of the plurality of mobile terminals 100.

In Figure 2, a plurality of satellites 300 are depicted, but it is understood that useful positioning information can be obtained using any number of satellites. The GPS module 115 as shown in Figure 1 is typically configured to cooperate with the satellite 300 to obtain desired positioning information. Instead of GPS tracking technology or in addition to GPS tracking technology, other techniques that can track the location of the mobile terminal can be used. Additionally, at least one GPS satellite 300 can selectively or additionally process satellite DMB transmissions.

As a typical operation of a wireless communication system, BS 270 receives reverse link signals from various mobile terminals 100. Mobile terminal 100 typically participates in calls, messaging, and other types of communications. Each reverse link signal received by a particular base station 270 is processed within a particular BS 270. The obtained data is forwarded to the relevant BSC 275. The BSC provides call resource allocation and coordinated mobility management functions including a soft handoff procedure between the BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290. Similarly, PSTN 290 interfaces with MSC 280, which forms an interface with BSC 275, and BSC 275 controls BS 270 accordingly to transmit forward link signals to mobile terminal 100.

Based on the above-described mobile terminal hardware structure and communication system, various embodiments of the method of the present invention are proposed.

Embodiment 1

FIG. 3 is a flowchart of a method for quickly starting an application according to an embodiment of the present invention. The method, which is applied to a mobile terminal, is described below with reference to FIG. 3. The mobile terminal stores a correspondence table between different areas of its side edges and different positions where application icons are placed. As shown in FIG. 3, the method includes the following steps:

S100: Receive a touch gesture whose touch point is located in a side edge region;

S200: When the received touch gesture is a preset quick-start gesture, query, in the correspondence table, the position of the application icon corresponding to the side edge region where the touch gesture occurs;

S300: Start the program of the application icon placed at the queried position.

The correspondence stored in the table between a side edge area and a position where an application icon is placed may be one-to-one or one-to-many. For example, one side edge area may correspond to only one position where an application icon is placed (i.e., the shaded area in FIG. 4), or may correspond to a plurality of positions where application icons are placed. When one side edge area corresponds to a plurality of positions where application icons are placed, a quick start launches, at the same time, the multiple applications whose icons are at those positions.
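A minimal sketch of the correspondence table just described, assuming an in-memory mapping; all region and position names here are hypothetical, not values from the embodiment:

```python
# Each side edge region maps to one or more icon positions; a quick-start
# gesture on a one-to-many region therefore launches several apps at once.
EDGE_TO_ICON_POSITIONS = {
    "left_edge_block_1": ["row2_col1"],                # one-to-one
    "right_edge_block_1": ["row1_col1", "row1_col2"],  # one-to-many
}

# Which application icon currently sits at each desktop position.
ICON_AT_POSITION = {
    "row2_col1": "camera",
    "row1_col1": "gallery",
    "row1_col2": "music",
}

def apps_to_launch(edge_region):
    """Return every application whose icon sits at a position mapped
    to the given side edge region (empty list if the region is unmapped)."""
    positions = EDGE_TO_ICON_POSITIONS.get(edge_region, [])
    return [ICON_AT_POSITION[p] for p in positions if p in ICON_AT_POSITION]
```

With this table, a gesture on `right_edge_block_1` would launch both the gallery and the music application simultaneously, matching the one-to-many behavior described above.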

When the received touch gesture is a preset quick-start gesture, querying, in the correspondence table, the position of the application icon corresponding to the side edge region where the touch gesture occurs further includes:

Obtaining the coordinates of the first touch point where the touch gesture occurs;

The area where the acquired coordinates are located is determined as the side edge area where the touch gesture occurs.

According to an example of the embodiment, the manner of determining whether the touch point of the touch gesture is in the side edge region is as follows.

Since a touch gesture is usually a click, a slide, or the like, each touch gesture is composed of one or more touch points. The terminal can therefore determine whether the touch gesture occurs in a normal partition or in a side edge area (that is, a special partition) by determining the area into which the touch points of the gesture fall. In a specific implementation, the driver layer of the terminal acquires the coordinates of a touch point of the touch gesture and determines which partition the coordinates fall into. When the coordinates fall into a special partition, it is determined that the touch gesture occurs in the special partition, and the touch gesture is reported through the input device corresponding to the special partition. When the coordinates fall into the normal partition, it is determined that the touch gesture occurs in the normal partition, and regular processing is performed.

After the framework layer receives the reported event (the reported event includes the input device and the parameters of each touch point, etc.), it first identifies which region the event came from according to the name of the input device. When the driver layer (kernel) in the above step recognizes a touch in the special partition, the input device through which the driver layer reports to the framework layer is input1 rather than input0. That is, the framework layer does not need to determine which partition the current touch point is in, nor the size and position of the partitions; those judgments are completed at the driver layer. Besides identifying the specific input device, the driver layer also reports the parameters of the touch point to the framework layer, such as the pressing time, position coordinates, pressure, and so on.

When the touch point of the touch gesture falls into the normal partition, the driving layer of the mobile terminal reports the touch point through the input device corresponding to the normal partition.

After the framework layer receives the reported event (the reported event includes the input device and the parameters of the touch point, etc.), it first identifies which region the event came from according to the name of the input device. When the driver layer (kernel) in the above step recognizes a touch in the normal partition, the input device through which the driver layer reports to the framework layer is input0 rather than input1. That is, the framework layer again does not need to determine which partition the current touch point is in, nor the size and position of the partitions; those judgments are completed at the driver layer, which reports the specific input device together with the parameters of the touch point, such as the pressing time, position coordinates, pressure, and so on.
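The dispatch rule described in the last two paragraphs can be sketched as follows; the event structure is a hypothetical stand-in for whatever the driver layer actually reports:

```python
# The framework layer routes purely on the reporting device's name:
# "input0" = normal partition, "input1" = special (edge) partition.
# It never re-tests coordinates against partition boundaries.
def handle_reported_event(event):
    """event: dict carrying the input device name plus the touch-point
    parameters the driver layer reports (press time, coordinates,
    pressure, ...). Returns which processing path the event takes."""
    if event["device"] == "input1":
        return "special"  # edge-region path: special effect / quick start
    return "normal"       # ordinary path: conventional touch processing
```

This mirrors the design choice stated above: partition geometry lives entirely in the driver layer, and the framework layer only switches on the device name.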

Processing the touch gesture conventionally means processing it according to the conventional flow in the related art. For example, after receiving a touch point reported by the input device corresponding to the normal partition, the framework layer of the terminal continues to report the touch point according to the normal flow so as to execute the corresponding operation instruction.

Therefore, in this embodiment, a touch gesture reported from the special partition is not processed in the normal way; instead, a special effect is generated according to the touch gesture, which also prevents misoperation.

The size of the preset side edge area may be a fixed value. When the side edge area includes multiple blocks, a corresponding input device may be provided for each side edge area for adjusting the position and size of that area. When the size of the side edge area is variable, it can be implemented through an interface with parameters reserved by the developer. As shown in FIG. 4, the developer reserves, through the driver layer, an interface Set_zone(id, X0, Y0, X1, Y1) in the upper layer for the user to set. In FIG. 4, when id is 1, it indicates the left edge area, whose corresponding preset coordinates are (X0, Y0, X1, Y1); when id is 2, it indicates the right edge area, whose corresponding preset coordinates are (X0', Y0', X1', Y1'); both can be preset through the above interface function.
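An illustrative re-implementation of the reserved Set_zone(id, X0, Y0, X1, Y1) interface, assuming (as in FIG. 4) that id 1 is the left edge region and id 2 the right; the coordinate values are assumptions:

```python
# Each call overwrites the rectangle stored for that region id, which is
# how a variable-size edge area could be adjusted at runtime.
zones = {}

def set_zone(zone_id, x0, y0, x1, y1):
    """Record (or replace) the rectangle for one side edge region."""
    zones[zone_id] = (x0, y0, x1, y1)

set_zone(1, 0, 0, 40, 1920)       # left edge region  (X0, Y0, X1, Y1)
set_zone(2, 1040, 0, 1080, 1920)  # right edge region (X0', Y0', X1', Y1')
```

Calling `set_zone` again with the same id would resize or move that edge region without touching the other one.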

The normal partition in FIG. 4 is abbreviated as Area A, and the side edge area as Area C. Touch events are reported as follows:

The driver layer receives a touch event through physical hardware such as the touch screen, determines whether the touch operation occurs in Area A or Area C, and reports the event through the device file node of Area A or Area C. The Native layer reads events from the device files of Areas A and C and processes them, for example performing coordinate calculations; the devices of Areas A and C are distinguished by device ID, and finally the Area A and Area C events are distributed. An Area A event takes the original flow and is processed in the usual way, that is, through the multi-channel mechanism. An Area C event is distributed from a pre-registered dedicated Area C channel to the Native layer, input through the Native port and output through the system port to the system service at the Area C event end; the Area C event is monitored through a listener and then reported to each application through the external interface for Area C events. The embodiment of the present invention can realize free customization of the anti-misoperation area, that is, the edge area, through the driver layer code of the mobile terminal. Because the technical solution of the present invention is implemented in the driver layer rather than in firmware, the software design of the device is freed from the constraints of touch-screen IC suppliers, making it more flexible and less costly.

Optionally, the number of special partitions is two, located on both sides of the touch area, and the remaining area of the touch area is the normal partition. Alternatively, the normal partition includes an A area and a bottom B area, wherein the A area is an operable area for detecting touch point coordinates, and the B area is a virtual key area for detecting a menu key, a Home key, a return key, and the like; the two special partitions are respectively located at the edges of the touch area, on both sides of the A area. In addition, a special partition can be set in any other area prone to misoperation, or only one special partition can be set, or multiple edge areas can be set.

After the partition setting is completed, two input devices (input) such as input device 0 (input0) and input device 1 (input1) can be registered by the input_register_device() instruction when the touch screen driver is initialized. An input device is allocated for each partition by the input_allocate_device() instruction, for example, the normal partition corresponds to the input device 0, and the edge region corresponds to the input device 1.

In a specific implementation of the partitioning, the categories and implementations of the normal partition and the edge regions can be defined in an object-oriented manner. After the edge regions are determined, the touch point coordinates of different resolutions are converted into LCD coordinates by the EventHub function, and single-channel functions (such as serverchannel and clientchannel) are defined. The function of such a channel is, after a reported event is received, to pass the event through the channel to the event manager (TouchEventManager) and monitor it through a listener. The event is then transmitted through multiple channels to multiple responding application modules, simultaneously or one by one, or passed to only one of the application modules (application modules such as the camera, the gallery, etc.), and the different application modules generate corresponding operations. Certainly, the foregoing steps may also be implemented in other manners, which is not limited by the embodiment of the present invention.

After a touch operation reported from the input device corresponding to the normal partition is received, the system receives the reported event (which includes the input device and the parameters of the touch point, etc.) and then identifies which region it came from according to the name of the input device.

FIG. 4 is a schematic diagram of an interface of different areas of a side edge according to an embodiment of the present invention. The correspondence between the blocks of the side edge area and the positions where application icons are placed, and the identification steps, are described below with reference to FIG. 4:

In the preset correspondence table, the position or area of the application icon corresponding to the block {(Xa, Ya), (Xb, Yb)} in the side edge region is the shaded area with reverse stripes in the second row and first column, and the position stored in the table is the area coordinate of the application icon in the second row and first column. When a touch gesture whose touch point is located in the side edge region is received, the coordinates of the first touch point of the gesture are determined; the block of the side edge region into which those coordinates fall determines the block in which the touch gesture occurs; the position of the placed application icon is then queried from the correspondence table for a quick launch.

Wherein, the above quick-start gesture may be, for example, sliding, or sliding in a preset direction range, or clicking a block in a side edge region.

According to an example of the embodiment, a method for determining whether the detected gesture is an up-down slide is as follows: the coordinates of the first contact are (downX, downY) (the first contact also serves as the criterion for determining the region where the touch gesture occurs), and the first contact is pressed at time downTime. The touch screen then reports the coordinates of the contact's current position (the second contact), (currentX, currentY), at regular intervals (for example, every 1/85 second), after which the distance between the first contact and the second contact is calculated.

The distance can be calculated in either of two ways:

Method 1:

Distance = √((currentX − downX)² + (currentY − downY)²).

Method 2: Distance =|currentY–downY|.

After the distance is calculated, it is further determined whether the distance is greater than a preset threshold; if so, it is determined that the contact has slid. If necessary, the ordinates of the first contact and the second contact can further be compared to determine the direction of the slide.
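The slide test above can be sketched as follows; the threshold value is an assumption, and the two distance functions follow Method 1 and Method 2 respectively:

```python
import math

THRESHOLD = 50  # pixels; illustrative, not a value from the embodiment

def distance_method1(downX, downY, currentX, currentY):
    """Method 1: Euclidean distance between first and second contact."""
    return math.hypot(currentX - downX, currentY - downY)

def distance_method2(downY, currentY):
    """Method 2: absolute difference of the ordinates only."""
    return abs(currentY - downY)

def is_slide(downX, downY, currentX, currentY):
    """The contact is judged to have slid once the distance exceeds
    the preset threshold."""
    return distance_method1(downX, downY, currentX, currentY) > THRESHOLD

def slide_direction(downY, currentY):
    """Comparing the ordinates gives the slide direction (screen
    coordinates: smaller y is higher on the screen)."""
    return "up" if currentY < downY else "down"
```

For example, a contact that moves from (100, 100) to (100, 200) clears a 50-pixel threshold and is classified as a downward slide, while a 5–10 pixel jitter is not.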

In this embodiment, applications are quickly triggered by using the edge region. Since the definition of the trigger gesture is novel, the user can directly arrange, in the correspondence table, the positions where application icons are placed; the side edge region triggers the corresponding application by identifying the application icon placed at the corresponding position, so that the user can quickly launch an application at an appropriate time, for example from the desktop or a black screen.

Embodiment 2

FIG. 5 is a flowchart of a method for quickly starting an application according to another embodiment of the present invention. As shown in FIG. 5, in the method provided by this embodiment, after step S100 and before steps S200 and S300, the method further includes:

S110. Identify the interface currently displayed on the desktop.

S120. When the identified interface is not the interface of a launched application, determine whether the received touch gesture is a preset quick-start gesture.

Cases in which the currently displayed interface is not the interface of a launched application include, for example, showing the desktop or a black screen.

According to a usage scenario of the embodiment, suppose the current interface displays a music playing interface, whose functions include basic operations such as adjusting the volume up and down and sharing. If the side edge region is further defined to adjust the volume, sliding up and down in the side edge area of the terminal adjusts the playback volume. Calls to different functions corresponding to up-down slide gestures on different operation interfaces may also be added to the correspondence table.

In this embodiment, whether the quick start function is enabled is determined by intelligently recognizing the content displayed on the current interface, so that the quick-start option of the side edge region does not conflict with other preset functions.

Embodiment 3

FIG. 6 is a flowchart of a method for quickly starting an application according to another embodiment of the present invention. As shown in FIG. 6, in the method provided by this embodiment, after steps S100 and S200 and before step S300, the method further includes:

S210. Determine whether two or more icons are actually placed at the position where the application icon is placed according to the correspondence table;

S220. If two or more icons are actually placed at that position, prompt the application icon placed at the queried position.

Prompting the application icon placed at the queried position may further include:

When the position of the application icon corresponding to the side edge area where the touch gesture occurs is queried, preset special-effect processing is performed on the placed application icon. The special-effect processing, for example, adds a background color to the application icon placed at the queried position, adds background blur, shakes the application icon up and down or left and right at a preset frequency, or adds a visible label to the corresponding application icon.

In this embodiment, by prompting the application to be started before it is started, erroneous launches due to misoperation or inaccurate operation can be effectively prevented.

Embodiment 4

FIG. 7 is a block diagram showing an exemplary structure of a quick start device of an application according to an embodiment of the present invention, and a quick start device for an application according to an embodiment of the present invention is described in detail below with reference to FIG. As shown in FIG. 7, the quick start device 11 of the application specifically includes:

The gesture receiving module 10 is configured to: receive a touch gesture in which the touch point is located in the side edge region, and then send the received touch gesture to the location query module 20;

The location query module 20 is configured to: when the received touch gesture is a preset quick-start gesture, query, from the correspondence table, the position of the application icon corresponding to the side edge region where the touch gesture occurs, and send the queried position to the program startup module 30;

The program launching module 30 is configured to: start the program of the application icon placed at the queried position.

The output end of the gesture receiving module 10 is connected to the input end of the location query module 20, and the output end of the location query module 20 is connected to the input end of the program launching module 30.

FIG. 8 is a block diagram showing an exemplary structure of a quick start device of an application according to another embodiment of the present invention. As shown in FIG. 8, the quick start device 11 of the application specifically includes:

The interface identifying unit 21 is configured to: identify an interface displayed by the current desktop;

The interface determining unit 22 is configured to: determine whether the interface recognized by the interface identifying unit 21 is an interface of the activated application, and if not, determine whether the received touch gesture is a preset quick start gesture.

The output end of the interface recognition unit 21 is connected to the input end of the interface determination unit 22.

Optionally, when the corresponding relationship in the stored correspondence table is two or more, the quick start device 11 of the application specifically includes:

The number determining unit is configured to: determine whether the icon actually placed at the position where the application icon is placed in the relationship table is two or more;

The query unit is configured to: if two or more icons are actually placed at the position where the application icon is placed in the relationship table, query the position where the application icon is placed corresponding to the side edge region where the touch gesture occurs;

The prompt unit is set to: prompt the application icon placed in the queried location.

Optionally, when the corresponding relationship in the stored correspondence table is two or more, the quick start device 11 of the application specifically includes:

The number determining unit is configured to: determine whether the icon actually placed at the position where the application icon is placed in the relationship table is two or more;

The prompting unit is configured to: when the number determining unit determines that two or more icons are actually placed at the position where the application icon is placed according to the correspondence table, prompt the application icon placed at the queried position.

The output end of the number judging unit is connected to the input end of the query unit, and the output end of the query unit is connected to the input end of the prompt unit.

Wherein, the prompting unit is further configured to:

When the position of the application icon corresponding to the side edge area where the touch gesture occurs is queried, preset special-effect processing is performed on the placed application icon.

Optionally, the gesture receiving module 10 includes:

The coordinate acquiring unit is configured to: acquire coordinates of the first touch point where the touch gesture occurs, and send the acquired coordinates to the area determining unit, where the output end of the coordinate acquiring unit is connected to the input end of the area determining unit;

The area determining unit is configured to: determine the area where the acquired coordinates are located as a side edge area where the touch gesture occurs.

In this embodiment, applications are quickly triggered by using the edge region. Since the definition of the trigger gesture is novel, the user can directly arrange, in the correspondence table, the positions of the application icons to be quickly launched; the side edge area triggers the corresponding application by identifying the application icon placed at the corresponding position, so that the user can quickly launch an application at an appropriate time.

Optionally, the quick start gesture is to slide or click on the side edge area within a preset direction range.

Optionally, the quick start gesture is to slide in a preset direction range;

The location query module is configured to perform the following manner to determine that the received touch gesture is a preset quick start gesture:

Calculating a distance between the first contact and the second contact of the touch gesture, and determining that the calculated distance is greater than a preset threshold.

Optionally, the location query module is configured to implement the calculating the distance between the first contact and the second contact of the touch gesture by:

According to the formula

Distance = √((currentX − downX)² + (currentY − downY)²),

calculating a distance between the first contact and the second contact;

Wherein, downX is the abscissa of the first contact, downY is the ordinate of the first contact, currentX is the abscissa of the second contact, and currentY is the ordinate of the second contact.

Optionally, the location query module is configured to implement the calculating, by using the following manner, the distance between the first contact and the second contact of the touch gesture:

Calculating a distance between the first contact and the second contact according to the formula Distance = |currentY − downY|;

Wherein, downY is the ordinate of the first contact, and currentY is the ordinate of the second contact.

Embodiment 5

FIG. 9 is a block diagram showing an exemplary structure of a mobile terminal 100 according to an embodiment of the present invention. The mobile terminal 100 shown in FIG. 9 includes the quick start device 11 of the application described above.

The mobile terminal can be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (Personal Digital Assistant), a PAD (tablet), a PMP (Portable Multimedia Player), and a navigation device, as well as fixed terminals such as a digital TV, a desktop computer, and the like. In the following, it is assumed that the terminal is a mobile terminal. However, those skilled in the art will appreciate that, apart from components used specifically for mobile purposes, configurations according to embodiments of the present invention can also be applied to fixed type terminals.

Embodiment 6

A mobile terminal according to an embodiment of the present invention includes an input device, a processor 903, a display screen 904, and a memory 905. In one embodiment, the input device is a touch screen 2010 that includes a touch panel 901 and a touch controller 902. Further, the input device may also be a non-touch input device (eg, an infrared input device, etc.) or the like.

Touch controller 902 can be a single application specific integrated circuit (ASIC), which can include one or more processor subsystems, which can include one or more ARM processors or other processors with similar functions and capabilities.

The touch controller 902 is mainly used for receiving touch signals generated by the touch panel 901, processing them, and sending the result to the processor 903 of the mobile terminal. Such processing includes, for example, analog-to-digital conversion of the physical input signal, computing touch point coordinates, and computing the touch duration.

The processor 903 receives the output of the touch controller 902, processes it, and performs actions based on the output. The actions include, but are not limited to, moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing an instruction, operating a peripheral device coupled to the host device, answering a phone call, making a call, terminating a phone call, changing volume or audio settings, storing information related to phone communications (e.g., addresses, frequently used numbers, received calls, missed calls), logging in to a computer or computer network, allowing an authorized individual to access a restricted area of a computer or computer network, loading a user profile associated with a user's preferred configuration of the computer desktop, allowing access to network content, launching a particular program, encrypting or decoding a message, and the like.

The processor 903 is also coupled to the display screen 904. Display 904 is used to provide a UI to a user of the device.

In some embodiments, processor 903 can be a separate component from touch controller 902. In other embodiments, the processor 903 can be a composite component with the touch controller 902.

In one embodiment, the touch panel 901 is provided with a discrete motion sensor 906, such as a capacitive sensor, a resistive sensor, a force sensor, an optical sensor, or the like.

The touch panel 901 includes electrode arrays made of a conductive material in the lateral and longitudinal directions. For a single-touch screen with an M-row by N-column electrode array (only the coordinates of a single touch can be determined), the touch controller 902 uses self-capacitance scanning: it scans the M rows and the N columns separately and calculates the coordinates of the finger on the touch screen from the signals of each row and each column. The number of scans is M + N.

For a multi-touch touch screen with an electrode array of M rows and N columns (which can detect and resolve the coordinates of multiple points, i.e., multi-touch), the touch controller 902 uses multi-contact mutual-capacitance scanning of the row-column intersections, so the number of scans is M × N.
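The scan-count difference between the two schemes can be illustrated with a short sketch (the 16 × 9 electrode counts and class/method names below are illustrative assumptions, not from this disclosure):

```java
// Scan counts for an M-row x N-column electrode array.
public class TouchScan {
    // Self-capacitance scanning: each row and each column is scanned once.
    static int selfCapacitanceScans(int rows, int cols) {
        return rows + cols;
    }

    // Mutual-capacitance scanning: every row/column intersection is scanned.
    static int mutualCapacitanceScans(int rows, int cols) {
        return rows * cols;
    }

    public static void main(String[] args) {
        int m = 16, n = 9; // hypothetical electrode counts
        System.out.println("single-touch scans: " + selfCapacitanceScans(m, n)); // 25
        System.out.println("multi-touch scans: " + mutualCapacitanceScans(m, n)); // 144
    }
}
```

The mutual-capacitance scan count grows multiplicatively, which is the price paid for resolving multiple simultaneous contacts.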

When the user's finger touches the panel, the touch panel 901 generates a touch signal (an electrical signal) to the touch controller 902, which obtains the coordinates of the touched point by scanning. In one embodiment, the touch panel 901 of the touch screen is physically an independent coordinate positioning system: after the touch-point coordinates are reported to the processor 903, the processor 903 converts them into pixel coordinates of the display screen 904 so that the input operation is correctly identified.
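The conversion from the touch panel's coordinate system to display pixel coordinates can be sketched as a simple linear mapping (the resolutions, class name, and method name are illustrative assumptions):

```java
// Sketch of the touch-to-display coordinate conversion described above.
public class CoordinateMapper {
    // Maps a point from the touch panel's coordinate system (touchW x touchH)
    // to the display's pixel coordinate system (lcdW x lcdH).
    static int[] toDisplayPixels(int touchX, int touchY,
                                 int touchW, int touchH,
                                 int lcdW, int lcdH) {
        int px = touchX * lcdW / touchW;
        int py = touchY * lcdH / touchH;
        return new int[] { px, py };
    }
}
```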

Optionally, the touch controller 902 is configured to receive a touch gesture whose touch point is located in the side edge region in the following manner: acquire the coordinates of the first touch point where the gesture occurs, and determine the area containing the acquired coordinates as the area where the gesture occurs. The touch controller 902 decides whether a touch operation occurs in a normal partition or in an edge region (i.e., a special partition) by determining which area the touch point falls into. In a specific implementation, the coordinates of the touch point are obtained through the touch screen and matched against the partitions. When the coordinates fall into a special partition, the touch operation is judged to occur in that special partition and is reported through the input device corresponding to it. When the coordinates fall into the normal partition, the touch operation is judged to occur in the normal partition, no special effect is generated for it, and conventional processing is performed.
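A minimal sketch of this partition test, classifying a touch point into a side special partition ("C") or the normal partition ("A") by its x-coordinate (the edge width is a hypothetical value):

```java
// Classify a touch point into a side special partition or the normal partition.
public class TouchPartition {
    static final int EDGE_WIDTH = 60; // assumed width of each edge region

    static String classify(int x, int screenWidth) {
        boolean leftEdge = x < EDGE_WIDTH;
        boolean rightEdge = x >= screenWidth - EDGE_WIDTH;
        return (leftEdge || rightEdge) ? "C" : "A";
    }
}
```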

Specifically, the touch controller 902 receives the touch event through the touch screen, determines whether the touch operation occurs in the A area or the C area, and reports the event through the device file node of the corresponding area. Events are read from the device files of the A area and the C area and processed (e.g., coordinate calculation); A-area and C-area events are distinguished by device ID, and then the A-area and C-area events are distributed separately. A-area events follow the original flow and are processed in the usual way, that is, through the multi-channel mechanism. C-area events are distributed through a dedicated C-area channel: they are input through the native port and output through the system port to the system service, which listens for C-area events through a listener and then reports them to each application through the C-area external interface.

Optionally, the touch controller 902 is further configured to set the partitions. The number of special partitions is two; they are located on the two sides of the touch area, and the remaining area of the touch area is the normal partition. Alternatively, the normal partition includes an A area and a bottom B area, where the A area is an operable area for detecting touch-point coordinates and the B area is a virtual key area for detecting a menu key, a Home key, a return key, and the like; the two special partitions are located at the edges of the touch area, on the two sides of the A area. In addition, a special partition can be set in any other area prone to accidental operation, and it is possible to set only one special partition or to set multiple edge areas.

After the touch controller 902 determines that the touch is in an edge region, the touch-point coordinates of different resolutions are converted into LCD coordinates through the EventHub function. A single channel (such as a serverchannel and clientchannel) is defined; after an event is reported, it is transmitted through the channel to the event manager (TouchEventManager) and, through a registered listener, delivered to multiple responding application modules simultaneously or one by one, or delivered only to a single responding application module. Application modules include, for example, the camera and the gallery; different application modules generate corresponding operations.
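The listener-based delivery described above can be sketched as follows; the class and method names are assumptions for illustration, not the actual EventHub/TouchEventManager API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.IntConsumer;

// Edge (C-area) events reported to the manager are delivered to every
// registered application module in turn.
class TouchEventManager {
    private final List<IntConsumer> listeners = new ArrayList<>();

    void register(IntConsumer listener) {
        listeners.add(listener);
    }

    // Deliver the event to all responding application modules.
    void dispatch(int eventCode) {
        for (IntConsumer l : listeners) {
            l.accept(eventCode);
        }
    }
}
```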

Optionally, the processor 903 is configured to: when determining that the received touch gesture is a preset quick-start gesture, query, from the correspondence table stored in the memory 905, the location of placing the application icon corresponding to the side edge region where the touch gesture occurred, and launch the program of the application icon placed at the acquired location.
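The correspondence table held in the memory 905 can be sketched as a two-step mapping from edge region to icon position to application; the region names, positions, and application names below are illustrative assumptions:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the correspondence table: edge region -> icon position -> app.
class QuickStartTable {
    private final Map<String, Integer> regionToPosition = new HashMap<>();
    private final Map<Integer, String> positionToApp = new HashMap<>();

    void bind(String region, int position, String app) {
        regionToPosition.put(region, position);
        positionToApp.put(position, app);
    }

    // Returns the application bound to the edge region where the quick-start
    // gesture occurred, or null when no binding exists.
    String lookup(String region) {
        Integer pos = regionToPosition.get(region);
        return pos == null ? null : positionToApp.get(pos);
    }
}
```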

The processor 903 is further configured to: identify the interface displayed on the current desktop; and, when the identified interface is not the interface of a launched application, determine whether the received touch gesture is a preset quick-start gesture.
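For a sliding gesture, the quick-start determination is defined in claims 8 to 10 as a contact-distance threshold check; a minimal sketch (the class name and threshold value are assumptions):

```java
// Contact-distance check for the sliding quick-start gesture.
class GestureCheck {
    // Euclidean distance between the first contact (down) and the second
    // contact (current), compared against a preset threshold.
    static boolean isQuickStartSlide(double downX, double downY,
                                     double currentX, double currentY,
                                     double threshold) {
        double dx = currentX - downX;
        double dy = currentY - downY;
        return Math.sqrt(dx * dx + dy * dy) > threshold;
    }

    // Simplified variant using only the vertical displacement |currentY - downY|.
    static boolean isQuickStartSlideVertical(double downY, double currentY,
                                             double threshold) {
        return Math.abs(currentY - downY) > threshold;
    }
}
```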

The processor 903 is further configured to: determine whether two or more icons are actually placed at the location where the application icon is placed according to the correspondence table; if so, prompt the user with the application icons placed at the queried location. When the location of the application icon corresponding to the side edge region where the touch gesture occurred is queried, a preset effect is applied to the placed application icon. The effect processing, for example, adds a background color or background blur to the application icon at the queried location, shakes the icon up and down or left and right at a preset frequency, or adds a visible label to the corresponding application icon.

The memory 905 stores a correspondence table between different areas of the side edge and the different positions at which application icons are placed.

An embodiment of the invention further discloses a computer program comprising program instructions which, when executed by a mobile terminal, enable the mobile terminal to execute the quick start method of any of the above applications.

The embodiment of the invention also discloses a carrier carrying the computer program.

Other aspects will be apparent upon reading and understanding the drawings and detailed description.

It is to be understood that the terms "comprises", "comprising", and any variants thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or device comprising a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Absent further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that comprises the element.

The serial numbers of the embodiments of the present invention are for description only and do not indicate the relative merits of the embodiments.

Through the description of the above embodiments, those skilled in the art can clearly understand that the methods of the foregoing embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware; in many cases, however, the former is the better implementation. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the related art, can be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or a CD-ROM) and including a number of instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the various embodiments of the present invention.

The above are only preferred embodiments of the present invention and are not intended to limit the scope of the invention. Any equivalent structure or equivalent process transformation made using the description and drawings of the present invention, applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Industrial applicability

The embodiments of the invention provide a mobile terminal and a quick start method and device for an application program thereof, which can quickly trigger an application by using an edge region, with a novel definition of the trigger gesture. The user can define, in the correspondence table, the location of the application icon of the application to be quickly launched within the side edge area; the corresponding application is triggered by identifying the application icon placed at that location, so that the user can quickly launch the application in appropriate situations such as on the desktop or with a black screen. Therefore, the present invention has strong industrial applicability.

Claims (20)

  1. A quick start method of an application, the method being applied to a mobile terminal, where the mobile terminal stores a correspondence table between different areas of a side edge and different positions where an application icon is placed, the method comprising:
    Receiving a touch gesture in which the touch point is located in the side edge region;
    When the received touch gesture is a preset quick start gesture, querying, from the correspondence relationship table, a location of placing an application icon corresponding to a side edge region where the touch gesture occurs;
    Launching a program of the application icon placed at the acquired location.
  2. The method of quickly starting the application according to claim 1, after the step of receiving the touch gesture of the touch point in the side edge region, the method further includes:
    Identify the interface displayed by the current desktop;
    When the identified interface is not the interface of the launched application, it is determined whether the received touch gesture is a preset quick start gesture.
  3. The quick start method of the application according to claim 1, wherein before the step of launching the program of the application icon placed at the acquired location, the method further comprises:
    When two or more icons are actually placed at the position where the application icon is placed in the correspondence table, the position of the application icon corresponding to the side edge region where the touch gesture occurs is queried;
    Prompt for the application icon placed in the queried location.
  4. The quick start method of the application according to claim 1, wherein before the step of launching the program of the application icon placed at the acquired location, the method further comprises:
    When two or more icons are actually placed at the position where the application icon is placed in the correspondence table, the application icon placed at the queried location is prompted.
  5. The method for quickly launching an application according to claim 3 or 4, wherein the method further includes:
    When the location of the application icon corresponding to the side edge region where the touch gesture occurs is queried, the preset effect processing is performed on the placed application icon.
  6. The quick start method of the application according to claim 1, wherein before the step of, when the received touch gesture is a preset quick start gesture, querying from the correspondence table the location of placing the application icon corresponding to the side edge region where the touch gesture occurs, the method further comprises:
    Obtaining a coordinate of a first touch point where the touch gesture occurs;
    The area where the acquired coordinates are located is determined as the side edge area where the touch gesture occurs.
  7. The quick start method of an application according to any one of claims 1 to 6, wherein the quick start gesture is sliding within a preset direction range or clicking on the side edge region.
  8. The quick start method of the application according to claim 7, wherein the quick start gesture is sliding within a preset direction range;
    Determining that the received touch gesture is a preset quick start gesture includes:
    Calculating a distance between the first contact and the second contact of the touch gesture, and determining that the calculated distance is greater than a preset threshold.
  9. The quick start method of the application according to claim 8, wherein the calculating the distance between the first contact and the second contact of the touch gesture comprises:
    According to the formula √((currentX − downX)² + (currentY − downY)²), calculating the distance between the first contact and the second contact;
    Wherein, downX is the abscissa of the first contact, downY is the ordinate of the first contact, currentX is the abscissa of the second contact, and currentY is the ordinate of the second contact.
  10. The quick start method of the application according to claim 8, wherein the calculating the distance between the first contact and the second contact of the touch gesture comprises:
    Calculating a distance between the first contact and the second contact according to a formula |currentY−downY|;
    Wherein, downY is the ordinate of the first contact, and currentY is the ordinate of the second contact.
  11. A quick start device for an application, the device storing a correspondence table of different regions of a side edge and different positions for placing an application icon, the device comprising a gesture receiving module, a location query module, and a program startup module, wherein
    The gesture receiving module is configured to: receive a touch gesture in which the touch point is located in a side edge region;
    The location querying module is configured to: when the received touch gesture is a preset quick-start gesture, query, from the correspondence table, the location of placing the application icon corresponding to the side edge region where the touch gesture occurs;
    The program launching module is configured to: launch the program of the application icon placed at the acquired location.
  12. A quick start device for an application according to claim 11, wherein said device further comprises an interface recognition unit and an interface determination unit, wherein
    The interface recognition unit is configured to: identify an interface displayed by the current desktop;
    The interface determining unit is configured to determine whether the received touch gesture is a preset quick start gesture when the identified interface is not the interface of the launched application.
  13. A quick start device for an application according to claim 11 or 12, further comprising a number determining unit, a query unit and a prompting unit, wherein
    The number determining unit is configured to: determine whether the icon actually placed in the position where the application icon is placed in the correspondence table is two or more;
    The query unit is configured to: if the number determining unit determines that two or more icons are actually placed at the location where the application icon is placed according to the correspondence table, query the location of placing the application icon corresponding to the side edge region where the touch gesture occurs;
    The prompting unit is configured to: prompt an application icon placed at the queried location.
  14. A quick start device for an application according to claim 11 or 12, further comprising a number determining unit and a prompting unit, wherein
    The number determining unit is configured to: determine whether the icon actually placed in the position where the application icon is placed in the correspondence table is two or more;
    The prompting unit is configured to: if the number determining unit determines that the icon actually placed in the position where the application icon is placed in the correspondence table is two or more, prompting the application icon placed at the queried location.
  15. The quick start device of the application according to claim 13 or 14, wherein the prompting unit is further configured to:
    When the location of the application icon corresponding to the side edge region where the touch gesture occurs is queried, the preset effect processing is performed on the placed application icon.
  16. The quick start device of the application according to any one of claims 11 to 14, wherein the gesture receiving module includes a coordinate acquiring unit and an area determining unit, wherein
    The coordinate acquiring unit is configured to: acquire coordinates of the first touch point where the touch gesture occurs, and send the acquired coordinates to the area determining unit, where the output end of the coordinate acquiring unit is connected to the input end of the area determining unit;
    The area determining unit is configured to determine the area where the acquired coordinates are located as the side edge area where the touch gesture occurs.
  17. The quick start device of the application according to any one of claims 11 to 14, wherein the quick start gesture is sliding within a preset direction range or clicking on the side edge region.
  18. The quick start device of the application according to claim 17, wherein the quick start gesture is sliding in a predetermined direction range;
    The location query module is configured to perform the following manner to determine that the received touch gesture is a preset quick start gesture:
    Calculating a distance between the first contact and the second contact of the touch gesture, and determining that the calculated distance is greater than a preset threshold.
  19. The quick start device of the application according to claim 18, wherein the location query module is configured to calculate the distance between the first contact and the second contact of the touch gesture in the following manner:
    According to the formula √((currentX − downX)² + (currentY − downY)²), calculating the distance between the first contact and the second contact; or
    Calculating a distance between the first contact and the second contact according to a formula |currentY−downY|;
    Wherein, downX is the abscissa of the first contact, downY is the ordinate of the first contact, currentX is the abscissa of the second contact, and currentY is the ordinate of the second contact.
  20. A mobile terminal comprising the quick start device of the application according to any one of claims 11 to 19.
PCT/CN2016/079481 2015-04-29 2016-04-15 Mobile terminal and quick start method and device for application program thereof WO2016173414A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201510210105.4A CN104850342A (en) 2015-04-29 2015-04-29 Mobile terminal and rapid startup method and device for applications of mobile terminal
CN201510210105.4 2015-04-29

Publications (1)

Publication Number Publication Date
WO2016173414A1 true WO2016173414A1 (en) 2016-11-03

Family

ID=53850019

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/079481 WO2016173414A1 (en) 2015-04-29 2016-04-15 Mobile terminal and quick start method and device for application program thereof

Country Status (2)

Country Link
CN (1) CN104850342A (en)
WO (1) WO2016173414A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104850342A (en) * 2015-04-29 2015-08-19 努比亚技术有限公司 Mobile terminal and rapid startup method and device for applications of mobile terminal
CN105159587B (en) * 2015-08-27 2018-03-27 广东欧珀移动通信有限公司 A kind of method and mobile terminal for controlling application
CN106484269A (en) * 2015-08-31 2017-03-08 中兴通讯股份有限公司 A kind of application program using method and device and terminal
CN106610777A (en) * 2015-10-23 2017-05-03 小米科技有限责任公司 Application starting method and device and mobile terminal
CN105487805B (en) * 2015-12-01 2020-06-02 小米科技有限责任公司 Object operation method and device
CN106919415A (en) * 2015-12-28 2017-07-04 阿里巴巴集团控股有限公司 Start method, device and the electronic equipment of application program
CN105955608B (en) * 2016-04-22 2020-06-12 北京金山安全软件有限公司 Shortcut control method and device and electronic equipment
CN107132967B (en) * 2017-04-26 2020-09-01 努比亚技术有限公司 Application starting method and device, storage medium and terminal
CN108509131A (en) * 2018-03-28 2018-09-07 维沃移动通信有限公司 A kind of application program launching method and terminal

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103885674A (en) * 2012-12-20 2014-06-25 卡西欧计算机株式会社 Input device, input operation method and electronic device
CN104156073A (en) * 2014-08-29 2014-11-19 深圳市中兴移动通信有限公司 Mobile terminal and operation method thereof
CN104182164A (en) * 2013-05-27 2014-12-03 赛龙通信技术(深圳)有限公司 Electronic device and method for operating interface through side frame
CN104238837A (en) * 2013-06-23 2014-12-24 北京智膜科技有限公司 Control device and method of touch screen of information interactive equipment
US20150046871A1 (en) * 2013-08-09 2015-02-12 Insyde Software Corp. System and method for re-sizing and re-positioning application windows in a touch-based computing device
CN104850342A (en) * 2015-04-29 2015-08-19 努比亚技术有限公司 Mobile terminal and rapid startup method and device for applications of mobile terminal

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019600A (en) * 2012-12-14 2013-04-03 广东欧珀移动通信有限公司 Method and system for starting application programs of mobile terminal
CN104063164B (en) * 2013-03-22 2018-02-27 腾讯科技(深圳)有限公司 The method and device of screen control

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103885674A (en) * 2012-12-20 2014-06-25 卡西欧计算机株式会社 Input device, input operation method and electronic device
CN104182164A (en) * 2013-05-27 2014-12-03 赛龙通信技术(深圳)有限公司 Electronic device and method for operating interface through side frame
CN104238837A (en) * 2013-06-23 2014-12-24 北京智膜科技有限公司 Control device and method of touch screen of information interactive equipment
US20150046871A1 (en) * 2013-08-09 2015-02-12 Insyde Software Corp. System and method for re-sizing and re-positioning application windows in a touch-based computing device
CN104156073A (en) * 2014-08-29 2014-11-19 深圳市中兴移动通信有限公司 Mobile terminal and operation method thereof
CN104850342A (en) * 2015-04-29 2015-08-19 努比亚技术有限公司 Mobile terminal and rapid startup method and device for applications of mobile terminal

Also Published As

Publication number Publication date
CN104850342A (en) 2015-08-19

Similar Documents

Publication Publication Date Title
US9900422B2 (en) Mobile terminal and method of controlling therefor
US10033855B2 (en) Controlling access to features of a mobile communication terminal
US9575742B2 (en) Mobile terminal and control method thereof
US10001917B2 (en) Mobile terminal and controlling method thereof
EP2637086B1 (en) Mobile terminal
EP2602702B1 (en) Mobile terminal and fan-shaped icon arrangement method thereof
US9094530B2 (en) Mobile terminal and controlling method thereof
US9600145B2 (en) Mobile terminal and controlling method thereof
EP2752752B1 (en) Method of Controlling Mobile Terminal
EP2703974B1 (en) Mobile terminal and application icon moving method thereof
US9442743B2 (en) Mobile terminal
US9081477B2 (en) Electronic device and method of controlling the same
USRE46225E1 (en) Mobile terminal and controlling method thereof
EP2549717B1 (en) Mobile terminal and controlling method thereof
CN104935725B (en) Mobile terminal and utilize the method that virtual frame region realizes function point analysis
EP2750363B1 (en) Mobile terminal
EP2672682B1 (en) Mobile terminal and controlling method thereof
US9710148B2 (en) Mobile terminal and controlling method thereof
ES2485967T3 (en) Terminal and method to control it
EP2989522B1 (en) Mobile terminal and control method thereof
US20170255382A1 (en) Mobile terminal and operation method thereof and computer storage medium
US8966401B2 (en) Electronic device and methods of sending information with the electronic device, controlling the electronic device, and transmitting and receiving information in an information system
US8170620B2 (en) Mobile terminal and keypad displaying method thereof
US8713463B2 (en) Mobile terminal and controlling method thereof
US9417723B2 (en) Method for controlling operation of touch panel and portable terminal supporting the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16785845

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16785845

Country of ref document: EP

Kind code of ref document: A1