WO2016173414A1 - Mobile terminal and quick start method and device for application program thereof - Google Patents


Info

Publication number
WO2016173414A1
Authority
WO
WIPO (PCT)
Prior art keywords
application
contact
gesture
quick start
touch
Prior art date
Application number
PCT/CN2016/079481
Other languages
English (en)
Chinese (zh)
Inventor
吴玲玲
Original Assignee
努比亚技术有限公司
Priority date
Filing date
Publication date
Application filed by 努比亚技术有限公司
Publication of WO2016173414A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • This document relates to the field of startup technologies for mobile terminals, and in particular to a mobile terminal and a method and apparatus for quickly starting an application thereof.
  • In the related art, a program is started by the user first finding the location of the program to be started and then clicking it to start it.
  • Many applications are downloaded and installed on a mobile phone. When the user looks for the application to be started among the many installed applications, it is often necessary to scan up and down or even page back and forth repeatedly, which is not what the user wants.
  • The embodiments of the present invention provide a method and a device for quickly starting a mobile terminal and an application thereof, so as to achieve the purpose of quickly starting an application and to overcome the drawback of the above-mentioned related art, in which the user has to search back and forth for the application to be started.
  • the embodiment of the present invention provides a method for quickly starting an application, where the method is applied to a mobile terminal, where the mobile terminal stores a correspondence table between different areas of the side edges and different positions where the application icons are placed, and the method includes:
  • when the received touch gesture is a preset quick start gesture, querying, from the correspondence table, the position at which an application icon is placed corresponding to the side edge area where the touch gesture occurs;
  • the method also includes:
  • when the identified interface is not the interface of a launched application, it is determined whether the received touch gesture is a preset quick start gesture.
  • Before the step of starting the program of the application icon placed at the acquired position, the method further includes:
  • the method further includes:
  • the preset effect processing is performed on the placed application icon.
  • Before the querying, from the correspondence table, of the position at which an application icon is placed corresponding to the side edge area where the touch gesture occurs, the method further includes:
  • the area where the acquired coordinates are located is determined as the side edge area where the touch gesture occurs.
  • The quick start gesture is sliding within a preset direction range in the side edge area, or clicking the side edge area.
  • When the quick start gesture is sliding within a preset direction range, determining that the received touch gesture is a preset quick start gesture includes:
  • calculating the distance between the first contact and the second contact of the touch gesture, where:
  • downX is the abscissa of the first contact
  • downY is the ordinate of the first contact
  • currentX is the abscissa of the second contact
  • currentY is the ordinate of the second contact.
  • in another implementation, the distance between the first contact and the second contact of the touch gesture is calculated using only the ordinates, where:
  • downY is the ordinate of the first contact
  • currentY is the ordinate of the second contact
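  • The patent text names the contact coordinates but does not spell out the distance formulas themselves; the following is a minimal sketch of the two calculations implied above, assuming the first variant uses the Euclidean distance between the contacts and the second uses only the vertical difference of the ordinates.

```java
// Hedged sketch: the exact formulas are not given in the text; the standard
// Euclidean distance and a vertical-only distance are assumed here.
final class ContactDistance {

    // Variant 1: distance from both abscissas and ordinates.
    static double euclidean(float downX, float downY, float currentX, float currentY) {
        return Math.hypot(currentX - downX, currentY - downY);
    }

    // Variant 2: distance from the ordinates only (vertical slide detection).
    static double vertical(float downY, float currentY) {
        return Math.abs(currentY - downY);
    }
}
```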
  • The embodiment of the present invention further provides a quick start device for an application, where the device stores a correspondence table between different regions of the side edges and different positions where application icons are placed, and the device includes a gesture receiving module, a location query module, and a program startup module, where:
  • the gesture receiving module is configured to: receive a touch gesture in which the touch point is located in a side edge region;
  • the location query module is configured to: when the received touch gesture is a preset quick start gesture, query, from the correspondence table, the position at which an application icon is placed corresponding to the side edge region where the touch gesture occurs;
  • the program startup module is configured to: start the program of the application icon placed at the queried position.
  • the device further includes an interface recognition unit and an interface determination unit, where
  • the interface recognition unit is configured to: identify an interface displayed by the current desktop;
  • the interface determining unit is configured to determine whether the received touch gesture is a preset quick start gesture when the identified interface is not the interface of the launched application.
  • the device further includes a number determining unit, a query unit, and a prompting unit, where
  • the number determining unit is configured to: determine whether two or more icons are actually placed at the positions for application icons in the correspondence table;
  • the query unit is configured to: if the number determining unit determines that two or more icons are actually placed at the positions for application icons in the correspondence table, query the position at which an application icon is placed corresponding to the side edge region where the touch gesture occurs;
  • the prompting unit is configured to: prompt an application icon placed at the queried location.
  • the device further includes a number determining unit and a prompting unit, where
  • the number determining unit is configured to: determine whether two or more icons are actually placed at the positions for application icons in the correspondence table;
  • the prompting unit is configured to: if the number determining unit determines that two or more icons are actually placed at the positions for application icons in the correspondence table, prompt the application icon placed at the queried position.
  • the prompting unit is further configured to:
  • the preset effect processing is performed on the placed application icon.
  • the gesture receiving module includes a coordinate acquiring unit and an area determining unit, where
  • the coordinate acquiring unit is configured to: acquire coordinates of the first touch point where the touch gesture occurs, and send the acquired coordinates to the area determining unit, where the output end of the coordinate acquiring unit is connected to the input end of the area determining unit;
  • the area determining unit is configured to determine the area where the acquired coordinates are located as the side edge area where the touch gesture occurs.
  • The quick start gesture is sliding within a preset direction range in the side edge region, or clicking the side edge region.
  • When the quick start gesture is sliding within a preset direction range, the location query module is configured to determine that the received touch gesture is a preset quick start gesture in the following manner:
  • the location query module is configured to calculate the distance between the first contact and the second contact of the touch gesture in the following manner, where:
  • downX is the abscissa of the first contact
  • downY is the ordinate of the first contact
  • currentX is the abscissa of the second contact
  • currentY is the ordinate of the second contact.
  • the embodiment of the invention further provides a mobile terminal, comprising the quick start device of the application program according to any one of the preceding claims.
  • The embodiments of the present invention provide a quick start method and device for a mobile terminal and an application program thereof, which can quickly trigger an application by using the edge area. Because the definition of the trigger gesture is novel, the user can directly arrange in the correspondence table the position, within the side edge area, where the icon of the application to be quickly launched is placed, and the corresponding application is triggered by identifying the application icon placed at the corresponding position, so that the user can quickly launch the application at an appropriate time, for example on the desktop or with the screen off.
  • FIG. 1 is a schematic diagram of an optional hardware structure of a mobile terminal implementing various embodiments of the present invention
  • FIG. 2 is a schematic diagram of a wireless communication system of the mobile terminal shown in FIG. 1;
  • FIG. 3 is a flow chart of a method for quickly starting an application according to an embodiment of the present invention.
  • FIG. 4 is a schematic view of an interface of different regions of a side edge according to an embodiment of the present invention.
  • FIG. 5 is a flowchart of a method for quickly starting an application according to another embodiment of the present invention.
  • FIG. 6 is a flow chart of a method for quickly starting an application according to still another embodiment of the present invention.
  • FIG. 7 is a block diagram showing an exemplary structure of a quick start device of an application according to an embodiment of the present invention.
  • FIG. 8 is a block diagram showing an exemplary structure of a quick start device of an application according to another embodiment of the present invention.
  • FIG. 9 is a block diagram showing an exemplary structure of a mobile terminal according to an embodiment of the present invention.
  • FIG. 10 is a schematic structural diagram of a mobile terminal according to still another embodiment of the present invention.
  • FIG. 1 is a schematic diagram of an optional hardware structure of a mobile terminal implementing various embodiments of the present invention.
  • The mobile terminal 100 may include a wireless communication unit 110, an A/V (Audio/Video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like.
  • Figure 1 illustrates a mobile terminal having various components, but it should be understood that not all illustrated components are required to be implemented. More or fewer components can be implemented instead. The elements of the mobile terminal will be described in detail below.
  • Wireless communication unit 110 typically includes one or more components that permit radio communication between mobile terminal 100 and a wireless communication system or network.
  • the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
  • the broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel.
  • the broadcast channel can include a satellite channel and/or a terrestrial channel.
  • the broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to the terminal.
  • the broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like.
  • the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal.
  • the broadcast associated information may also be provided via a mobile communication network, and in this case, the broadcast associated information may be received by the mobile communication module 112.
  • The broadcast signal may exist in various forms, for example, in the form of a Digital Multimedia Broadcasting (DMB) Electronic Program Guide (EPG), a Digital Video Broadcasting-Handheld (DVB-H) Electronic Service Guide (ESG), and the like.
  • the broadcast receiving module 111 can receive a signal broadcast by using various types of broadcast systems.
  • The broadcast receiving module 111 can receive digital broadcasts by using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), the Media Forward Link Only (MediaFLO) data broadcasting system, Integrated Services Digital Broadcasting-Terrestrial (ISDB-T), and the like.
  • the broadcast receiving module 111 can be constructed as various broadcast systems suitable for providing broadcast signals as well as the above-described digital broadcast system.
  • The broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
  • the mobile communication module 112 transmits the radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server.
  • Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received in accordance with text and/or multimedia messages.
  • the wireless internet module 113 supports wireless internet access of the mobile terminal.
  • the module can be internally or externally coupled to the terminal.
  • The wireless Internet access technologies involved in the module may include WLAN (Wireless LAN, Wi-Fi), Wibro (Wireless Broadband), Wimax (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like.
  • the short range communication module 114 is a module for supporting short range communication.
  • Some examples of short-range communication technologies include BluetoothTM, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wide Band (UWB), ZigbeeTM, and the like.
  • the location information module 115 is a module for checking or acquiring location information of the mobile terminal.
  • a typical example of a location information module is GPS (Global Positioning System).
  • the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information to accurately calculate three-dimensional current position information based on longitude, latitude, and altitude.
  • the method for calculating position and time information uses three satellites and corrects the calculated position and time information errors by using another satellite.
  • the GPS module 115 is capable of calculating speed information by continuously calculating current position information in real time.
  • the A/V input unit 120 is for receiving an audio or video signal.
  • The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 151.
  • The image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the configuration of the mobile terminal.
  • the microphone 122 can receive sound (audio data) via a microphone in an operation mode of a telephone call mode, a recording mode, a voice recognition mode, and the like, and can process such sound as audio data.
  • In the case of the telephone call mode, the processed audio (voice) data can be converted into a format transmittable to a mobile communication base station via the mobile communication module 112 and output.
  • the microphone 122 can implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated during the process of receiving and transmitting audio signals.
  • the user input unit 130 may generate key input data according to a command input by the user to control various operations of the mobile terminal.
  • The user input unit 130 allows the user to input various types of information, and may include a keyboard, a dome switch, a touch pad (eg, a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. due to contact), a scroll wheel, a rocker, and the like. In particular, when the touch pad is overlaid on the display unit 151 in the form of a layer, a touch screen can be formed.
  • The sensing unit 140 detects the current state of the mobile terminal 100 (eg, the open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of contact (ie, touch input) by the user with the mobile terminal 100, the orientation of the mobile terminal 100, and the like, and generates commands or signals for controlling the operation of the mobile terminal 100.
  • the sensing unit 140 can sense whether the slide type phone is turned on or off.
  • the sensing unit 140 can detect whether the power supply unit 190 provides power or whether the interface unit 170 is coupled to an external device.
  • Sensing unit 140 may include proximity sensor 1410 which will be described below in connection with a touch screen.
  • the interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, and an audio input/output. (I/O) port, video I/O port, headphone port, and more.
  • The identification module may store various information for verifying the use of the mobile terminal 100 by the user and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like.
  • A device having an identification module may take the form of a smart card, and thus the identification device can be connected to the mobile terminal 100 via a port or other connection device.
  • The interface unit 170 can be configured to receive input (eg, data information, power, etc.) from an external device and transmit the received input to one or more components within the mobile terminal 100, or can be used to transfer data between the mobile terminal and an external device.
  • In addition, when the mobile terminal 100 is connected to an external base, the interface unit 170 may serve as a path through which power is supplied from the base to the mobile terminal 100, or as a path through which various command signals input from the base are transmitted to the mobile terminal.
  • Various command signals or power input from the base can be used as signals for identifying whether the mobile terminal is accurately mounted on the base.
  • Output unit 150 is configured to provide an output signal (eg, an audio signal, a video signal, an alarm signal, a vibration signal, etc.) in a visual, audio, and/or tactile manner.
  • the output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
  • the display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 can display a user interface (UI) or a graphical user interface (GUI) related to a call or other communication (eg, text messaging, multimedia file download, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or image and related functions, and the like.
  • the display unit 151 can function as an input device and an output device.
  • the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like.
  • Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as a transparent display, and a typical transparent display may be, for example, a TOLED (Transparent Organic Light Emitting Diode) display or the like.
  • the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown) .
  • the touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
  • The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound when the mobile terminal is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, or the like.
  • the audio output module 152 can provide audio output (eg, call signal reception sound, message reception sound, etc.) associated with a particular function performed by the mobile terminal 100.
  • the audio output module 152 can include a speaker, a buzzer, and the like.
  • the alarm unit 153 can provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alert unit 153 can provide an output in a different manner to notify of the occurrence of an event. For example, the alarm unit 153 can provide an output in the form of vibrations, and when a call, message, or some other incoming communication is received, the alarm unit 153 can provide a tactile output (ie, vibration) to notify the user of it. By providing such a tactile output, the user is able to recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide an output of the notification event occurrence via the display unit 151 or the audio output module 152.
  • the memory 160 may store a software program or the like for processing and control operations performed by the controller 180, or may temporarily store data (for example, a phone book, a message, a still image, a video, etc.) that has been output or is to be output. Moreover, the memory 160 can store data regarding vibrations and audio signals of various manners that are output when a touch is applied to the touch screen.
  • the memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card type memory (eg, SD or DX memory, etc.), a random access memory (RAM), a static random access memory ( SRAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), programmable read only memory (PROM), magnetic memory, magnetic disk, optical disk, and the like.
  • the mobile terminal 100 can cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
  • the controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing associated with voice calls, data communications, video calls, and the like. Additionally, the controller 180 can include a multimedia module 1810 for reproducing (or playing back) multimedia data, which can be constructed within the controller 180 or can be configured to be separate from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
  • The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides appropriate power required for operating the respective elements and components.
  • the various embodiments described herein can be implemented in a computer readable medium using, for example, computer software, hardware, or any combination thereof.
  • The embodiments described herein may be implemented by using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and electronic units designed to perform the functions described herein. In some cases, such embodiments may be implemented in the controller 180.
  • implementations such as procedures or functions may be implemented with separate software modules that permit the execution of at least one function or operation.
  • The software code can be implemented by a software application (or program) written in any suitable programming language, which can be stored in the memory 160 and executed by the controller 180.
  • the mobile terminal has been described in terms of its function.
  • Here, a slide-type mobile terminal among various types of mobile terminals such as folding-type, bar-type, swing-type, and slide-type mobile terminals is described as an example. However, the present invention can be applied to any type of mobile terminal and is not limited to slide-type mobile terminals.
  • the mobile terminal 100 as shown in FIG. 1 may be configured to operate using a communication system such as a wired and wireless communication system and a satellite-based communication system that transmits data via frames or packets.
  • Such communication systems may use different air interfaces and/or physical layers.
  • Examples of air interfaces used by communication systems include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), the Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), the Global System for Mobile Communications (GSM), and the like.
  • the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
  • a CDMA wireless communication system can include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, a base station controller (BSC) 275, and a mobile switching center (MSC) 280.
  • the MSC 280 is configured to interface with a public switched telephone network (PSTN) 290.
  • the MSC 280 is also configured to interface with a BSC 275 that can be coupled to the base station 270 via a backhaul line.
  • The backhaul line can be constructed in accordance with any of a number of well-known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be understood that the system as shown in FIG. 2 can include multiple BSCs 275.
  • Each BS 270 can serve one or more partitions (or regions), each of which is covered by a multi-directional antenna or an antenna directed to a particular direction radially away from the BS 270. Alternatively, each partition may be covered by two or more antennas for diversity reception. Each BS 270 can be configured to support multiple frequency allocations, and each frequency allocation has a particular frequency spectrum (eg, 1.25 MHz, 5 MHz, etc.).
  • BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or other equivalent terminology.
  • the term "base station” can be used to generally refer to a single BSC 275 and at least one BS 270.
  • A base station may also be referred to as a "cell site." Alternatively, individual partitions of a particular BS 270 may each be referred to as a cell site.
  • a broadcast transmitter (BT) 295 transmits a broadcast signal to the mobile terminal 100 operating within the system.
  • a broadcast receiving module 111 as shown in FIG. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295.
  • Global Positioning System (GPS) satellites 300 help locate at least one of the plurality of mobile terminals 100.
  • a plurality of satellites 300 are depicted, but it is understood that useful positioning information can be obtained using any number of satellites.
  • the GPS module 115 as shown in Figure 1 is typically configured to cooperate with the satellite 300 to obtain desired positioning information. Instead of GPS tracking technology or in addition to GPS tracking technology, other techniques that can track the location of the mobile terminal can be used. Additionally, at least one GPS satellite 300 can selectively or additionally process satellite DMB transmissions.
  • BS 270 receives reverse link signals from various mobile terminals 100.
  • Mobile terminal 100 typically participates in calls, messaging, and other types of communications.
  • Each reverse link signal received by a particular base station 270 is processed within a particular BS 270.
  • the obtained data is forwarded to the relevant BSC 275.
  • the BSC provides call resource allocation and coordinated mobility management functions including a soft handoff procedure between the BSs 270.
  • the BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290.
  • PSTN 290 interfaces with MSC 280, which forms an interface with BSC 275, and BSC 275 controls BS 270 accordingly to transmit forward link signals to mobile terminal 100.
  • FIG. 3 is a flowchart of a method for quickly starting an application according to an embodiment of the present invention.
  • a method for quickly starting an application according to an embodiment of the present invention is described below with reference to FIG. 3, which is applied to a mobile terminal.
  • the mobile terminal stores a correspondence table between different areas of the side edges and different positions where the application icons are placed. As shown in FIG. 3, the method includes the following steps:
  • S100 Receive a touch gesture in which a touch point is located in a side edge region
  • The relationship, stored in the correspondence table, between a side edge area and the position where an application icon is placed may be one-to-one or one-to-many; for example, one side edge area may correspond to only one position where an application icon is placed (ie, the shaded area in FIG. 4).
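  • The text does not fix a data structure for this correspondence table; the following is a minimal sketch, assuming a map from an edge-area identifier to one or more icon positions on the desktop grid (all names here are illustrative, not taken from the patent).

```java
import java.util.*;

// Hedged sketch of the correspondence table: each side edge area maps to one
// or more icon positions (row/column on the desktop grid). Names are illustrative.
final class EdgeAreaTable {
    record IconPosition(int row, int column) {}

    private final Map<String, List<IconPosition>> table = new HashMap<>();

    void bind(String edgeAreaId, IconPosition position) {
        table.computeIfAbsent(edgeAreaId, k -> new ArrayList<>()).add(position);
    }

    // One-to-one or one-to-many lookup, as described above.
    List<IconPosition> positionsFor(String edgeAreaId) {
        return table.getOrDefault(edgeAreaId, List.of());
    }
}
```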
  • Before the querying, from the correspondence table, of the position at which an application icon is placed corresponding to the side edge area where the touch gesture occurs, the method further includes:
  • the area where the acquired coordinates are located is determined as the side edge area where the touch gesture occurs.
  • the manner of determining whether the touch point of the touch gesture is in the side edge region is as follows.
  • Each touch gesture is composed of one or more touch points, so the terminal can determine whether the touch gesture occurs in a normal partition or in a side edge area (that is, a special partition) by determining the area into which the touch points of the touch gesture fall.
  • the coordinates of the touch point of the touch gesture are acquired by the driving layer of the terminal, and it is determined which partition the coordinates of the touch point fall into.
  • If the coordinates of the touch point fall into a special partition, it is determined that the touch gesture occurs in the special partition, and the touch gesture is reported through the input device corresponding to the special partition.
  • Otherwise, it is determined that the touch gesture occurs in a normal partition, and regular processing is performed instead of generating a special effect according to the touch gesture.
  • After the framework layer receives the reported event (the reported event includes the input device and the parameters of each touch point, etc.), it first identifies which region is involved according to the naming of the input device. Because the driver layer (kernel) has already recognized in the above step that the touch is in the special partition, the input device reported by the driver layer to the framework layer is input1 rather than input0; that is, the framework layer does not need to determine which partition the current touch point is in, nor the size and position of the partitions. These judgment operations are completed in the driver layer, which, in addition to identifying the specific input device, reports the parameters of the touch point to the framework layer, such as the pressing time, position coordinates, pressure, and the like.
  • the driving layer of the mobile terminal reports the touch point through the input device corresponding to the normal partition.
  • After the framework layer receives the reported event (which includes the input device and the parameters of the touch point, etc.), it identifies the region according to the naming of the input device. Because the driver layer (kernel) has recognized in the above step that the touch is in the normal partition, the input device reported to the framework layer by the driver layer is input0 rather than input1; that is, the framework layer does not need to determine which partition the current touch point is in, nor the size of the partition. These judgment operations are completed in the driver layer, and the driver layer reports not only the specific input device but also the various parameters of the touch point to the framework layer, such as the pressing time, position coordinates, pressure, and the like.
  • In that case the touch gesture is conventionally processed, that is, processed according to the conventional flow in the related art. For example, after receiving the touch point reported by the input device corresponding to the normal partition, the framework layer of the terminal continues to report the touch point according to the normal process so as to execute the corresponding operation instruction.
  • In contrast, the present embodiment does not perform normal processing on a touch gesture reported from the special partition; instead, it generates special effects according to the touch gesture, which also prevents misoperation.
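  • A minimal sketch of the framework-layer dispatch described above, assuming the driver exposes the two partitions as devices named "input0" and "input1" as the text states; the event type and handler names are illustrative, not the patent's actual implementation.

```java
// Hedged sketch: the framework layer only looks at which input device reported
// the event; the driver layer has already decided which partition was touched.
final class TouchEventDispatcher {
    record TouchEvent(String inputDevice, long downTimeMs, float x, float y, float pressure) {}

    void dispatch(TouchEvent event) {
        switch (event.inputDevice()) {
            case "input0" -> handleNormalPartition(event);   // regular processing
            case "input1" -> handleEdgePartition(event);     // edge / special effects
            default -> { /* unknown device: ignore */ }
        }
    }

    private void handleNormalPartition(TouchEvent event) {
        // Forward along the normal reporting path so the usual operation executes.
    }

    private void handleEdgePartition(TouchEvent event) {
        // Generate the special (quick start) behaviour instead of normal processing.
    }
}
```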
  • the size of the preset side edge area may be a fixed value.
  • A corresponding input device may be provided for each side edge area, for adjusting the position and size of the side edge area.
  • When the size of the side edge area is a variable, it can be adjusted through an interface, reserved by the developer, for modifying the parameters. As shown in FIG. 4, the developer exposes through the driver layer an interface Set_zone (id, X0, Y0, X1, Y1) to the upper layer, which is set by the user.
  • the common partition in Figure 4 is abbreviated as Area A, and the side edge area is simply referred to as Area C.
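  • A minimal sketch of an adjustable edge zone in the spirit of the Set_zone (id, X0, Y0, X1, Y1) interface mentioned above; this is an upper-layer illustration only, not the actual driver interface.

```java
// Hedged sketch: a rectangular edge zone that can be repositioned or resized at
// run time, mirroring the Set_zone(id, X0, Y0, X1, Y1) parameters in the text.
final class EdgeZone {
    private final int id;
    private float x0, y0, x1, y1;   // top-left and bottom-right corners

    EdgeZone(int id) { this.id = id; }

    // Corresponds to Set_zone(id, X0, Y0, X1, Y1) in FIG. 4.
    void setZone(float x0, float y0, float x1, float y1) {
        this.x0 = x0; this.y0 = y0; this.x1 = x1; this.y1 = y1;
    }

    // True when a touch point falls inside this zone (Area C); otherwise it is
    // treated as part of the common partition (Area A).
    boolean contains(float x, float y) {
        return x >= x0 && x <= x1 && y >= y0 && y <= y1;
    }

    int id() { return id; }
}
```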
  • the reporting process of touch events is as follows:
  • the driving layer receives the touch event through physical hardware such as a touch screen, and determines whether the touch operation occurs in the A area or the C area, and reports the event through the device file node of the A area or the C area.
  • The Native layer reads events from the device files of Area A and Area C and processes them (eg, coordinate calculation). The devices of Area A and Area C are distinguished by the device ID, and the Area A and Area C events are finally distributed separately.
  • Area A events follow the original process and are processed in the usual way, that is, through the multi-channel mechanism. Area C events are distributed through a dedicated Area C channel registered in advance with the Native layer; they are input at the Native port and output at the system port to the Area C event system service, which listens for Area C events through a listener and then reports them to each application through the external interface for Area C events.
  • The embodiment of the present invention can realize free customization of the anti-mistouch area, that is, the edge area, in the driver layer code of the mobile terminal. Because the technical solution is implemented in the driver layer instead of in the touch-screen firmware, the software design of the device is freed from the constraints of touch-screen IC suppliers, is more flexible, and is less costly.
  • the number of the special partitions is two, which are respectively located on both sides of the touch area, and the remaining area of the touch area is a normal partition.
  • The normal partition includes an A area and a bottom B area, where the A area is an operable area for detecting touch point coordinates and the B area is a virtual key area for detecting a menu key, a Home key, a return key, and the like. The two special partitions are respectively located at the edges of the touch area, on both sides of the A area.
  • two input devices such as input device 0 (input0) and input device 1 (input1) can be registered by the input_register_device() instruction when the touch screen driver is initialized.
  • An input device is allocated for each partition by the input_allocate_device() instruction, for example, the normal partition corresponds to the input device 0, and the edge region corresponds to the input device 1.
  • The specific implementation of the partitions can define the classes and implementations of the common partitions and edge regions in an object-oriented way. After the edge regions are determined, the touch point coordinates at different resolutions are converted into LCD coordinates through the EventHub function, and single channels (such as a serverchannel and a clientchannel) are defined. The function of the channel is to pass a reported event, once received, to the event manager (TouchEventManager) and, by listening through a listener, to transmit the event through multiple channels to several responding application modules simultaneously or one by one, or to only one of the application modules (application modules such as the camera, the gallery, and so on), with different application modules generating corresponding operations. Certainly, the foregoing steps may also be implemented in other manners, which is not limited by the embodiment of the present invention.
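  • A minimal sketch of the listener-based distribution just described, assuming a simple event manager that forwards edge events to registered application modules; TouchEventManager and the listener interface here are illustrative stand-ins for the components named in the text.

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

// Hedged sketch of the edge-event distribution: application modules (camera,
// gallery, ...) register listeners and receive Area C events one by one.
final class TouchEventManager {
    interface EdgeEventListener {
        void onEdgeEvent(float lcdX, float lcdY, long downTimeMs);
    }

    private final List<EdgeEventListener> listeners = new CopyOnWriteArrayList<>();

    void register(EdgeEventListener listener) { listeners.add(listener); }

    // Called when an Area C event arrives over the dedicated channel.
    void dispatchEdgeEvent(float lcdX, float lcdY, long downTimeMs) {
        for (EdgeEventListener l : listeners) {
            l.onEdgeEvent(lcdX, lcdY, downTimeMs);   // simultaneous / one-by-one delivery
        }
    }
}
```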
  • After receiving the touch operation reported from the input device corresponding to the normal partition, the system receives the reported event (which includes the input device and the parameters of the touch point, etc.) and then identifies which region it is based on the naming of the input device.
  • FIG. 4 is a schematic diagram of an interface of different areas of a side edge according to an embodiment of the present invention. The correspondence between the block of the side edge area and the position where the application icon is placed and the identification step are specifically described below with reference to FIG. 4:
  • For example, the position (or area) where an application icon is placed corresponding to the block {(Xa, Ya), (Xb, Yb)} in the side edge region is the shaded area with reverse stripes in the second row, first column; the position stored in the correspondence table is the area coordinate of the application icon in the second row, first column. When a touch gesture whose touch point is in the side edge region is received, the coordinates of the first touch point of the touch gesture are determined; the block in which the touch gesture occurs is then determined by finding the block of the side edge region into which the coordinates of the first touch point fall, and the position where the icon of the application to be quickly launched is placed is then queried from the correspondence table.
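  • A minimal sketch of this lookup-and-launch flow; the block boundaries, the table keyed by block id, and the launch call are illustrative assumptions, not the patent's actual implementation.

```java
import java.util.List;
import java.util.Map;

// Hedged sketch: map the first touch point to an edge block, query the
// correspondence table, and launch the application placed at that position.
final class QuickStart {
    record Block(String id, float xa, float ya, float xb, float yb) {
        boolean contains(float x, float y) {
            return x >= xa && x <= xb && y >= ya && y <= yb;
        }
    }
    record IconPosition(int row, int column) {}

    private final List<Block> edgeBlocks;
    private final Map<String, IconPosition> table;   // block id -> icon position

    QuickStart(List<Block> edgeBlocks, Map<String, IconPosition> table) {
        this.edgeBlocks = edgeBlocks;
        this.table = table;
    }

    void onQuickStartGesture(float firstX, float firstY) {
        for (Block block : edgeBlocks) {
            if (block.contains(firstX, firstY)) {
                IconPosition position = table.get(block.id());
                if (position != null) {
                    launchIconAt(position);   // start the program of the icon placed there
                }
                return;
            }
        }
    }

    private void launchIconAt(IconPosition position) {
        // Platform-specific: resolve the icon at (row, column) on the desktop
        // and start its application.
    }
}
```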
  • the above quick-start gesture may be, for example, sliding, or sliding in a preset direction range, or clicking a block in a side edge region.
  • The following is a method for determining whether a detected gesture is an up-or-down sliding gesture. For example, when the coordinates of the first contact are (downX, downY) (the first contact is also the criterion for determining the area where the touch gesture occurs) and the first contact has been pressed for a time (downTime), the touch screen reports the coordinates (currentX, currentY) of the current position of the contact (the second contact) at regular intervals (for example, every 1/85 second), and the distance between the first contact and the second contact is then calculated. After the distance value is calculated, it is further determined whether the distance is greater than a preset threshold; if so, it is determined that the contact has slid. If necessary, the ordinates of the first contact and the second contact can further be compared to determine the direction of the slide.
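  • A minimal sketch of this slide detection, assuming the Euclidean distance between the contacts and a caller-supplied threshold; the threshold value and the 1/85 s reporting period are taken as given from the text, not derived.

```java
// Hedged sketch: decide whether the contact has slid and, if so, in which
// vertical direction, by comparing the reported second contact to the first.
final class SlideDetector {
    enum Result { NONE, SLIDE_UP, SLIDE_DOWN }

    private final double thresholdPx;   // preset threshold from the text (value not specified)

    SlideDetector(double thresholdPx) { this.thresholdPx = thresholdPx; }

    // Called each time the touch screen reports the current contact position
    // (for example, every 1/85 second while the finger stays down).
    Result onReport(float downX, float downY, float currentX, float currentY) {
        double distance = Math.hypot(currentX - downX, currentY - downY);
        if (distance <= thresholdPx) {
            return Result.NONE;                       // not yet a slide
        }
        // Direction from the ordinates: a smaller y is usually "up" on screen.
        return (currentY < downY) ? Result.SLIDE_UP : Result.SLIDE_DOWN;
    }
}
```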
  • In the method for quickly starting an application provided by this embodiment, the application is quickly triggered by using the edge region. Because the definition of the trigger gesture is novel, the user can directly arrange in the correspondence table the position where the application icon is placed, and the side edge region triggers the corresponding application by identifying the application icon placed at the corresponding position, so that the user can quickly launch the application at an appropriate time, for example on the desktop or with the screen off.
  • FIG. 5 is a flowchart of a method for quickly starting an application according to another embodiment of the present invention. As shown in FIG. 5, the method for quickly starting an application provided by this embodiment further includes, after step S100 and before S200 and S300:
  • For example, the current desktop display interface is not the interface of a launched application, such as when the desktop is displayed or the screen is off.
  • If, for example, the current interface displays a music playing interface, the music playing function involves basic functions such as adjusting the volume and sharing. If the side edge region is further defined so that sliding up and down in the side edge area of the terminal adjusts the playback volume, the calls of the different functions corresponding to the up-and-down sliding gestures on different operation interfaces may also be added to the correspondence table.
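  • A minimal sketch of such a per-interface extension of the table, assuming that the currently displayed interface and the gesture direction together select an action; the interface names and actions here are illustrative only.

```java
import java.util.HashMap;
import java.util.Map;

// Hedged sketch: extend the correspondence table so the same edge gesture can
// invoke different functions depending on the interface currently displayed.
final class ContextualEdgeActions {
    enum Gesture { EDGE_SLIDE_UP, EDGE_SLIDE_DOWN }

    private final Map<String, Map<Gesture, Runnable>> actions = new HashMap<>();

    void bind(String interfaceName, Gesture gesture, Runnable action) {
        actions.computeIfAbsent(interfaceName, k -> new HashMap<>()).put(gesture, action);
    }

    void handle(String currentInterface, Gesture gesture) {
        Runnable action = actions.getOrDefault(currentInterface, Map.of()).get(gesture);
        if (action != null) {
            action.run();
        }
    }
}

// Example binding (illustrative): on a music playing interface, edge slides
// adjust the playback volume instead of quick-starting an application.
// contextualEdgeActions.bind("music_player", Gesture.EDGE_SLIDE_UP, volume::increase);
```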
  • FIG. 6 is a flowchart of a method for quickly starting an application according to still another embodiment of the present invention. As shown in FIG. 6, the quick start method of the application provided by this embodiment further includes, after steps S100 and S200 and before S300:
  • For the prompting of the application icon placed at the queried position, the method further includes:
  • the preset effect processing is performed on the placed application icon.
  • The special effect processing includes, for example, adding a background color to the application icon placed at the queried position, adding background blur, shaking the application icon up and down or left and right at a preset frequency, or adding a visible label to the corresponding application icon.
  • In this way, the application can be effectively prevented from being started erroneously due to misoperation or inaccurate operation by the user.
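  • A minimal sketch of one of these prompting effects on Android, assuming the icon is rendered as a View; the highlight color and shake parameters are arbitrary illustration values, not specified by the patent.

```java
import android.animation.ObjectAnimator;
import android.graphics.Color;
import android.view.View;

// Hedged sketch: highlight the queried icon with a background color and a
// small horizontal shake so the user can confirm before the app is started.
final class IconPrompt {
    static void prompt(View iconView) {
        iconView.setBackgroundColor(Color.argb(96, 255, 255, 0));   // translucent highlight

        ObjectAnimator shake = ObjectAnimator.ofFloat(
                iconView, View.TRANSLATION_X, 0f, 12f, -12f, 12f, -12f, 0f);
        shake.setDuration(400);    // milliseconds
        shake.setRepeatCount(2);
        shake.start();
    }
}
```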
  • FIG. 7 is a block diagram showing an exemplary structure of a quick start device of an application according to an embodiment of the present invention, and a quick start device for an application according to an embodiment of the present invention is described in detail below with reference to FIG. As shown in FIG. 7, the quick start device 11 of the application specifically includes:
  • the gesture receiving module 10 is configured to: receive a touch gesture in which the touch point is located in the side edge region, and then send the received touch gesture to the location query module 20;
  • the location query module 20 is configured to: when the received touch gesture is a preset quick-start gesture, query, from the correspondence table, the position of the application icon corresponding to the side edge region where the touch gesture occurs, and send the queried position to the program startup module 30;
  • the program startup module 30 is configured to: start the program of the application icon placed at the acquired position.
  • The output end of the gesture receiving module 10 is connected to the input end of the location query module 20, and the output end of the location query module 20 is connected to the input end of the program startup module 30.
  • FIG. 8 is a block diagram showing an exemplary structure of a quick start device of an application according to another embodiment of the present invention. As shown in FIG. 8, the quick start device 11 of the application specifically includes:
  • the interface identifying unit 21 is configured to: identify an interface displayed by the current desktop;
  • the interface determining unit 22 is configured to: determine whether the interface recognized by the interface identifying unit 21 is an interface of the activated application, and if not, determine whether the received touch gesture is a preset quick start gesture.
  • the output end of the interface recognition unit 21 is connected to the input end of the interface determination unit 22.
  • the quick start device 11 of the application specifically includes:
  • the number determining unit is configured to: determine whether two or more icons are actually placed at the positions for application icons in the correspondence table;
  • the query unit is configured to: if two or more icons are actually placed at the position where the application icon is placed in the relationship table, query the position where the application icon is placed corresponding to the side edge region where the touch gesture occurs;
  • the prompt unit is set to: prompt the application icon placed in the queried location.
  • the quick start device 11 of the application specifically includes:
  • the number determining unit is configured to: determine whether two or more icons are actually placed at the positions for application icons in the correspondence table;
  • the prompting unit is configured to: when the number determining unit determines that two or more icons are actually placed at the positions for application icons in the correspondence table, prompt the application icon placed at the queried position.
  • the output end of the number judging unit is connected to the input end of the query unit, and the output end of the query unit is connected to the input end of the prompt unit.
  • the prompting unit is further configured to:
  • preset effect processing is performed on the placed application icon.
  • the gesture receiving module 10 includes:
  • the coordinate acquiring unit is configured to: acquire coordinates of the first touch point where the touch gesture occurs, and send the acquired coordinates to the area determining unit, where the output end of the coordinate acquiring unit is connected to the input end of the area determining unit;
  • the area determining unit is configured to: determine the area where the acquired coordinates are located as a side edge area where the touch gesture occurs.
  • In the quick start device for an application provided by this embodiment, the application is quickly triggered by using the edge region. Because the definition of the trigger gesture is novel, the user can directly arrange in the correspondence table the position of the application icon to be quickly launched, and the side edge area triggers the corresponding application by identifying the application icon placed at the corresponding position, so that the user can quickly launch the application at an appropriate time.
  • The quick start gesture is sliding within a preset direction range in the side edge area, or clicking the side edge area.
  • When the quick start gesture is sliding within a preset direction range, the location query module is configured to determine that the received touch gesture is a preset quick start gesture in the following manner:
  • the location query module is configured to calculate the distance between the first contact and the second contact of the touch gesture in the following manner, where:
  • downX is the abscissa of the first contact
  • downY is the ordinate of the first contact
  • currentX is the abscissa of the second contact
  • currentY is the ordinate of the second contact.
  • the location query module is configured to calculate the distance between the first contact and the second contact of the touch gesture using only the ordinates, where:
  • downY is the ordinate of the first contact
  • currentY is the ordinate of the second contact
  • FIG. 9 is a block diagram showing an exemplary structure of a mobile terminal according to an embodiment of the present invention. The mobile terminal 100 shown in FIG. 9 includes the quick start device 11 of the application described above.
  • the mobile terminal can be implemented in various forms.
  • The terminal described in the present invention may include mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (Personal Digital Assistant), a PAD (tablet), a PMP (Portable Multimedia Player), and a navigation device, as well as fixed terminals such as digital TVs and desktop computers. In the following, it is assumed that the terminal is a mobile terminal.
  • However, those skilled in the art will appreciate that configurations according to embodiments of the present invention can also be applied to fixed-type terminals, except for any elements specifically intended for mobile purposes.
  • a mobile terminal includes an input device, a processor 903, a display screen 904, and a memory 905.
  • the input device is a touch screen 2010 that includes a touch panel 901 and a touch controller 902.
  • the input device may also be a non-touch input device (eg, an infrared input device, etc.) or the like.
  • Touch controller 902 can be a single application specific integrated circuit (ASIC), which can include one or more processor subsystems, which can include one or more ARM processors or other processors with similar functions and capabilities.
  • The touch controller 902 is mainly used for receiving touch signals generated by the touch panel 901, processing them, and sending the result to the processor 903 of the mobile terminal.
  • processing is, for example, analog-to-digital conversion of a physical input signal, processing to obtain touch point coordinates, processing to obtain a touch duration, and the like.
  • the processor 903 receives the output of the touch controller 902, performs processing, and performs an action based on the output.
  • the actions include, but are not limited to, moving an object such as a table or indicator, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, as a selection, executing an instruction, operating a peripheral device coupled to the host device Answering a phone call, making a call, terminating a phone call, changing volume or audio settings, storing information related to phone communications (eg, address, frequently used number, received call, missed call), logging in to a computer or computer network, allowing authorization An individual accesses a restricted area of a computer or computer network, records a user profile associated with a user preferences configuration of a computer desktop, allows access to network content, launches a particular program, encrypts or decodes a message, and the like.
  • the processor 903 is also coupled to the display screen 904.
  • Display 904 is used to provide a UI to a user of the device.
  • processor 903 can be a separate component from touch controller 902. In other embodiments, the processor 903 can be a composite component with the touch controller 902.
  • the touch panel 901 is provided with a discrete motion sensor 906, such as a capacitive sensor, a resistive sensor, a force sensor, an optical sensor, or the like.
  • the touch panel 901 includes an electrode array made of a conductive material in a lateral direction and a longitudinal direction.
  • If the touch controller 902 uses self-capacitance scanning, it scans the M rows and N columns separately and locates the coordinates of the finger on the touch screen by calculating from the signal of each row and each column, so the number of scans is M+N. If the touch controller 902 uses multi-contact mutual-capacitance scanning, it scans the intersections of the rows and columns, so the number of scans is M×N.
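  • For instance, on a hypothetical 16-row by 10-column electrode array, self-capacitance scanning would take 16 + 10 = 26 scans per frame, whereas mutual-capacitance scanning of every row-column intersection would take 16 × 10 = 160 scans.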
  • When the user's finger touches the panel, the touch panel 901 generates a touch signal (an electrical signal) and sends it to the touch controller 902.
  • the touch controller 902 can obtain the coordinates of the touched point by scanning.
  • The touch panel 901 of the touch screen 2010 is physically an independent coordinate positioning system. After the touch point coordinates of a touch are reported to the processor 903, the processor 903 converts them into pixel coordinates of the display screen 904 so as to correctly identify the input operation.
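  • A minimal sketch of this conversion, assuming a simple linear scaling from the touch panel's native coordinate range to the display's pixel resolution; real devices may apply additional calibration, and the resolutions used are illustrative.

```java
// Hedged sketch: convert raw touch-panel coordinates to display pixel
// coordinates with a linear scale.
final class TouchToPixel {
    private final float panelWidth, panelHeight;     // touch panel coordinate range
    private final float displayWidth, displayHeight; // display resolution in pixels

    TouchToPixel(float panelWidth, float panelHeight, float displayWidth, float displayHeight) {
        this.panelWidth = panelWidth;
        this.panelHeight = panelHeight;
        this.displayWidth = displayWidth;
        this.displayHeight = displayHeight;
    }

    float[] toPixels(float touchX, float touchY) {
        float pixelX = touchX * displayWidth / panelWidth;
        float pixelY = touchY * displayHeight / panelHeight;
        return new float[] { pixelX, pixelY };
    }
}
```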
  • The touch controller 902 is configured to receive a touch gesture in which the touch point is located in the side edge region in the following manner: acquiring the coordinates of the first touch point where the touch gesture occurs, and determining the area where the acquired coordinates are located as the area where the touch gesture occurs.
  • The touch controller 902 determines whether a touch operation occurs in a normal partition or in an edge region (i.e., a special partition) by determining the area into which the touch point of the touch operation falls.
  • The coordinates of the touch point of the touch operation are obtained through the touch screen, and it is determined into which partition those coordinates fall.
  • If the touch operation occurs in a special partition, the touch operation is reported through the input device corresponding to that special partition, thereby completing the determination.
  • If the touch operation occurs in a normal partition, the corresponding effect is generated according to the touch operation and conventional processing is performed.
  • The touch controller 902 receives the touch event through the touch screen, determines whether the touch operation occurs in the A area or the C area, and reports the event through the device file node of the A area or of the C area.
  • The events are read from the device files of the A area and the C area and processed (for example, coordinate calculation); A-area and C-area events are distinguished by device ID, and finally the A-area and C-area events are dispatched separately.
  • A-area events follow the original flow and are processed in the usual way, that is, through the multi-channel mechanism; C-area events are dispatched through a dedicated C-area channel, entering at the native port and being output at the system port to the C-area event system service, which listens for C-area events through a listener and then reports them to each application through the external interface for C-area events.
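  • A minimal sketch of this kind of listener-based dispatch for C-area (edge) events is shown below; the class and method names (EdgeEventService, EdgeEventListener, dispatch) are illustrative assumptions rather than the actual system interfaces.

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

// Minimal sketch of listener-based dispatch for C-area (edge) events.
// EdgeTouchEvent, EdgeEventListener and EdgeEventService are illustrative
// assumptions, not the actual system interfaces of the terminal.
final class EdgeTouchEvent {
    final int x;
    final int y;
    final long downTimeMillis;

    EdgeTouchEvent(int x, int y, long downTimeMillis) {
        this.x = x;
        this.y = y;
        this.downTimeMillis = downTimeMillis;
    }
}

interface EdgeEventListener {
    void onEdgeEvent(EdgeTouchEvent event);
}

final class EdgeEventService {
    private final List<EdgeEventListener> listeners = new CopyOnWriteArrayList<>();

    void registerListener(EdgeEventListener listener) {
        listeners.add(listener);
    }

    // Called when a C-area event arrives from the dedicated channel.
    void dispatch(EdgeTouchEvent event) {
        for (EdgeEventListener listener : listeners) {
            listener.onEdgeEvent(event); // report to each registered application module
        }
    }
}
```

  • In this sketch an application module simply registers a listener with the service and receives every C-area event that the service dispatches.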
  • The touch controller 902 is further configured to set the partitions.
  • The number of special partitions is two; they are located on both sides of the touch area, and the remaining part of the touch area is the normal partition.
  • The normal partition includes an A area and a bottom B area, where the A area is the operable area used to detect touch point coordinates and the B area is a virtual key area used to detect the menu key, the Home key, the return key, and so on; the two special partitions are located at the edges of the touch area, on both sides of the A area.
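  • One possible representation of this A/B/C layout and of the hit test that decides which partition a touch point falls into is sketched below; the screen size, edge-strip width, and virtual-key-bar height are illustrative assumptions.

```java
// Minimal sketch of the A/B/C partition layout and the hit test used to decide
// where a touch point falls. The screen size (1080 x 1920), edge width (60 px)
// and virtual-key bar height (120 px) are illustrative assumptions.
public final class TouchPartitions {
    enum Partition { A_NORMAL, B_VIRTUAL_KEYS, C_EDGE }

    private final int screenWidth;
    private final int screenHeight;
    private final int edgeWidth;      // width of each C-area strip on the left and right
    private final int keyBarHeight;   // height of the B area at the bottom

    TouchPartitions(int screenWidth, int screenHeight, int edgeWidth, int keyBarHeight) {
        this.screenWidth = screenWidth;
        this.screenHeight = screenHeight;
        this.edgeWidth = edgeWidth;
        this.keyBarHeight = keyBarHeight;
    }

    /** Returns the partition containing the touch point (x, y) in display pixels. */
    Partition hitTest(int x, int y) {
        if (x < edgeWidth || x >= screenWidth - edgeWidth) {
            return Partition.C_EDGE;           // special partitions on both sides
        }
        if (y >= screenHeight - keyBarHeight) {
            return Partition.B_VIRTUAL_KEYS;   // menu / Home / return keys
        }
        return Partition.A_NORMAL;             // ordinary operable area
    }

    public static void main(String[] args) {
        TouchPartitions partitions = new TouchPartitions(1080, 1920, 60, 120);
        System.out.println(partitions.hitTest(30, 900));   // C_EDGE
        System.out.println(partitions.hitTest(540, 1850)); // B_VIRTUAL_KEYS
        System.out.println(partitions.hitTest(540, 900));  // A_NORMAL
    }
}
```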
  • When the touch controller 902 determines that the touch falls in an edge region, the touch point coordinates at different resolutions are converted into LCD coordinates through the EventHub function, and a single channel function (such as serverchannel, clientchannel, and the like) is defined.
  • Through this channel the event is transmitted to the event manager (TouchEventManager), and through the monitoring of the listener the event is transmitted to multiple responding application modules simultaneously or one by one, or it can be transmitted to only one responding application module.
  • Application modules include, for example, the camera and the gallery; different application modules generate the corresponding operations.
  • The processor 903 is configured to: when it is determined that the received touch gesture is the preset quick start gesture, query, from the correspondence table stored in the memory 905, the location for placing an application icon that corresponds to the side edge region in which the touch gesture occurs, and start the program of the application icon placed at the acquired location.
  • The processor 903 is further configured to identify the interface displayed on the current desktop and, when the identified interface is not the interface of a launched application, determine whether the received touch gesture is the preset quick start gesture.
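  • A minimal sketch of this quick-start flow (gesture check, correspondence-table lookup, launch) is given below; the class and method names, and the way an icon location maps to a package name, are illustrative assumptions rather than the actual implementation.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of the quick-start flow: check the gesture, look up the icon
// location bound to the edge region in the correspondence table, then launch the
// application placed at that location. All names and the mapping from icon
// location to package name are illustrative assumptions.
public final class QuickStartController {
    /** Correspondence table: side-edge region id -> icon location id. */
    private final Map<Integer, Integer> regionToIconLocation = new HashMap<>();
    /** Desktop model: icon location id -> package name of the placed icon. */
    private final Map<Integer, String> iconLocationToPackage = new HashMap<>();

    void bind(int edgeRegionId, int iconLocationId) {
        regionToIconLocation.put(edgeRegionId, iconLocationId);
    }

    void placeIcon(int iconLocationId, String packageName) {
        iconLocationToPackage.put(iconLocationId, packageName);
    }

    /** Handles an edge gesture; returns the launched package, or null if nothing matched. */
    String onEdgeGesture(int edgeRegionId, boolean isQuickStartGesture, boolean desktopShown) {
        if (!isQuickStartGesture || !desktopShown) {
            return null; // only handle the preset quick-start gesture outside a launched application
        }
        Integer location = regionToIconLocation.get(edgeRegionId);
        if (location == null) {
            return null;
        }
        String packageName = iconLocationToPackage.get(location);
        if (packageName != null) {
            launch(packageName);
        }
        return packageName;
    }

    private void launch(String packageName) {
        // In a real terminal this would start the application's launcher activity.
        System.out.println("Launching " + packageName);
    }
}
```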
  • The processor 903 is further configured to determine whether two or more icons are actually placed at the location for placing an application icon recorded in the correspondence table; if two or more icons are actually placed at that location, the user is prompted with the application icon placed at the queried location.
  • The prompt is given by performing preset special-effect processing on the placed application icon.
  • The special-effect processing includes, for example, adding a background color to the application icon placed at the queried location, adding background blur, shaking the application icon up and down or left and right at a preset frequency, or adding a visible label to the corresponding application icon.
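  • As one hedged illustration of such a prompt, assuming the icon is rendered as an Android View, the sketch below shakes the queried icon left and right at a fixed rate; the amplitude, duration, and repeat count are illustrative assumptions.

```java
import android.view.View;
import android.view.animation.Animation;
import android.view.animation.TranslateAnimation;

// Minimal sketch (assuming the icon is an Android View): shake the queried
// application icon left and right to prompt the user. Amplitude, duration and
// repeat count are illustrative assumptions.
final class IconHighlighter {
    static void shake(View iconView) {
        TranslateAnimation shake = new TranslateAnimation(-10, 10, 0, 0); // horizontal shake
        shake.setDuration(100);                 // one left-to-right sweep takes 100 ms
        shake.setRepeatCount(5);                // repeat a few times
        shake.setRepeatMode(Animation.REVERSE); // reverse direction on each repeat
        iconView.startAnimation(shake);
    }
}
```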
  • The memory 905 stores the correspondence table between different areas of the side edges and different locations where application icons are placed.
  • The embodiment of the invention further discloses a computer program comprising program instructions which, when executed by a mobile terminal, cause the mobile terminal to execute the quick start method of any of the above applications.
  • The embodiment of the invention also discloses a carrier carrying the computer program.
  • The embodiments of the invention provide a mobile terminal and a quick start method and device for its application programs. An application can be triggered quickly by using the edge region, and the trigger gesture is defined in a novel way: the user can directly arrange the applications to be quickly launched at the application-icon locations defined for the side edge areas in the correspondence table.
  • The corresponding application is then triggered by identifying the application icon placed at the corresponding location, so that the user can quickly launch the application at an appropriate time, such as on the desktop or with a black screen. Therefore, the present invention has strong industrial applicability.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are a mobile terminal and a quick start method and device for an application program thereof, belonging to the technical field of mobile terminals. The quick start method for an application program is applied to a mobile terminal. A correspondence table between different areas of a side edge and different locations where application icons are placed is stored in the mobile terminal. The method comprises: receiving a touch gesture whose touch point is located in a side edge area (S100); when the received touch gesture is a preset quick start gesture, querying, in the correspondence table, the location for placing an application icon that corresponds to the side edge area in which the touch gesture occurs (S200); and starting the program of the application icon placed at the acquired location (S300). The method allows a user to arrange applications to be quickly launched at the application-icon locations defined in the correspondence table, thereby helping the user to quickly launch an application program at an appropriate time and improving the user experience.
PCT/CN2016/079481 2015-04-29 2016-04-15 Terminal mobile et procédé de démarrage rapide et dispositif pour programme d'application associé WO2016173414A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510210105.4A CN104850342A (zh) 2015-04-29 2015-04-29 移动终端及其应用程序的快速启动方法和装置
CN201510210105.4 2015-04-29

Publications (1)

Publication Number Publication Date
WO2016173414A1 true WO2016173414A1 (fr) 2016-11-03

Family

ID=53850019

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/079481 WO2016173414A1 (fr) 2015-04-29 2016-04-15 Terminal mobile et procédé de démarrage rapide et dispositif pour programme d'application associé

Country Status (2)

Country Link
CN (1) CN104850342A (fr)
WO (1) WO2016173414A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111356217A (zh) * 2020-02-17 2020-06-30 Oppo广东移动通信有限公司 终端控制方法、装置、终端设备以及存储介质

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104850342A (zh) * 2015-04-29 2015-08-19 努比亚技术有限公司 移动终端及其应用程序的快速启动方法和装置
CN105159587B (zh) * 2015-08-27 2018-03-27 广东欧珀移动通信有限公司 一种控制应用的方法及移动终端
CN106484269A (zh) * 2015-08-31 2017-03-08 中兴通讯股份有限公司 一种应用程序使用方法和装置、及终端
CN106610777A (zh) * 2015-10-23 2017-05-03 小米科技有限责任公司 启动应用程序的方法、装置及移动终端
CN105487805B (zh) * 2015-12-01 2020-06-02 小米科技有限责任公司 对象操作方法及装置
CN106919415A (zh) * 2015-12-28 2017-07-04 阿里巴巴集团控股有限公司 启动应用程序的方法、装置及电子设备
CN105955608B (zh) * 2016-04-22 2020-06-12 北京金山安全软件有限公司 一种快捷控制方法、装置及电子设备
CN107132967B (zh) * 2017-04-26 2020-09-01 努比亚技术有限公司 一种应用的启动方法及装置、存储介质、终端
CN108509131B (zh) * 2018-03-28 2021-04-06 维沃移动通信有限公司 一种应用程序启动方法及终端
CN109658883A (zh) * 2018-11-27 2019-04-19 努比亚技术有限公司 屏幕显示方法、终端及计算机可读存储介质
CN111741158A (zh) * 2019-12-19 2020-10-02 张鹏辉 一种在手机屏画长线实现快捷操作的方法
CN110868498A (zh) * 2019-12-21 2020-03-06 张鹏辉 一种在手机屏左右侧边滑动实现快捷操作的方法
CN111953905B (zh) * 2020-08-26 2021-11-16 维沃移动通信有限公司 美颜功能开启方法、装置、电子设备及可读存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103885674A (zh) * 2012-12-20 2014-06-25 卡西欧计算机株式会社 输入装置、输入操作方法以及电子设备
CN104156073A (zh) * 2014-08-29 2014-11-19 深圳市中兴移动通信有限公司 移动终端及其操作方法
CN104182164A (zh) * 2013-05-27 2014-12-03 赛龙通信技术(深圳)有限公司 一种电子装置及通过侧边框操作界面的方法
CN104238837A (zh) * 2013-06-23 2014-12-24 北京智膜科技有限公司 基于信息交互设备触控屏幕的控制装置及方法
US20150046871A1 (en) * 2013-08-09 2015-02-12 Insyde Software Corp. System and method for re-sizing and re-positioning application windows in a touch-based computing device
CN104850342A (zh) * 2015-04-29 2015-08-19 努比亚技术有限公司 移动终端及其应用程序的快速启动方法和装置

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019600A (zh) * 2012-12-14 2013-04-03 广东欧珀移动通信有限公司 一种移动终端开启应用程序的方法及系统
CN104063164B (zh) * 2013-03-22 2018-02-27 腾讯科技(深圳)有限公司 屏幕控制的方法及装置


Also Published As

Publication number Publication date
CN104850342A (zh) 2015-08-19

Similar Documents

Publication Publication Date Title
WO2016173414A1 (fr) Terminal mobile et procédé de démarrage rapide et dispositif pour programme d'application associé
WO2016169483A1 (fr) Terminal mobile et procédé d'ajustement de fonction à l'aide de la région de trame virtuelle associée
WO2016169524A1 (fr) Procédé et dispositif de réglage rapide de luminosité d'écran, terminal mobile et support d'informations
WO2016155550A1 (fr) Procédé de commutation d'application pour terminal sans cadre et terminal sans cadre
WO2017071424A1 (fr) Terminal mobile et procédé de partage de fichier
WO2016029766A1 (fr) Terminal mobile et son procede de fonctionnement et support de stockage informatique
WO2016155424A1 (fr) Procédé de commutation d'application pour terminal mobile, terminal mobile et support de stockage informatique
WO2017143847A1 (fr) Dispositif et procédé d'affichage à écrans multiples d'application associée, et terminal
WO2016169480A1 (fr) Procédé et dispositif de commande de terminal de mobile, et support de stockage informatique
WO2016034055A1 (fr) Terminal mobile et son procédé de fonctionnement et support de stockage informatique
WO2016119648A1 (fr) Procédé et appareil pour empêcher un effleurement accidentel d'un terminal mobile
WO2016119635A1 (fr) Procédé et dispositif de prévention de toucher par erreur pour un terminal mobile
WO2016173468A1 (fr) Procédé et dispositif d'opération combinée, procédé de fonctionnement d'écran tactile et dispositif électronique
CN106210328B (zh) 信息显示装置及方法
WO2017071456A1 (fr) Procédé de traitement de terminal, terminal, et support de stockage
WO2016155454A1 (fr) Terminal mobile et procédé de reconnaissance de défilement pour une zone de trame virtuelle d'un terminal mobile
WO2016155597A1 (fr) Procédé et dispositif de commande d'application basée sur un terminal sans cadre
WO2016155434A1 (fr) Procédé et dispositif pour reconnaître la tenue d'un terminal mobile, support de stockage et terminal
WO2016173498A1 (fr) Procédé et dispositif permettant d'entrer un nombre ou un symbole au moyen d'un cadre d'écran tactile, terminal et support
WO2017071481A1 (fr) Terminal mobile et procédé de mise en œuvre d'écran divisé
WO2016161986A1 (fr) Procédé et appareil de reconnaissance d'opération, terminal mobile et support de stockage informatique
WO2016119650A1 (fr) Procédé et appareil permettant de prévenir un toucher accidentel d'un terminal mobile
WO2017020771A1 (fr) Dispositif et procédé de commande de terminal
WO2016155509A1 (fr) Procédé et dispositif permettant de déterminer un mode de tenue d'un terminal mobile
CN106101423B (zh) 分屏区域大小调整装置及方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16785845

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16785845

Country of ref document: EP

Kind code of ref document: A1