WO2016169483A1 - Mobile Terminal and Method for Implementing Function Adjustment Using a Virtual Border Area (移动终端及其利用虚拟边框区域实现功能调节的方法) - Google Patents

Mobile terminal and method for implementing function adjustment using a virtual border area

Publication number
WO2016169483A1
Authority
WO
WIPO (PCT)
Prior art keywords: area, virtual, virtual border, partition, mobile terminal
Application number: PCT/CN2016/079794
Other languages: English (en), French (fr)
Inventor
陈小翔 (Chen Xiaoxiang)
马英超 (Ma Yingchao)
Original Assignee
努比亚技术有限公司 (Nubia Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by 努比亚技术有限公司 (Nubia Technology Co., Ltd.)
Priority to US 15/567,569, published as US20180113591A1
Publication of WO2016169483A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W88/00Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W88/02Terminal devices

Definitions

  • the present invention relates to the field of communications technologies, and in particular, to a mobile terminal and a method for implementing function adjustment using a virtual border area.
  • With the development of terminal devices such as mobile phones and personal digital assistants (PDAs), and their increasingly powerful functions, the terminal device has become a powerful data processing tool; however, having to handle so much data and so many functions has caused a lot of trouble for users.
  • For example, for the effect processing of images, a user needs to choose among many filter effects to achieve the desired one, and the whole process is very complicated.
  • the main purpose of the present invention is to provide a mobile terminal and a method for realizing function adjustment by using a virtual border area, so as to overcome the cumbersome and inconvenient function-operation procedures of the prior art.
  • the present invention provides a method for realizing function adjustment by using a virtual border area, where the virtual border area includes a first virtual border partition and a second virtual border partition, and the method includes the following steps:
  • sensing a touch event generated by a contact, and determining whether the touch event is a sliding event; if yes, further determining the direction attribute of the sliding event and the area in which it occurs;
  • if the sliding event occurs in the first virtual border partition, the function item currently displayed in the first virtual border partition is switched according to the direction attribute; if the sliding event occurs in the second virtual border partition, the function parameter of the current function item is adjusted according to the direction attribute.
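The dispatch described in the steps above can be sketched as follows. This is a hypothetical illustration: the claims do not state the exact rule for distinguishing a tap from a slide, so the movement threshold `SLIDE_THRESHOLD` and the boolean partition flag are assumptions for the sketch.

```python
# Assumed pixel threshold of vertical movement before a touch counts as a slide.
SLIDE_THRESHOLD = 10

def handle_touch(x0, y0, x1, y1, in_first_partition):
    """(x0, y0) is the initial contact position, (x1, y1) the current one.

    Returns a description of the action the claimed method would take.
    """
    if abs(y1 - y0) < SLIDE_THRESHOLD:
        return "tap"  # not a sliding event
    # Screen y coordinates grow downward, so a smaller current y is an upward slide.
    direction = "up" if y1 < y0 else "down"
    if in_first_partition:
        return f"switch function item ({direction})"
    return f"adjust parameter of current item ({direction})"
```

For example, a 60-pixel upward drag inside the first partition would be reported as a function-item switch, while the same drag in the second partition would adjust the current item's parameter.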
  • the method for determining whether the touch event belongs to a sliding event is specifically:
  • the method for determining the direction attribute of the sliding event is specifically:
  • the direction attribute of the sliding event is determined by comparing coordinate values of the contact in the vertical direction of the initial coordinate position and the current coordinate position.
  • the method for determining the area in which the sliding event occurs uses the horizontal coordinate of the contact together with the following quantities:
  • W is the width of the screen;
  • CW1 is the width of the first virtual border partition;
  • CW2 is the width of the second virtual border partition.
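A minimal sketch of the area test implied by W, CW1, and CW2, assuming a layout in which the first virtual border partition is a strip of width CW1 at each side edge and the second partition is the remaining central area; the concrete pixel values are invented examples, not taken from the patent:

```python
W = 1080            # screen width in pixels (example value)
CW1 = 60            # width of each first-virtual-border-partition strip (example value)
CW2 = W - 2 * CW1   # width of the second virtual border partition

def partition_of(x):
    """Return which partition the horizontal contact coordinate x falls in."""
    if x < CW1 or x >= W - CW1:
        return "first"   # left or right border strip
    return "second"      # central touch-operable area
```

With these example widths, CW2 works out to 960 pixels, and only contacts within 60 pixels of either edge are routed to the first partition.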
  • the method further includes:
  • the touchable area of the touch screen of the mobile terminal is divided into a first virtual border partition and a second virtual border partition.
  • the first virtual border partition is located at one side edge or both side edges of the touch screen, and the remaining touch-operable area of the touch screen, excluding the first virtual border partition, is the second virtual border partition.
  • when first virtual border partitions are divided on both the left and right sides, the two partitions are respectively assigned different processing methods for touch events.
  • the method further includes the step of dividing the virtual border area on the touch screen by using a fixed dividing manner:
  • the position and size of the virtual border area are defined when the driver is initialized.
  • the method further includes the step of dividing the virtual border area on the touch screen by using a free setting manner:
  • the method further includes:
  • identifying, from the name of the virtual input device, whether the current user touch area is the first virtual border partition or the second virtual border partition;
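A hedged sketch of identifying the touched partition from the name of a registered virtual input device, as described above. The device names `vbr_first` and `vbr_second` are invented for illustration; the patent does not specify a naming scheme.

```python
# Hypothetical mapping from registered virtual-input-device names to partitions.
VIRTUAL_DEVICES = {
    "vbr_first": "first virtual border partition",
    "vbr_second": "second virtual border partition",
}

def partition_from_device(device_name):
    # The sliding recognition unit looks up the name of the device that
    # reported the event to decide which partition the touch belongs to.
    return VIRTUAL_DEVICES.get(device_name, "unknown")
```

Events reported by any other device (e.g. the main touchscreen) would fall through to `"unknown"` and be handled by the ordinary touch path.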
  • the present invention further provides a mobile terminal, wherein the first virtual border partition and the second virtual border partition are divided on the touch screen, and the mobile terminal includes:
  • the bottom-layer reporting unit is configured to report the coordinate position information of the contact in real time upon sensing a touch event generated by a contact;
  • the sliding recognition unit is configured to determine, according to the coordinate position information of the contact reported by the bottom-layer reporting unit, whether the touch event is a sliding event, and if yes, to further determine the direction attribute of the sliding event and the area in which it occurs;
  • the function switching unit is configured to, when it is determined that the sliding event occurs in the first virtual border partition, switch the function item currently displayed in the first virtual border partition according to the direction attribute, and control the second virtual border partition to update its display to the corresponding function-parameter adjustment control;
  • the parameter adjustment unit is configured to adjust the function parameter of the current function item according to the direction attribute when it is determined that the sliding event occurs in the second virtual border partition.
  • the sliding recognition unit further includes:
  • the sliding direction determining module is configured to determine a direction attribute of the sliding event by comparing coordinate values of the contact in the vertical direction of the initial coordinate position and the current coordinate position.
  • the sliding recognition unit further includes:
  • the event area determining module is configured to determine, according to the horizontal coordinate value of the contact and the position and size information of the first and second virtual border partitions, whether the sliding event occurs in the first virtual border partition or in the second virtual border partition.
  • the mobile terminal further includes:
  • the virtual border area fixed dividing unit is configured to divide the touchable area of the touch screen of the mobile terminal into a first virtual border partition and a second virtual border partition.
  • the first virtual border partition is located at one side edge or both side edges of the touch screen, and the remaining touch-operable area of the touch screen, excluding the first virtual border partition, is the second virtual border partition.
  • when both side edges of the touch-operable area of the touch screen are divided into first virtual border partitions, the virtual border area fixed dividing unit is configured to assign the left and right first virtual border partitions different processing methods for touch events.
  • the virtual border area fixed dividing unit is configured to define the position and size of the virtual border area when the driver is initialized.
  • the mobile terminal further includes:
  • the virtual border area setting interface is used to create and modify the number, position and size of the virtual border area.
  • the virtual border area fixed dividing unit is configured to register two virtual input devices corresponding respectively to the first virtual border partition and the second virtual border partition;
  • the sliding recognition unit is configured to identify, from the name of the virtual input device, whether the current user touch area is the first virtual border partition or the second virtual border partition;
  • the parameter adjustment unit is configured to determine to perform different processing modes based on different touch zones.
  • In the present invention, the left and right portions of the virtual border area serve respectively as a function-switching area and a function-parameter adjustment area: sliding in the function-switching area switches the currently adjustable function item, and sliding in the function-parameter adjustment area adjusts the parameter value of the current function item. This makes it much easier for the user to quickly find the function item to be adjusted and quickly set its specific parameter value, which simplifies the operation procedure and greatly improves the user experience.
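The interaction described above can be sketched end to end as a small stateful controller: one gesture area cycles through function items, the other adjusts the current item's value. The function items, value scale, and step size are illustrative assumptions, not taken from the patent.

```python
class VirtualBorderController:
    """Sketch of the claimed two-area interaction (assumed items and scale)."""

    def __init__(self):
        self.items = ["brightness", "volume", "filter"]   # assumed example items
        self.index = 0
        self.params = {name: 50 for name in self.items}   # assumed 0-100 scale

    @property
    def current_item(self):
        return self.items[self.index]

    def slide_switch_area(self, direction):
        # Slide in the function-switching (first) partition: change the item.
        step = -1 if direction == "up" else 1
        self.index = (self.index + step) % len(self.items)

    def slide_adjust_area(self, direction):
        # Slide in the parameter-adjustment (second) partition: change the value.
        step = 5 if direction == "up" else -5
        name = self.current_item
        self.params[name] = max(0, min(100, self.params[name] + step))
```

Usage: a downward slide in the switching area moves from "brightness" to "volume", after which an upward slide in the adjustment area raises the volume parameter by one step, matching the camera-style scenario the description later illustrates.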
  • FIG. 1 is a schematic structural diagram of hardware of a mobile terminal that implements various embodiments of the present invention
  • FIG. 2 is a schematic diagram of a wireless communication system of the mobile terminal shown in FIG. 1;
  • FIG. 3 is a schematic diagram of a conventional touch screen division manner of a mobile terminal
  • FIG. 4 is a flowchart of a touch operation method of a mobile terminal according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of dividing a C area by a fixed manner according to an embodiment of the present invention.
  • FIG. 6 is another schematic diagram of dividing a C area by a fixed manner according to an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of dividing a C area by a free setting method according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram showing a display effect of a touch screen under a system desktop according to an embodiment of the present invention.
  • FIG. 9 is a schematic diagram of a display effect of a touch screen in a camera application scenario according to an embodiment of the present invention.
  • FIG. 10 is a schematic diagram of a C-zone event processing system according to an embodiment of the present invention.
  • FIG. 11 is a flowchart of a C area slide recognition method according to an embodiment of the present invention.
  • FIG. 12 is a schematic diagram of movement of a contact point in a C area according to an embodiment of the present invention
  • FIG. 13 is a flowchart of a method for recognizing a touch event in different virtual border areas according to an embodiment of the present invention
  • Figure 14 is a schematic view showing the sizes of the C area and the A area in the embodiment of the present invention.
  • FIG. 15 is a flowchart of a method for implementing function adjustment by using a virtual border area according to an embodiment of the present invention.
  • FIG. 16 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention.
  • the mobile terminal can be implemented in various forms.
  • the terminal described in the present invention may include mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet), a PMP (portable multimedia player), and a navigation device, as well as fixed terminals such as digital TVs and desktop computers.
  • In the following description, it is assumed that the terminal is a mobile terminal.
  • those skilled in the art will appreciate that configurations in accordance with embodiments of the present invention can also be applied to fixed-type terminals, except for components that are specifically intended for mobile purposes.
  • FIG. 1 is a schematic diagram showing the hardware structure of a mobile terminal that implements various embodiments of the present invention.
  • the mobile terminal 100 may include a wireless communication unit 110, an A/V (Audio/Video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. and many more.
  • Figure 1 illustrates a mobile terminal having various components, but it should be understood that not all illustrated components are required to be implemented. More or fewer components can be implemented instead. The elements of the mobile terminal will be described in detail below.
  • Wireless communication unit 110 typically includes one or more components that permit radio communication between mobile terminal 100 and a wireless communication system or network.
  • the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
  • the broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel.
  • the broadcast channel can include a satellite channel and/or a terrestrial channel.
  • the broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to the terminal.
  • the broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like.
  • the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal.
  • the broadcast associated information may also be provided via a mobile communication network, and in this case, the broadcast associated information may be received by the mobile communication module 112.
  • the broadcast signal may exist in various forms, for example, it may exist in the form of Digital Multimedia Broadcasting (DMB) Electronic Program Guide (EPG), Digital Video Broadcasting Handheld (DVB-H) Electronic Service Guide (ESG), and the like.
  • the broadcast receiving module 111 can receive broadcast signals using various types of broadcast systems. In particular, it can receive digital broadcasts by using digital broadcast systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), the forward link media (MediaFLO) data broadcasting system, and integrated services digital broadcasting-terrestrial (ISDB-T).
  • the broadcast receiving module 111 can be constructed as various broadcast systems suitable for providing broadcast signals as well as the above-described digital broadcast system.
  • the broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
  • the mobile communication module 112 transmits the radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server.
  • Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received in accordance with text and/or multimedia messages.
  • the wireless internet module 113 supports wireless internet access of the mobile terminal.
  • the module can be internally or externally coupled to the terminal.
  • the wireless Internet access technologies involved in the module may include WLAN (wireless LAN, Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high speed downlink packet access), and the like.
  • the short range communication module 114 is a module for supporting short range communication.
  • Some examples of short-range communication technologies include BluetoothTM, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wide Band (UWB), ZigbeeTM, and the like.
  • the location information module 115 is a module for checking or acquiring location information of the mobile terminal.
  • a typical example of a location information module is GPS (Global Positioning System).
  • the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information to accurately calculate three-dimensional current position information based on longitude, latitude, and altitude.
  • the method for calculating position and time information uses three satellites and corrects the calculated position and time information errors by using another satellite.
  • the GPS module 115 is capable of calculating speed information by continuously calculating current position information in real time.
  • the A/V input unit 120 is for receiving an audio or video signal.
  • the A/V input unit 120 may include a camera 121 and a microphone 122; the camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 151.
  • the image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the configuration of the mobile terminal.
  • the microphone 122 can receive sound (audio data) via a microphone in an operation mode of a telephone call mode, a recording mode, a voice recognition mode, and the like, and can process such sound as audio data.
  • the processed audio (voice) data can be converted to a format output that can be transmitted to the mobile communication base station via the mobile communication module 112 in the case of a telephone call mode.
  • the microphone 122 can implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated during the process of receiving and transmitting audio signals.
  • the user input unit 130 may generate key input data according to a command input by the user to control various operations of the mobile terminal.
  • the user input unit 130 allows the user to input various types of information, and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. caused by contact), a scroll wheel, a rocker, and the like.
  • In particular, when the touch pad is overlaid on the display unit 151 in the form of a layer, a touch screen can be formed.
  • the sensing unit 140 detects the current state of the mobile terminal 100 (e.g., the open or closed state of the mobile terminal 100), the location of the mobile terminal 100, the presence or absence of user contact (i.e., touch input) with the mobile terminal 100, the orientation of the mobile terminal 100, and the acceleration or deceleration movement and direction of the mobile terminal 100, and generates commands or signals for controlling the operation of the mobile terminal 100.
  • the sensing unit 140 can sense whether the slide type phone is turned on or off.
  • the sensing unit 140 can detect whether the power supply unit 190 provides power or whether the interface unit 170 is coupled to an external device.
  • Sensing unit 140 may include proximity sensor 1410 which will be described below in connection with a touch screen.
  • the interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, a headphone port, and the like.
  • the identification module may store various information for verifying the user's use of the mobile terminal 100, and may include a user identification module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like.
  • the device having the identification module may take the form of a smart card, and thus the identification device may be connected to the mobile terminal 100 via a port or other connection device.
  • the interface unit 170 can be configured to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more components within the mobile terminal 100, or can be used to transfer data between the mobile terminal and an external device.
  • the interface unit 170 may serve as a path through which power is supplied from the base to the mobile terminal 100, or as a path through which various command signals input from the base are transmitted to the mobile terminal.
  • Various command signals or power input from the base can be used as signals for identifying whether the mobile terminal is accurately mounted on the base.
  • Output unit 150 is configured to provide an output signal (eg, an audio signal, a video signal, an alarm signal, a vibration signal, etc.) in a visual, audio, and/or tactile manner.
  • the output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
  • the display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 can display a user interface (UI) or a graphical user interface (GUI) related to a call or other communication (eg, text messaging, multimedia file download, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or image and related functions, and the like.
  • the display unit 151 can function as an input device and an output device.
  • the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like.
  • Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as a transparent display, and a typical transparent display may be, for example, a TOLED (Transparent Organic Light Emitting Diode) display or the like.
  • the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown).
  • the touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
  • the audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound when the mobile terminal is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, or the like.
  • the audio output module 152 can provide audio output (eg, call signal reception sound, message reception sound, etc.) associated with a particular function performed by the mobile terminal 100.
  • the audio output module 152 can include a speaker, a buzzer, and the like.
  • the alarm unit 153 can provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alert unit 153 can provide an output in a different manner to notify of the occurrence of an event. For example, the alarm unit 153 can provide an output in the form of vibrations, and when a call, message, or some other incoming communication is received, the alarm unit 153 can provide a tactile output (ie, vibration) to notify the user of it. By providing such a tactile output, the user is able to recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide an output of the notification event occurrence via the display unit 151 or the audio output module 152.
  • the memory 160 may store a software program or the like that performs processing and control operations performed by the controller 180, or may temporarily store data (for example, a phone book, a message, a still image, a video, and the like) that has been output or is to be output. Moreover, the memory 160 can store data regarding vibrations and audio signals of various manners that are output when a touch is applied to the touch screen.
  • the memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card type memory (eg, SD or DX memory, etc.), a random access memory (RAM), a static random access memory ( SRAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), programmable read only memory (PROM), magnetic memory, magnetic disk, optical disk, and the like.
  • the mobile terminal 100 may cooperate, through a network connection, with a network storage device that performs the storage function of the memory 160.
  • the controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing associated with voice calls, data communications, video calls, and the like. Additionally, the controller 180 can include a multimedia module 1810 for reproducing (or playing back) multimedia data, which can be constructed within the controller 180 or can be configured to be separate from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
  • the power supply unit 190 receives external power or internal power under the control of the controller 180 and provides appropriate power required to operate the various components and components.
  • the various embodiments described herein can be implemented in a computer readable medium using, for example, computer software, hardware, or any combination thereof.
• the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and electronic units designed to perform the functions described herein. In some cases, such an embodiment may be implemented in the controller 180.
  • implementations such as procedures or functions may be implemented with separate software modules that permit the execution of at least one function or operation.
• the software code can be implemented by a software application (or program) written in any suitable programming language, which can be stored in the memory 160 and executed by the controller 180.
  • the mobile terminal has been described in terms of its function.
• a slide type mobile terminal among various types of mobile terminals such as folding, bar, swing, and slide types will be described as an example. However, the present invention can be applied to any type of mobile terminal and is not limited to the slide type.
  • the mobile terminal 100 as shown in FIG. 1 may be configured to operate using a communication system such as a wired and wireless communication system and a satellite-based communication system that transmits data via frames or packets.
  • Such communication systems may use different air interfaces and/or physical layers.
• the air interfaces used include, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), Global System for Mobile Communications (GSM), and the like.
  • the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
  • a CDMA wireless communication system can include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, a base station controller (BSC) 275, and a mobile switching center (MSC) 280.
  • the MSC 280 is configured to interface with a public switched telephone network (PSTN) 290.
  • the MSC 280 is also configured to interface with a BSC 275 that can be coupled to the base station 270 via a backhaul line.
• the backhaul line can be constructed in accordance with any of a number of known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be appreciated that the system as shown in FIG. 2 may include multiple BSCs 275.
  • Each BS 270 can serve one or more partitions (or regions), each of which is covered by a multi-directional antenna or an antenna directed to a particular direction radially away from the BS 270. Alternatively, each partition may be covered by two or more antennas for diversity reception. Each BS 270 can be configured to support multiple frequency allocations, and each frequency allocation has a particular frequency spectrum (eg, 1.25 MHz, 5 MHz, etc.).
  • BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or other equivalent terminology.
• the term "base station" can be used to generally refer to a single BSC 275 and at least one BS 270. A base station can also be referred to as a "cell station." Alternatively, each partition of a particular BS 270 may be referred to as a cell station.
  • a broadcast transmitter (BT) 295 transmits a broadcast signal to the mobile terminal 100 operating within the system.
  • a broadcast receiving module 111 as shown in FIG. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295.
• a Global Positioning System (GPS) satellite 300 helps locate at least one of the plurality of mobile terminals 100.
  • a plurality of satellites 300 are depicted, but it is understood that useful positioning information can be obtained using any number of satellites.
• the GPS module 115 as shown in FIG. 1 is typically configured to cooperate with the satellite 300 to obtain desired positioning information. Instead of, or in addition to, GPS tracking technology, other techniques that can track the location of the mobile terminal can be used. Additionally, at least one GPS satellite 300 can selectively or additionally process satellite DMB transmissions.
  • BS 270 receives reverse link signals from various mobile terminals 100.
  • Mobile terminal 100 typically participates in calls, messaging, and other types of communications.
  • Each reverse link signal received by a particular base station 270 is processed within a particular BS 270.
  • the obtained data is forwarded to the relevant BSC 275.
  • the BSC provides call resource allocation and coordinated mobility management functions including a soft handoff procedure between the BSs 270.
  • the BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290.
  • PSTN 290 interfaces with MSC 280, which forms an interface with BSC 275, and BSC 275 controls BS 270 accordingly to transmit forward link signals to mobile terminal 100.
  • the touch screen of the conventional mobile terminal is divided into a touchable operation area (hereinafter referred to as A' area) and a physical key area (hereinafter referred to as B area).
  • A' area is a touchable operation area for detecting touch point coordinates
  • B area is a physical key area for detecting a menu key, a Home key, a return key, and the like.
  • the embodiment of the present invention proposes a new touch screen division method, and implements a new touch operation method, which is particularly suitable for a narrow border or a borderless mobile terminal.
• the A′ area of the mobile terminal is divided into two partitions: one partition is a virtual border area located at the edge of the screen (hereinafter referred to as the C area), and the other is an ordinary partition as in the prior art (hereinafter referred to as the A area); each partition is assigned a virtual input device. When a touch event is sensed, the partition in which the touch event occurred is determined: if it occurred in the C area, the touch event is reported through the virtual input device corresponding to the C area; if it occurred in the A area, it is reported through the virtual input device corresponding to the A area. Finally, the mobile terminal performs special processing on touch events reported by the virtual input device corresponding to the C area, while touch events reported by the virtual input device corresponding to the A area are processed normally, as in the prior art.
• the special processing of a C-area touch event can be understood as handling it in a manner different from the normal processing of the A area, such as ignoring it, generating special effects, switching functions, adjusting parameters, or other custom processing.
  • FIG. 4 illustrates a touch operation method of the mobile terminal of the present invention, including the following steps:
• Step 401 Divide the touchable area of the touch screen of the mobile terminal into two partitions, the two partitions including a first virtual border partition and a second virtual border partition. The first virtual border partition and the second virtual border partition may be: a first virtual border partition located at one side or both side edges of the touch screen, that is, the C area, and a second virtual border partition consisting of the remaining touchable operation area on the touch screen other than the C area, that is, the A area.
• the first type is a fixed division mode: the position and size (such as width, length, etc.) of the C area are fixedly set at driver initialization. After the C area is set, the remaining touchable area on the touch screen is the A area.
  • the setting of the C area is preferably as shown in FIG. 5, and is disposed at an edge position of the touch screen, and the width is narrow, so as to avoid affecting the touch operation of the A area;
  • the A area includes an A0 area and an A1 area, wherein the A0 area is an operable area for detecting touch point coordinates, and the A1 area is a virtual key area for detecting a menu key, a Home key, and a return key.
  • zone C is located at the edge of the touch screen and is located on both sides of zone A.
  • the C zone can be set as needed in any other area that is likely to cause erroneous operation.
  • the second type is a free division mode: setting a virtual border area setting interface in the driver layer; and setting an interface in the application layer by calling the virtual border area to create or modify the number, position and size of the virtual border area.
  • the width, height, and position of the C area can be customized, for example, specifically set by a user.
• the embodiment may further include: providing a virtual border area setting interface through which the number of virtual border areas and the position and size of each virtual border area are respectively set.
  • the setting may be created or modified.
  • the virtual border area setting interface may be used to create or modify the number, location, and size of the virtual border area applicable to the current application scene.
• Step 402 When the touch screen driver is initialized, two virtual input devices (defined as input0 and input1) are allocated by input_allocate_device(), and both are registered by input_register_device(), where input0 corresponds to the C area and input1 corresponds to the A area.
• the method provided in this embodiment may include: after the two virtual input devices corresponding to the first and second virtual border partitions are registered, the upper layer identifies, according to the naming of the virtual input device reported by the driver layer, whether the current user touch area is the first virtual border partition or the second virtual border partition, that is, the C area or the A area; further, based on the touched partition, the upper layer determines which processing manner to perform, as introduced in the subsequent steps.
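As an illustrative sketch only (the struct, enum, and handler names are assumptions, not part of the patent), the upper layer's naming-based dispatch described above might be modeled in user-space C as:

```c
#include <string.h>

/* Illustrative event: records which virtual input device reported it. */
typedef struct {
    const char *device;  /* "input0" (C area) or "input1" (A area) */
    int x, y;
} touch_event;

enum handling { HANDLE_SPECIAL, HANDLE_NORMAL, HANDLE_UNKNOWN };

/* The upper layer looks only at the device name reported by the driver
 * layer; it never re-checks the partition geometry itself. */
enum handling classify_by_device(const touch_event *ev)
{
    if (strcmp(ev->device, "input0") == 0)
        return HANDLE_SPECIAL;   /* C-area event: special processing */
    if (strcmp(ev->device, "input1") == 0)
        return HANDLE_NORMAL;    /* A-area event: normal processing  */
    return HANDLE_UNKNOWN;
}
```

This mirrors the division of labor the text describes: geometry decisions stay in the driver layer, and the framework layer branches purely on the reporting device's name.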
  • the upper layer of the present invention generally refers to a framework layer, an application layer, and the like.
• a customized system such as Android or iOS usually includes an underlying layer (physical layer, driver layer) and an upper layer (framework layer, application layer). The direction of signal flow is: the physical layer (touch panel) receives the user's touch operation, the physical press is converted into an electrical signal by the TP, and the signal is transmitted to the driver layer; the driver layer analyzes the pressed position to obtain parameters such as the specific coordinates, duration, and pressure of the touch point, and uploads them to the framework layer.
  • the communication between the framework layer and the driver layer can be realized through the corresponding interface.
• the framework layer receives events from the driver layer's input devices and parses them, choosing to respond or not respond to each input device; valid input is passed up to the specific application concerned, so that the application layer can perform different application operations according to different events.
  • Step 403 Sense a touch event concurrent with the contact.
  • the mobile terminal can sense the touch event through the driver layer.
  • Step 404 Determine that the touch event occurs in the C area or the A area.
  • the touch event is usually an operation event such as clicking and sliding.
• Each touch event is composed of one or more contacts; therefore, the mobile terminal can determine whether the touch event occurs in the C area or the A area by detecting the area in which the contacts of the touch event fall.
  • the driving layer of the mobile terminal acquires the coordinates of the contact of the touch event, and determines which partition the coordinate of the contact falls into.
• when the coordinates of the contact fall into the C area, it is determined that the touch event occurred in the C area; when the coordinates of the contact do not fall into the C area but into the A area, it is determined that the touch event occurred in the A area.
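The driver-layer partition check of steps 403–404 can be sketched as follows; the layout struct, function name, and concrete widths are hypothetical, assuming C areas at both screen edges:

```c
#include <stdbool.h>

/* Illustrative geometry: screen width W, C areas of width CW1 (left)
 * and CW2 (right) at the edges; everything else is the A area. */
typedef struct {
    int screen_width;   /* W   */
    int left_c_width;   /* CW1 */
    int right_c_width;  /* CW2 */
} zone_layout;

/* Returns true if the contact's X coordinate falls in a C (virtual
 * border) area, false if it falls in the ordinary A area. */
bool contact_in_c_area(const zone_layout *lay, int x)
{
    if (x > 0 && x < lay->left_c_width)
        return true;                        /* left-edge C area  */
    if (x > lay->screen_width - lay->right_c_width && x < lay->screen_width)
        return true;                        /* right-edge C area */
    return false;                           /* A area            */
}
```

A one-sided C area is just the same check with one of the widths set to zero.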
  • Step 405 The touch event is reported by the virtual input device corresponding to the touch event occurrence area.
• when the C area senses a touch event concurrent with a contact, the touch event is reported to the upper layer through the virtual input device input0; when the A area senses a touch event concurrent with a contact, the touch event is reported to the upper layer through the virtual input device input1.
  • Step 406 Perform a preset special processing operation for the touch event of the C area, and perform a normal processing operation for the touch event of the A area.
• the touch area corresponding to the reported event is first identified according to the naming of the input device. In the above step, when the driver kernel identifies a touch in the C area, the input device reported by the driver layer to the framework layer is input0 rather than input1; that is, the framework layer does not need to determine which partition the current contact is in, nor judge the size and position of the partitions. These judgment operations are completed at the driver layer, and in addition to which input device reports, the driver layer also reports the parameters of the touch point to the framework layer, for example, press time, position coordinates, and pressure.
• the framework layer reports to the application layer through a single-channel to multi-channel mechanism. Specifically, a channel is first registered, the reported event is transmitted through the channel, and the event is monitored by a listener; the event is then transmitted to the corresponding application modules through different channels to generate different application operations. The application modules include common applications such as the camera and contacts, and different application operations are generated: for example, in the camera application, when the user clicks in the C area, operations such as focusing, shooting, and adjusting camera parameters are generated. Before the reported event is delivered to the listener, it travels over a single channel; after the listener has monitored it, the reported event is distributed over multiple channels that exist simultaneously. The advantage is that the event can be transmitted to different application modules at the same time, with each application module generating its own response operation.
• the specific implementation of the foregoing steps is: define the classes and implementations of the A area and the C area in an object-oriented manner; after determining that the touch is in the C area, convert the coordinates of the different resolutions into LCD coordinates through the EventHub function; and define single-channel functions (such as serverchannel and clientchannel), whose role is to transmit the event through the channel to the event manager (TouchEventManager). Through the listener's monitoring, the event is transmitted simultaneously or one by one to multiple responding application modules, or to only one of them; the application modules include the camera, the gallery, and the like, and different application modules generate corresponding operations.
  • the specific implementation of the foregoing steps may also be implemented in other manners, which is not limited by the embodiment of the present invention.
• the touch operation flow of the present invention will be further described in another manner. The virtual border area is referred to simply as the C area and the other area as the A area. The touch event reporting process is as follows:
  • the driving layer receives the touch event through physical hardware such as a touch screen, and determines whether the touch operation occurs in the A area or the C area, and reports the event through the device file node of the A area or the C area.
• the Native layer reads events from the device files of the A and C areas and processes them, performing, for example, coordinate calculations. The devices of the A and C areas are distinguished by device ID, and finally A-area and C-area events are distributed.
• the A-area events take the original path and are processed in the usual way; the C-area events are distributed from the dedicated C-area channel registered in advance in the Native layer, entering at the Native port and output at the system port to the C-area system service, which then reports them to each application through the C-area external interface.
  • the present invention also provides a sliding recognition method for the C area, including the steps of:
• Step 1101 When the C area senses the initial moment of a touch event concurrent with a contact, the initial coordinate position (downX, downY) and initial press time (downTime) of the contact are reported to the upper layer through the virtual input device input0; during the movement of the contact, its current coordinate position (currentX, currentY) is reported to the upper layer in real time according to a preset period. The upper layer records this information as the basis for subsequent sliding determination. As shown in FIG. 12, the middle portion of the touch screen is the A area, the narrow strips on the left and right sides are the C area, and the gray dots represent contacts in the C area.
  • Step 1102 The upper layer determines whether the touch event is a sliding event according to the initial coordinate position of the contact and the current coordinate position information, and if yes, proceeds to the next step.
  • the reporting period of the virtual input device input0 can be set to a shorter time value, such as 1/85 second.
  • the specific method for determining whether the touch event concurrent with the contact is a sliding event is: determining a moving distance between the current position of the contact and the initial position; if the moving distance exceeds a preset threshold, determining that the touch event is Sliding event, otherwise, it is determined that the touch event is not a sliding event.
  • Step 1103 For the sliding event, determine the direction attribute according to the ordinate change information of the contact.
  • the method for determining the sliding direction of the contact is specifically: comparing the current position of the contact with the Y-axis coordinate value of the initial position, and if currentY>downY, determining that the sliding direction of the contact is downward, otherwise determining the sliding The direction is up.
  • the direction attribute is not limited to up or down, and may be a single reciprocating, multiple reciprocating, etc., and the judging method may be implemented based on the change trajectory of the Y-axis coordinate value of the contact.
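The sliding determination (distance threshold) and direction judgment (Y-trajectory comparison) described above can be sketched as two small helpers; the function names are assumptions, and squared distances are compared purely to keep the sketch integer-only:

```c
#include <stdbool.h>

enum slide_dir { SLIDE_NONE, SLIDE_UP, SLIDE_DOWN };

/* A touch becomes a sliding event once the contact has moved farther
 * than `threshold` pixels from its initial (down) position. */
bool is_sliding_event(int downX, int downY, int currentX, int currentY,
                      int threshold)
{
    long long dx = currentX - downX;
    long long dy = currentY - downY;
    /* Compare squared distances to avoid floating point. */
    return dx * dx + dy * dy > (long long)threshold * threshold;
}

/* Screen Y grows downward, so currentY > downY means a downward slide,
 * matching the text's rule. */
enum slide_dir slide_direction(int downY, int currentY)
{
    if (currentY > downY)
        return SLIDE_DOWN;
    if (currentY < downY)
        return SLIDE_UP;
    return SLIDE_NONE;
}
```

Richer direction attributes (single or multiple reciprocation) would track the full sequence of Y values rather than only the endpoints.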
  • the C area can be divided only at one side edge position of the touchable operation area of the touch screen, or the C area can be separately divided on both side edges.
• special handling of touch events may be set separately for the two first virtual border partitions (the C areas) located on the left and right sides, to improve ease of operation. For this reason, it is necessary to identify whether a touch event occurs in the left C area or the right C area.
  • the present invention provides a method for identifying a touch event in different virtual border regions, including the steps of:
• Step 1301 When the C area senses a touch event concurrent with a contact, the coordinate position (currentX, currentY) of the contact is periodically reported to the upper layer through the virtual input device input0; the upper layer records the coordinate information as a basis for subsequent judgment.
  • Step 1302 The frame layer or the application layer determines the location of the contact according to the X-axis coordinate value of the contact, the position of the left C-zone portion, and the right C-zone portion.
• the specific judgment mode is: if the X-axis coordinate of the contact satisfies 0 < currentX < CW1, it is determined that the touch event concurrent with the contact occurred in the left C area; if the X-axis coordinate of the contact satisfies (W - CW2) < currentX < W, it is determined that the touch event concurrent with the contact occurred in the right C area.
• where W is the width of the screen, CW1 is the width of the left C area, and CW2 is the width of the right C area; CW1 and CW2 may be the same or different.
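The left/right inequalities above translate directly into code. This is a sketch only, with the enum and function names assumed:

```c
enum c_side { C_NONE, C_LEFT, C_RIGHT };

/* W: screen width; CW1/CW2: widths of the left/right C areas.
 * Implements:  0 < currentX < CW1        -> left C area
 *              (W - CW2) < currentX < W  -> right C area */
enum c_side locate_c_side(int currentX, int W, int CW1, int CW2)
{
    if (currentX > 0 && currentX < CW1)
        return C_LEFT;
    if (currentX > W - CW2 && currentX < W)
        return C_RIGHT;
    return C_NONE;   /* contact is outside both C areas */
}
```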
• FIG. 15 illustrates a method for realizing function adjustment by using the virtual border areas, applicable to the case where both side edges of the touch screen are divided into virtual border areas (for distinction, hereinafter referred to as the left C area and the right C area), including the steps of:
  • Step 1501 Set the left C area as the function switching area and the right side C area as the function parameter adjustment area; in the various application scenarios, the left side C area can be switched to display the function items applicable to the current application scene. And the function parameter adjustment control corresponding to the right side C area when each function item is active.
• the left C area can be switched to display multiple function items suitable for the current application scenario. The function item currently displayed in the left C area is set to the active state, and the right C area correspondingly displays that function item's parameter adjustment control; the other function items not currently displayed in the left C area are set to the hidden state, and the right C area does not display their corresponding parameter adjustment controls.
  • Step 1502 Receive a touch event concurrent with the contact.
  • Step 1503 Determine whether the touch event is a sliding event, and if yes, proceed to the next step.
  • the specific judgment method is as shown in FIG.
  • Step 1504 Determine the area of the sliding event, if it is located in the left C area, go to step 1505; if it is in the right side C area, go to step 1506.
  • the specific judgment method is as shown in FIG.
• Step 1505 Determine the direction attribute of the sliding event. If it is an upward slide, switch the function item currently displayed in the left C area to the previous function item and simultaneously update the right C area to display the corresponding function parameter adjustment control; if it is a downward slide, switch the function item currently displayed in the left C area to the next function item and simultaneously update the right C area to display the corresponding function parameter adjustment control. Then jump to step 1507.
  • the function item can be switched by sliding on the left side C area.
• Step 1506 Determine the direction attribute of the sliding event. If it is an upward slide, the function parameter corresponding to the current function item is adjusted upward (or downward) from its initial value; if it is a downward slide, the current function parameter is adjusted downward (or upward) from its initial value.
  • the display contents of the left side C area and the right side C area are preferably restored to the default item, that is, the display content at the end of the last application is not recorded.
• the user can switch among the function items of the current application scenario by sliding up and down in the left C area to select the function item to be adjusted, and can adjust the function parameters of the currently selected function item by sliding up and down in the right C area, which is convenient and quick. The method is described below by way of an example.
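A minimal model of this left-switch/right-adjust interaction (steps 1501–1506) might look as follows; the item list, step size, and all names are assumptions chosen to match the image-processing example below:

```c
#include <stddef.h>

#define N_ITEMS 4

/* Hypothetical function items for an image-processing scenario. */
static const char *items[N_ITEMS] = {
    "zoom", "contrast", "blur", "brightness"
};

typedef struct {
    int current;           /* index of item shown in the left C area */
    int params[N_ITEMS];   /* one parameter value per function item  */
} adjust_state;

/* Upward slide in the left C area -> previous item; downward -> next.
 * Wraps around the item list. */
void left_c_slide(adjust_state *st, int upward)
{
    if (upward)
        st->current = (st->current + N_ITEMS - 1) % N_ITEMS;
    else
        st->current = (st->current + 1) % N_ITEMS;
}

/* Upward slide in the right C area raises the current item's
 * parameter by one step; downward lowers it. */
void right_c_slide(adjust_state *st, int upward)
{
    st->params[st->current] += upward ? 1 : -1;
}
```

In a real implementation the right-area update (step 1505) would also redraw the parameter control for `items[st->current]`; here the index switch stands in for that UI refresh.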
  • the image processing function items may include: a zoom processing function, a contrast adjustment function, a blur processing function, a brightness adjustment function, and the like.
  • the method of realizing the image processing function by using the C area is:
  • the left area C is displayed as the default function item, such as the zoom processing function.
  • the embodiment further provides a mobile terminal, including:
  • the C area dividing unit 1610 is configured to divide the left side C area and the right side C area on both side edges of the touch screen.
• the unit includes: a C-area fixed dividing unit 1611, configured to define the position and size of the virtual border area during driver initialization to implement the fixed dividing manner; and a C-area setting interface 1612 for creating and modifying the number, position, and size of virtual border areas, called by the upper layer to implement the custom dividing manner.
• the function setting unit 1620 is configured to preset, at initialization, the function items switchably displayed in the left C area for the current application scenario, and the function parameter adjustment control correspondingly displayed in the right C area when each function item is in the displayed state.
  • the bottom layer reporting unit 1630 is configured to report coordinate position information of the contact in real time when sensing a touch event concurrent with the contact.
  • the slide recognition unit 1640 is configured to determine, according to the coordinate position information of the contact reported by the bottom layer reporting unit 1630, whether the touch event belongs to a sliding event, and if yes, further determine a direction attribute of the sliding event and a position of the position area.
  • the function switching unit 1650 is configured to sequentially switch the currently displayed function items in the left C area to the previous or next according to the direction attribute when determining that the sliding event occurs in the left C area, and simultaneously control the right C area update. Displayed as the corresponding function parameter adjustment control.
  • the parameter adjustment unit 1660 is configured to increase or decrease the current function parameter from the initial value according to the direction attribute when determining that the sliding event occurs in the right side C area.
  • the sliding identification unit 1640 includes:
  • a recording module 1641 configured to record coordinate position information of the contact
• the sliding event judging module 1642 is configured to calculate the moving distance of the contact from its initial coordinate position and current coordinate position, and to determine whether the touch event is a sliding event by comparing the moving distance with a preset threshold;
  • the sliding direction determining module 1643 is configured to determine a direction attribute of the sliding event by comparing coordinate values of the contact in the vertical direction of the initial coordinate position and the current coordinate position;
• the event area determining module 1644 is configured to determine, according to the horizontal coordinate value of the contact and the position and size information of the left and right C areas, whether the sliding event occurred in the left C area or the right C area.
  • the unilateral up and down setting mode can be used to achieve the following applications:
• Multi-task switching function By sliding up in the C area, the display interface of the A area is switched from the current application interface to the previous application interface; by sliding down in the C area, it is switched from the current application interface to the next application interface, making it convenient for users to switch applications.
• Multi-task thumbnail switching function When sliding up and down in the C area, thumbnails of the applications currently running in the background are displayed in sequence in the C area, and the corresponding application is launched when the finger is lifted.
• when the bilateral up-and-down sliding mode is adopted, the following functions can be realized:
• Brightness adjustment function Increase the screen brightness by sliding up in one side's C area, and lower it by sliding down in the other side's C area; the user neither needs to adjust via physical buttons nor enter the settings interface.
• Volume adjustment function Turn up the volume by sliding up in one side's C area, and turn it down by sliding down in the other side's C area.
  • Show hidden content Show hidden content (such as apps, images, files, text messages, etc.) when sliding up, and hide related content when sliding down.
• Mobile phone acceleration function Activated when back-and-forth up-and-down sliding reaches a threshold number of repetitions; it cleans up background applications to release memory and notifies the user of the result when processing ends. This is suitable for game players and the like.
• the foregoing embodiment methods can be implemented by means of software plus a necessary general hardware platform, and of course also by hardware, though in many cases the former is the better implementation. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc), including a number of instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the various embodiments of the present invention.

Abstract

The present invention discloses a mobile terminal and a method for realizing function adjustment using a virtual frame region thereof. The method includes: sensing a touch event concurrent with a contact point, and determining whether the touch event is a sliding event; if so, further determining the direction attribute of the sliding event and the region in which it occurs; if the sliding event occurs in a first virtual frame partition, switching the function item currently displayed in the first virtual frame partition according to the direction attribute; and if the sliding event occurs in a second virtual frame partition, adjusting the parameters of the current function item according to the direction attribute. The present invention sets the left and right parts of the virtual frame region as a function switching area and a function parameter adjustment area respectively, so that function items can be switched by sliding in the function switching area and parameter values can be adjusted by sliding in the function parameter adjustment area, greatly improving the user experience.

Description

移动终端及其利用虚拟边框区域实现功能调节的方法 技术领域
本发明涉及通信技术领域,尤其涉及一种移动终端及其利用虚拟边框区域实现功能调节的方法。
背景技术
随着终端设备,如手机、个人数码助理(Personal Digital Assistant,PDA)等设备的内置存储器容量的扩大、操作装置功能的日益强大,终端设备中可以开发安装的应用程序越来越多,功能越来越丰富。虽然终端设备变成了一种强大的数据处理工具,但是由于需要处理的数据太多、功能太多,给用户也带来了很多困扰。比如,对图片的效果处理,需要选择很多滤镜效果,才能达到自己理想的效果,整个操作过程非常的复杂;还比如;对一些相机的操作,有时候很难快速找到想调节的功能项,而且不同功能之间的切换也会比较麻烦。因而,目前的终端设备虽然从功能上满足了很多用户的需求;但是从用户体验上来说,没有一种非常方便的交互方式,不能给用户带来快捷的操作体验。
发明内容
本发明的主要目的在于提出一种移动终端及其利用虚拟边框区域实现功能调节的方法,解决现有方式中功能操作程序繁琐、不便利的缺陷。
为实现上述目的,本发明提供了一种利用虚拟边框区域实现功能调节的方法,所述虚拟边框区域包括第一虚拟边框分区和第二虚拟边框分区,所述方法包括步骤:
感测与触点并发的触控事件,判断所述触控事件是否属于滑动事件;若是,则进一步判断该滑动事件的方向属性以及所处区域位置;
若所述滑动事件发生于第一虚拟边框分区内，则根据其方向属性将第一虚拟边框分区内当前显示的功能项进行切换；若所述滑动事件发生于第二虚拟边框分区内，则根据其方向属性调节当前的功能项的功能参数。
其中,判断所述触控事件是否属于滑动事件的方法具体为:
根据所述触点的初始坐标位置和当前坐标位置计算触点的移动距离;若该移动距离超过预设阈值,则判定所述触控事件属于滑动事件,否则,判定所述触控事件不属于滑动事件。
其中,判断所述滑动事件的方向属性的方法具体为:
通过比较触点在初始坐标位置和当前坐标位置的竖直方向的坐标值判定所述滑动事件的方向属性。
其中，判断所述滑动事件所处区域位置的方法为：
若该滑动事件的触点的X轴坐标值currentX满足0<currentX<CW1的条件,则判定该滑动事件发生在位于触摸屏左侧边缘的第一虚拟边框分区内;
若触点的X轴坐标值currentX满足(W-CW2)<currentX<W,则判定该滑动事件发生在位于触摸屏右侧边缘的第二虚拟分区;
其中,所述W为屏幕的宽度、CW1为第一虚拟边框分区的宽度,CW2是第二虚拟分区的宽度。
其中,所述方法还包括:
将移动终端的触摸屏的可触摸区域划分为第一虚拟边框分区和第二虚拟边框分区。
其中，所述第一虚拟边框分区和第二虚拟边框分区分别为：位于触摸屏一侧或者两侧边缘位置的第一虚拟边框分区，以及触摸屏上除第一虚拟边框分区以外的剩余可触摸操作区域为第二虚拟边框分区。
其中,当触摸屏的可触摸操作区域的两侧边缘均划分有第一虚拟边框分区时,对于位于左右两侧的两个第一虚拟边框分区,分别设定其触控事件对应不同的特殊处理方式。
其中,还包括采用固定划分方式于触摸屏上划分所述虚拟边框区域的步骤:
在驱动初始化时,定义所述虚拟边框区域的位置及尺寸。
其中,还包括采用自由设定方式于触摸屏上划分所述虚拟边框区域的步骤:
设置虚拟边框区域设置接口;通过调用所述虚拟边框区域设置接口以创建或修改所述虚拟边框区域的数量、位置及大小。
其中,所述方法还包括:
在注册好该第一虚拟边框分区或第二虚拟边框分区对应的两个虚拟输入设备后;
根据虚拟输入设备的命名,以识别出当前用户触摸区域是第一虚拟边框分区或第二虚拟边框分区;
基于不同的触摸分区,确定执行不同的处理方式。
为此,本发明还提供了一种移动终端,其触摸屏上划分第一虚拟边框分区和第二虚拟边框分区,所述移动终端包括:
底层上报单元,用于在感测到与触点并发的触控事件时,实时上报该触点的坐标位置信息;
滑动识别单元，用于根据底层上报单元所上报的触点的坐标位置信息，判断所述触控事件是否属于滑动事件，若是则进一步判断该滑动事件的方向属性及所处区域位置；
功能切换单元,配置为在判定所述滑动事件发生于第一虚拟边框分区时,根据其方向属性将第一虚拟边框分区内当前显示的功能项进行切换,同时控制第二虚拟边框分区更新显示为对应的功能参数调节控件;
参数调节单元,配置为在判定所述滑动事件发生于第二虚拟边框分区时,根据其方向属性调节当前的功能项的功能参数。
其中,所述滑动识别单元进一步包括:
滑动方向判断模块,配置为通过比较触点在初始坐标位置和当前坐标位置的竖直方向的坐标值判定所述滑动事件的方向属性。
其中,所述滑动识别单元进一步包括:
事件区域判断模块,配置为根据所述触点的水平方向的坐标值、所述第一虚拟边框分区和第二虚拟边框分区的位置及尺寸信息,判断所述滑动事件发生于第一虚拟边框分区还是第二虚拟边框分区。
其中,所述移动终端还包括:
虚拟边框区域固定划分单元,配置为将移动终端的触摸屏的可触摸区域划分为第一虚拟边框分区和第二虚拟边框分区。
其中，所述第一虚拟边框分区和第二虚拟边框分区分别为：位于触摸屏一侧或者两侧边缘位置的第一虚拟边框分区，以及触摸屏上除第一虚拟边框分区以外的剩余可触摸操作区域为第二虚拟边框分区。
其中,虚拟边框区域固定划分单元,配置为当触摸屏的可触摸操作区域的两侧边缘均划分有第一虚拟边框分区时,对于位于左右两侧的两个第一虚拟边框分区,分别设定其触控事件对应不同的特殊处理方式。
其中,所述虚拟边框区域固定划分单元,配置为在驱动初始化时,定义所述虚拟边框区域的位置及尺寸。
其中,所述移动终端还包括:
虚拟边框区域设置接口,用于创建及修改虚拟边框区域的数量、位置及大小。
其中,虚拟边框区域固定划分单元,配置为在注册好该第一虚拟边框分区或第二虚拟边框分区对应的两个虚拟输入设备;
滑动识别单元,配置为根据虚拟输入设备的命名,以识别出当前用户触摸区域是第一虚拟边框分区或第二虚拟边框分区;
参数调节单元,配置为基于不同的触摸分区,确定执行不同的处理方式。
本发明对虚拟边框区域的左侧部分和右侧部分分别设置为功能切换区和功能参数调节区，可通过在功能切换区滑动以切换当前可调的功能项，通过在功能参数调节区滑动以调整当前功能项的参数值，大大方便了用户快速查找所需调整的功能项以及快速调整具体参数值，简化了操作程序，大大提升了用户使用体验。
附图说明
图1为实现本发明各个实施例的移动终端的硬件结构示意图;
图2为如图1所示的移动终端的无线通信系统示意图;
图3为传统的移动终端的触摸屏划分方式示意图;
图4为本发明实施例中移动终端的触控操作方法流程图;
图5为本发明实施例中采用固定方式划分C区的一种示意图;
图6为本发明实施例中采用固定方式划分C区的另一种示意图;
图7为本发明实施例中采用自由设定方式划分C区的示意图;
图8为本发明实施例中在系统桌面下触摸屏的显示效果示意图;
图9为本发明实施例中在相机应用场景下触摸屏的显示效果示意图;
图10为本发明实施例中C区事件处理系统框架图;
图11为本发明实施例中C区滑动识别方法流程图;
图12为本发明实施例中C区触点移动示意图；
图13为本发明实施例中不同虚拟边框区域内触控事件识别方法流程图；
图14为本发明实施例中C区与A区的尺寸示意图;
图15为本发明实施例中利用虚拟边框区域实现功能调节的方法流程图;
图16为本发明实施例中移动终端的结构示意图。
本发明目的的实现、功能特点及优点将结合实施例,参照附图做进一步说明。
具体实施方式
应当理解,此处所描述的具体实施例仅仅用以解释本发明,并不用于限定本发明。
现在将参考附图描述实现本发明各个实施例的移动终端。在后续的描述中,使用用于表示元件的诸如“模块”、“部件”或“单元”的后缀仅为了有利于本发明的说明,其本身并没有特定的意义。因此,"模块"与"部件"可以混合地使用。
移动终端可以以各种形式来实施。例如,本发明中描述的终端可以包括诸如移动电话、智能电话、笔记本电脑、数字广播接收器、PDA(个人数字助理)、PAD(平板电脑)、PMP(便携式多媒体播放器)、导航装置等等的移动终端以及诸如数字TV、台式计算机等等的固定终端。下面,假设终端是移动终端。然而,本领域技术人员将理解的是,除了特别用于移动目的的元件之外,根据本发明的实施方式的构造也能够应用于固定类型的终端。
图1为实现本发明各个实施例的移动终端的硬件结构示意图。
移动终端100可以包括无线通信单元110、A/V(音频/视频)输入单元120、用户输入单元130、感测单元140、输出单元150、存储器160、接口单元170、控制器180和电源单元190等等。图1示出了具有各种组件的移动终端,但是应理解的是,并不要求实施所有示出的组件。可以替代地实施更多或更少的组件。将在下面详细描述移动终端的元件。
无线通信单元110通常包括一个或多个组件,其允许移动终端100与无线通信系统或网络之间的无线电通信。例如,无线通信单元可以包括广播接收模块111、移动通信模块112、无线互联网模块113、短程通信模块114和位置信息模块115中的至少一个。
广播接收模块111经由广播信道从外部广播管理服务器接收广播信号和/或广播相关信息。广播信道可以包括卫星信道和/或地面信道。广播管理服务器可以是生成并发送广播信号和/或广播相关信息的服务器或者接收之前生成的广播信号和/或广播相关信息并且将其发送给终端的服务器。广播信号可以包括TV广播信号、无线电广播信号、数据广播信号等等。而且,广播信号可以进一步包括与TV或无线电广播信号组合的广播信号。广播相关信息也可以经由移动通信网络提供,并且在该情况下,广播相关信息可以由移动通信模块112来接收。广播信号可以以各种形式存在,例如,其可以以数字多媒体广播(DMB)的电子节目指南(EPG)、数字视频广播手持(DVB-H)的电子服务指南(ESG)等等的形式而存在。广播接收模块111可以通过使用各种类型的广播系统接收信号 广播。特别地,广播接收模块111可以通过使用诸如多媒体广播-地面(DMB-T)、数字多媒体广播-卫星(DMB-S)、数字视频广播-手持(DVB-H),前向链路媒体(MediaFLO@)的数据广播系统、地面数字广播综合服务(ISDB-T)等等的数字广播系统接收数字广播。广播接收模块111可以被构造为适合提供广播信号的各种广播系统以及上述数字广播系统。经由广播接收模块111接收的广播信号和/或广播相关信息可以存储在存储器160(或者其它类型的存储介质)中。
移动通信模块112将无线电信号发送到基站(例如,接入点、节点B等等)、外部终端以及服务器中的至少一个和/或从其接收无线电信号。这样的无线电信号可以包括语音通话信号、视频通话信号、或者根据文本和/或多媒体消息发送和/或接收的各种类型的数据。
无线互联网模块113支持移动终端的无线互联网接入。该模块可以内部或外部地耦接到终端。该模块所涉及的无线互联网接入技术可以包括WLAN(无线LAN)(Wi-Fi)、Wibro(无线宽带)、Wimax(全球微波互联接入)、HSDPA(高速下行链路分组接入)等等。
短程通信模块114是用于支持短程通信的模块。短程通信技术的一些示例包括蓝牙TM、射频识别(RFID)、红外数据协会(IrDA)、超宽带(UWB)、紫蜂TM等等。
位置信息模块115是用于检查或获取移动终端的位置信息的模块。位置信息模块的典型示例是GPS(全球定位系统)。根据当前的技术,GPS模块115计算来自三个或更多卫星的距离信息和准确的时间信息并且对于计算的信息应用三角测量法,从而根据经度、纬度和高度准确地计算三维当前位置信息。当前,用于计算位置和时间信息的方法使用三颗卫星并且通过使用另外的一颗卫星校正计算出的位置和时间信息的误差。此外,GPS模块115能够通过实时地连续计算当前位置信息来计算速度信息。
A/V输入单元120用于接收音频或视频信号。A/V输入单元120可以包括相机121和麦克风122，相机121对在视频捕获模式或图像捕获模式中由图像捕获装置获得的静态图片或视频的图像数据进行处理。处理后的图像帧可以显示在显示单元151上。经相机121处理后的图像帧可以存储在存储器160(或其它存储介质)中或者经由无线通信单元110进行发送，可以根据移动终端的构造提供两个或更多相机121。麦克风122可以在电话通话模式、记录模式、语音识别模式等等运行模式中经由麦克风接收声音(音频数据)，并且能够将这样的声音处理为音频数据。处理后的音频(语音)数据可以在电话通话模式的情况下转换为可经由移动通信模块112发送到移动通信基站的格式输出。麦克风122可以实施各种类型的噪声消除(或抑制)算法以消除(或抑制)在接收和发送音频信号的过程中产生的噪声或者干扰。
用户输入单元130可以根据用户输入的命令生成键输入数据以控制移动终端的各种操作。用户输入单元130允许用户输入各种类型的信息,并且可以包括键盘、锅仔片、触摸板(例如,检测由于被接触而导致的电阻、压力、电容等等的变化的触敏组件)、滚轮、摇杆等等。特别地,当触摸板以层的形式叠加在显示单元151上时,可以形成触摸屏。
感测单元140检测移动终端100的当前状态(例如，移动终端100的打开或关闭状态)、移动终端100的位置、用户对于移动终端100的接触(即，触摸输入)的有无、移动终端100的取向、移动终端100的加速或减速移动和方向等等，并且生成用于控制移动终端100的操作的命令或信号。例如，当移动终端100实施为滑动型移动电话时，感测单元140可以感测该滑动型电话是打开还是关闭。另外，感测单元140能够检测电源单元190是否提供电力或者接口单元170是否与外部装置耦接。感测单元140可以包括接近传感器141，将在下面结合触摸屏来对此进行描述。
接口单元170用作至少一个外部装置与移动终端100连接可以通过的接口。例如,外部装置可以包括有线或无线头戴式耳机端口、外部电源(或电池充电器)端口、有线或无线数据端口、存储卡端口、用于连接具有识别模块的装置的端口、音频输入/输出(I/O)端口、视频I/O端口、耳机端口等等。识别模块可以是 存储用于验证用户使用移动终端100的各种信息并且可以包括用户识别模块(UIM)、客户识别模块(SIM)、通用客户识别模块(USIM)等等。另外,具有识别模块的装置(下面称为"识别装置")可以采取智能卡的形式,因此,识别装置可以经由端口或其它连接装置与移动终端100连接。接口单元170可以用于接收来自外部装置的输入(例如,数据信息、电力等等)并且将接收到的输入传输到移动终端100内的一个或多个元件或者可以用于在移动终端和外部装置之间传输数据。
另外,当移动终端100与外部底座连接时,接口单元170可以用作允许通过其将电力从底座提供到移动终端100的路径或者可以用作允许从底座输入的各种命令信号通过其传输到移动终端的路径。从底座输入的各种命令信号或电力可以用作用于识别移动终端是否准确地安装在底座上的信号。输出单元150被构造为以视觉、音频和/或触觉方式提供输出信号(例如,音频信号、视频信号、警报信号、振动信号等等)。输出单元150可以包括显示单元151、音频输出模块152、警报单元153等等。
显示单元151可以显示在移动终端100中处理的信息。例如,当移动终端100处于电话通话模式时,显示单元151可以显示与通话或其它通信(例如,文本消息收发、多媒体文件下载等等)相关的用户界面(UI)或图形用户界面(GUI)。当移动终端100处于视频通话模式或者图像捕获模式时,显示单元151可以显示捕获的图像和/或接收的图像、示出视频或图像以及相关功能的UI或GUI等等。
同时,当显示单元151和触摸板以层的形式彼此叠加以形成触摸屏时,显示单元151可以用作输入装置和输出装置。显示单元151可以包括液晶显示器(LCD)、薄膜晶体管LCD(TFT-LCD)、有机发光二极管(OLED)显示器、柔性显示器、三维(3D)显示器等等中的至少一种。这些显示器中的一些可以被构造为透明状以允许用户从外部观看,这可以称为透明显示器,典型的透明显示器可以例如为TOLED(透明有机发光二极管)显示器等等。根据特定想要的实施方式, 移动终端100可以包括两个或更多显示单元(或其它显示装置),例如,移动终端可以包括外部显示单元(未示出)和内部显示单元(未示出)。触摸屏可用于检测触摸输入压力以及触摸输入位置和触摸输入面积。
音频输出模块152可以在移动终端处于呼叫信号接收模式、通话模式、记录模式、语音识别模式、广播接收模式等等模式下时,将无线通信单元110接收的或者在存储器160中存储的音频数据转换音频信号并且输出为声音。而且,音频输出模块152可以提供与移动终端100执行的特定功能相关的音频输出(例如,呼叫信号接收声音、消息接收声音等等)。音频输出模块152可以包括扬声器、蜂鸣器等等。
警报单元153可以提供输出以将事件的发生通知给移动终端100。典型的事件可以包括呼叫接收、消息接收、键信号输入、触摸输入等等。除了音频或视频输出之外,警报单元153可以以不同的方式提供输出以通知事件的发生。例如,警报单元153可以以振动的形式提供输出,当接收到呼叫、消息或一些其它进入通信(incoming communication)时,警报单元153可以提供触觉输出(即,振动)以将其通知给用户。通过提供这样的触觉输出,即使在用户的移动电话处于用户的口袋中时,用户也能够识别出各种事件的发生。警报单元153也可以经由显示单元151或音频输出模块152提供通知事件的发生的输出。
存储器160可以存储由控制器180执行的处理和控制操作的软件程序等等,或者可以暂时地存储已经输出或将要输出的数据(例如,电话簿、消息、静态图像、视频等等)。而且,存储器160可以存储关于当触摸施加到触摸屏时输出的各种方式的振动和音频信号的数据。
存储器160可以包括至少一种类型的存储介质,所述存储介质包括闪存、硬盘、多媒体卡、卡型存储器(例如,SD或DX存储器等等)、随机访问存储器(RAM)、静态随机访问存储器(SRAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、可编程只读存储器(PROM)、磁性存储器、磁盘、光盘等等。而且,移动终端100可以与通过网络连接执行存储器160的存储功能的网 络存储装置协作。
控制器180通常控制移动终端的总体操作。例如,控制器180执行与语音通话、数据通信、视频通话等等相关的控制和处理。另外,控制器180可以包括用于再现(或回放)多媒体数据的多媒体模块1810,多媒体模块1810可以构造在控制器180内,或者可以构造为与控制器180分离。控制器180可以执行模式识别处理,以将在触摸屏上执行的手写输入或者图片绘制输入识别为字符或图像。
电源单元190在控制器180的控制下接收外部电力或内部电力并且提供操作各元件和组件所需的适当的电力。
这里描述的各种实施方式可以以使用例如计算机软件、硬件或其任何组合的计算机可读介质来实施。对于硬件实施,这里描述的实施方式可以通过使用特定用途集成电路(ASIC)、数字信号处理器(DSP)、数字信号处理装置(DSPD)、可编程逻辑装置(PLD)、现场可编程门阵列(FPGA)、处理器、控制器、微控制器、微处理器、被设计为执行这里描述的功能的电子单元中的至少一种来实施,在一些情况下,这样的实施方式可以在控制器180中实施。对于软件实施,诸如过程或功能的实施方式可以与允许执行至少一种功能或操作的单独的软件模块来实施。软件代码可以由以任何适当的编程语言编写的软件应用程序(或程序)来实施,软件代码可以存储在存储器160中并且由控制器180执行。
至此,已经按照其功能描述了移动终端。下面,为了简要起见,将描述诸如折叠型、直板型、摆动型、滑动型移动终端等等的各种类型的移动终端中的滑动型移动终端作为示例。因此,本发明能够应用于任何类型的移动终端,并且不限于滑动型移动终端。
如图1中所示的移动终端100可以被构造为利用经由帧或分组发送数据的诸如有线和无线通信系统以及基于卫星的通信系统来操作。
现在将参考图2描述其中根据本发明的移动终端能够操作的通信系统。
这样的通信系统可以使用不同的空中接口和/或物理层。例如,由通信系统 使用的空中接口包括例如频分多址(FDMA)、时分多址(TDMA)、码分多址(CDMA)和通用移动通信系统(UMTS)(特别地,长期演进(LTE))、全球移动通信系统(GSM)等等。作为非限制性示例,下面的描述涉及CDMA通信系统,但是这样的教导同样适用于其它类型的系统。
参考图2，CDMA无线通信系统可以包括多个移动终端100、多个基站(BS)270、基站控制器(BSC)275和移动交换中心(MSC)280。MSC280被构造为与公共电话交换网络(PSTN)290形成接口。MSC280还被构造为与可以经由回程线路耦接到基站270的BSC275形成接口。回程线路可以根据若干已知的接口中的任一种来构造，所述接口包括例如E1/T1、ATM、IP、PPP、帧中继、HDSL、ADSL或xDSL。将理解的是，如图2中所示的系统可以包括多个BSC275。
每个BS270可以服务一个或多个分区(或区域),由多向天线或指向特定方向的天线覆盖的每个分区放射状地远离BS270。或者,每个分区可以由用于分集接收的两个或更多天线覆盖。每个BS270可以被构造为支持多个频率分配,并且每个频率分配具有特定频谱(例如,1.25MHz,5MHz等等)。
分区与频率分配的交叉可以被称为CDMA信道。BS270也可以被称为基站收发器子系统(BTS)或者其它等效术语。在这样的情况下,术语"基站"可以用于笼统地表示单个BSC275和至少一个BS270。基站也可以被称为"蜂窝站"。或者,特定BS270的各分区可以被称为多个蜂窝站。
如图2中所示,广播发射器(BT)295将广播信号发送给在系统内操作的移动终端100。如图1中所示的广播接收模块111被设置在移动终端100处以接收由BT295发送的广播信号。在图2中,示出了几个全球定位系统(GPS)卫星300。卫星300帮助定位多个移动终端100中的至少一个。
在图2中,描绘了多个卫星300,但是理解的是,可以利用任何数目的卫星获得有用的定位信息。如图1中所示的GPS模块115通常被构造为与卫星300配合以获得想要的定位信息。替代GPS跟踪技术或者在GPS跟踪技术之外, 可以使用可以跟踪移动终端的位置的其它技术。另外,至少一个GPS卫星300可以选择性地或者额外地处理卫星DMB传输。
作为无线通信系统的一个典型操作,BS270接收来自各种移动终端100的反向链路信号。移动终端100通常参与通话、消息收发和其它类型的通信。特定基站270接收的每个反向链路信号被在特定BS270内进行处理。获得的数据被转发给相关的BSC275。BSC提供通话资源分配和包括BS270之间的软切换过程的协调的移动管理功能。BSC275还将接收到的数据路由到MSC280,其提供用于与PSTN290形成接口的额外的路由服务。类似地,PSTN290与MSC280形成接口,MSC与BSC275形成接口,并且BSC275相应地控制BS270以将正向链路信号发送到移动终端100。
基于上述移动终端硬件结构以及通信系统,提出本发明方法各个实施例。
如图3所示,传统的移动终端的触摸屏划分为可触摸操作区域(以下简称为A′区)和物理按键区域(以下简称为B区)。其中,A′区为可触摸操作区域,用于检测触摸点坐标;B区为物理按键区域,用于检测菜单键、Home键、返回键等。
基于传统的触摸屏划分方式,本发明实施例提出了新的触摸屏划分方式,实现了新的触控操作方法,特别适用于窄边框或无边框移动终端,首先将移动终端的A′区分割为两个分区,其中一个分区为位于屏幕边缘的虚拟边框区域(以下简称为C区),另一个分区为与现有技术相同的普通分区(以下简称为A区),并为每一个分区分配一个虚拟输入设备;当感测到触控事件时,判断该触控事件发生在哪个分区内,若发生在C区则通过C区所对应的虚拟输入设备上报触控事件,若发生在A区则通过A区所对应的虚拟输入设备上报触控事件;最后,移动终端对C区所对应的虚拟输入设备上报的触控事件进行特殊处理,对A区所对应的虚拟输入设备上报的触控事件像现有技术一样,进行正常处理。
所述对C区的触控事件进行特殊处理可以理解为：对C区的触控事件进行与A区的正常处理方式不同的其他处理方式，如忽略、生成特效、功能切换、参数调节或者自定义的其它处理方式。
请参阅图4,该图示出了本发明移动终端的触控操作方法,包括以下步骤:
步骤401、将移动终端的触摸屏的可触摸区域划分为两个分区,其中,所述两个分区包括第一虚拟边框分区和第二虚拟边框分区,所述第一虚拟边框分区和第二虚拟边框分区可以分别为:位于触摸屏一侧或者两侧边缘位置的第一虚拟边框分区即C区,以及触摸屏上除C区以外的剩余可触摸操作区域第二虚拟边框分区即A区。
本步骤中,C区的划分方式有以下两种:
第一种为固定划分方式：在驱动初始化时固定设置C区的位置和尺寸(如宽度、长度等)，设置好C区后，触摸屏上剩余可触摸区域即为A区。
其中,C区设置优选如图5所示,设于触摸屏的边缘位置,宽度较窄,以免影响A区的触摸操作;
或者,如图6所示,A区包括A0区和A1区,其中A0区为可操作区域,用于检测触摸点坐标,A1区为虚拟按键区域,用于检测菜单键、Home键、返回键等,C区设于触摸屏边缘并位于A区两侧。此外,也可以根据需要将C区设于其它任何容易导致误操作的区域。
第二种为自由划分方式:在驱动层设置虚拟边框区域设置接口;在应用层,通过调用所述虚拟边框区域设置接口以创建或修改所述虚拟边框区域的数量、位置及大小。如图7所示,具体来说,其中C区的宽度、高度及位置均可进行自定义设置,比如具体由用户自定义进行设置。
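上述"虚拟边框区域设置接口"的创建、修改语义可用如下Python草图示意(仅为便于理解的假设性示例，类名、字段与参数值均为假定，并非本发明的实际接口实现)：

```python
class VirtualFrameRegion:
    """虚拟边框区域设置接口的示意：上层可据此创建或修改C区的数量、位置及大小。"""

    def __init__(self):
        self.regions = []                      # 每个元素记录一个C区的位置与尺寸

    def create(self, x, y, width, height):
        """创建一个C区，返回其分区编号。"""
        self.regions.append({"x": x, "y": y, "w": width, "h": height})
        return len(self.regions) - 1

    def modify(self, index, **kwargs):
        """按编号修改已有C区的位置或尺寸。"""
        self.regions[index].update(kwargs)


# 示例：先按默认宽度创建左侧C区，进入相机应用后将其加宽
regions = VirtualFrameRegion()
left = regions.create(0, 0, 40, 1920)          # 假定屏幕高度为1920像素
regions.modify(left, w=80)                     # 相机场景下C区可设置得相对较宽
```

如图8、图9所述，不同应用场景即可通过多次调用该接口得到不同数量与尺寸的C区。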
优选地,针对不同应用场景,本实施例还可以包括:通过调用虚拟边框区域设置接口,分别设置虚拟边框区域的数量、每一个虚拟边框区域的位置以及大小。其中,所述设置可以为创建或者修改,比如,可以为调用虚拟边框区域设置接口分别创建或者修改适用于当前应用场景的虚拟边框区域的数量、位置及大小;
如图8所示，在系统桌面下，因为图标占位较多，两侧的C区宽度设置得相对较窄；如图9所示，当点击相机图标进入相机应用后，可通过上层调用C区设置接口以设置此场景下的C区数量、位置、大小，在不影响对焦的情况下，C区宽度可设置得相对较宽。
步骤402、在触摸屏驱动初始化时,通过input_allocate_device()分配两个虚拟输入设备(分别定义为:input0和input1),并通过input_register_device()注册这两个输入设备,其中input0对应于C区、input1对应于A区。
本实施例提供的方法可以包括:在注册好该第一虚拟边框分区或第二虚拟边框分区对应的两个虚拟输入设备后,上层将根据驱动层上报的虚拟输入设备的命名,以识别出当前用户触摸区域是第一虚拟边框分区或第二虚拟边框分区,具体的为C区或A区;进而基于不同的触摸分区,上层确定执行不同的处理方式,后续步骤中将会介绍。
本发明所述的上层通常指框架(Framework)层、应用层等,在移动终端的系统中,例如android、IOS等定制系统,通常包括底层(物理层,驱动层)以及上层(框架层,应用层),信号流的走向为:物理层(触控面板)接收到用户的触控操作,物理按压转变为电信号TP,将TP传递至驱动层,驱动层对按压的位置进行解析,得到位置点的具体坐标,持续时间,压力等参数,将该参数上传至框架层,框架层与驱动层的通信可通过相应的接口来实现,框架层接收到驱动层的输入设备(input),解析该输入设备,从而选择响应或不响应该输入设备,并将有效的输入向上传递给具体哪一个应用,以满足应用层根据不同的事件执行不同的应用操作。
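上述"根据虚拟输入设备的命名区分触摸分区并执行不同处理"的思路可用如下Python草图示意(input0/input1的命名取自本文，处理函数为假定的示例回调，并非框架层的实际实现)：

```python
def dispatch_touch_event(device_name, event, handlers):
    """框架层根据驱动层上报的输入设备命名选择处理方式。

    input0对应C区(特殊处理)，input1对应A区(正常处理)。
    """
    area = {"input0": "C", "input1": "A"}.get(device_name)
    if area is None:
        raise ValueError("未知输入设备: " + device_name)
    return handlers[area](event)


# 假定的处理回调：C区事件走特殊处理(如功能切换、参数调节)，A区事件走原生流程
handlers = {
    "C": lambda e: ("special", e),
    "A": lambda e: ("normal", e),
}
```

例如，驱动层在C区感测到触控并通过input0上报时，`dispatch_touch_event("input0", event, handlers)`即返回特殊处理结果，框架层无需再判断触点落在哪个分区。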
步骤403、感测与触点并发的触控事件。移动终端可通过驱动层来感测触控事件。
步骤404、判断触控事件发生在C区或A区。
触控事件通常为点击、滑动等操作事件,每一触控事件由一个或多个触点组成,因此移动终端可以通过侦测触控事件的触点落入的区域,来判断触控事件是发生在C区还是A区。
具体实现上,移动终端的驱动层获取触控事件的触点的坐标,判断触点的坐标落入了哪个分区。当触点的坐标落入C区时,则判定触控事件发生在C区内;当触点的坐标没有落入C区,而是落入A区时,则判定触控事件发生在A区内。
步骤405:通过触控事件发生区域所对应的虚拟输入设备上报触控事件。
当C区感测有与触点并发的触控事件时,通过虚拟输入设备input0向上层上报该触控事件;当A区感测到有触点并发的触控事件时,通过虚拟输入设备input1向上层上报该触控事件。
步骤406、对于C区的触控事件,执行预设的特殊处理操作;对于A区的触控事件,执行正常处理操作。
在框架(Framework)层接收到上报事件(上报事件包括输入设备以及触控点各项参数等)后,首先根据输入设备的命名,识别所述上报事件对应的触控区域,如上一步骤中驱动层(kernel)识别是在C区触控,则驱动层上报到框架层的输入设备是input1,而不是用input0来上报,即,框架层不需要判断当前触点在哪一个分区,也不需要判断分区的大小和位置,这些判断操作在驱动层上完成,并且,驱动层除了上报具体是哪一个输入设备,还会上报该触控点的各项参数至框架层,例如按压时间,位置坐标,压力大小等等。
需要说明的是,框架层在接收到上报事件后,通过单通道转多通道的机制,上报到应用层。具体为:先注册一个通道,通过该通道传递该上报事件,通过监听器(1istener)监听该事件,将该事件通过不同的通道,传递至对应的应用模块,产生不同的应用操作,其中,应用模块包括摄像、联系人等常用应用;产生不同的应用操作,例如在摄像应用下,用户在C区点击,则会产生调焦,拍摄,调摄像参数等不同操作。要注意,上报事件传递到监听器之前,是单通道,监听器监听之后,上报事件走的是多通道,且多通道同时存在,其好处在于可同时传递至不同的应用模块,不同应用模块产生不同的响应操作。
可选地，上述步骤的具体实现为：利用面向对象化的方式，定义A区和C区的类别以及实现方式，在判断是C区后，通过EventHub函数将不同分辨率的触点坐标转化为LCD的坐标，定义单通道函数(例如serverchannel和clientchannel等)，该函数的作用是，当收到上报事件后，将该事件通过该通道传递至事件管理器(TouchEventManager)，通过监听器(listener)的监听，将该事件通过多通道同时或逐一传递至多个响应的应用模块下，也可以只传递给其中的一个应用模块，应用模块如camera、gallery等，不同应用模块产生相应的操作。当然，上述步骤的具体实现也可以为其他方式的步骤实现，本发明实施例对此不做限制。
结合参见图10,将以另一种方式对本发明的触控操作流程做进一步说明,为简化起见,图10中,将虚拟边框区域简称为C区,其他区域简称为A区,触控事件的上报流程如下:
驱动层通过物理硬件如触摸屏接收触控事件,并判断触控操作发生在A区还是C区,并通过A区或C区设备文件节点上报事件。Native层从A区、C区的设备文件中读取事件,并对A区、C区的事件进行处理,如坐标计算,通过设备ID对A、C区的事件进行区分,最后分别派发A区和C区事件。其中A区事件走原生流程,按通常的方式对A区事件进行处理;C区事件则从事先注册到Native层的C区专用通道进行派发,由Native端口输入,系统端口输出至C区事件结束系统服务,再通过C区事件接收对外接口上报至各应用。
请参阅图11,基于图4所述的触控操作方法,本发明还提出了一种C区的滑动识别方法,包括步骤:
步骤1101、当C区感测有与触点并发的触控事件的初始时刻，通过虚拟输入设备input0向上层上报该触点的初始坐标位置(downX,downY)和初始按下时刻信息(downTime)，并在触点移动过程中按照预设的周期向上层实时上报该触点的当前坐标位置(currentX,currentY)；上层记录下该信息作为后续的滑动判断依据。如图12所示，触摸屏的中间部分为A区，左右两侧的窄边为C区，灰色圆点代表C区中的触点。
步骤1102、上层根据触点的初始坐标位置和当前坐标位置信息判断触控事件是否为滑动事件,若是则继续执行下一步。
具体来说,为实现较为准确的判断,虚拟输入设备input0的上报周期可以设定为较短时间值,比如1/85秒。
该步骤中,判断与触点并发的触控事件是否为滑动事件的具体方法为:判断触点当前位置与初始位置的移动距离;若该移动距离超过预设阈值,则判定该触控事件为滑动事件,否则,判定该触控事件不是滑动事件。
触点的移动距离的计算公式为:
移动距离=√[(currentX-downX)²+(currentY-downY)²]
由于C区为虚拟边框区域，一般设置得比较窄，X轴方向的移动距离较小，可以忽略不计，因而上述触点的移动距离的计算公式可以简化为：
移动距离=|currentY-downY|。
步骤1103、对滑动事件,根据其触点的纵坐标变化信息判断其方向属性。
该步骤中,判断触点的滑动方向的方法具体为:比较触点当前位置和初始位置的Y轴坐标值的大小,若currentY>downY,则判定触点的滑动方向为向下,否则判定滑动方向为向上。
当然,方向属性不局限于向上或者向下,还可以为单次往复、多次往复等,其判断方法均可基于触点的Y轴坐标值的变化轨迹来实现。
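步骤1102至1103的滑动判断与方向判断逻辑可用如下Python草图示意(仅为辅助理解的假设性示例，阈值取值为假定值，并非本发明的实际实现)：

```python
def classify_touch_event(down_x, down_y, current_x, current_y, threshold=10):
    """根据触点初始位置与当前位置判断触控事件是否为滑动事件及其方向属性。

    由于C区较窄，X轴方向位移可忽略，移动距离简化为|currentY - downY|；
    threshold为假定的预设阈值(单位：像素)。
    """
    distance = abs(current_y - down_y)       # 简化后的移动距离
    if distance <= threshold:
        return None                          # 未超过阈值：非滑动事件
    # currentY > downY 表示触点向屏幕下方移动，判定为向下滑动
    return "down" if current_y > down_y else "up"
```

例如，触点从(5,100)移动到(6,180)时返回"down"，移动到(6,40)时返回"up"，仅移动到(6,105)时因未超过阈值返回None。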
通过上述实现方案可知,可仅在触摸屏的可触摸操作区域的一侧边缘位置划分出C区,也可以在两侧边缘分别划分出C区。
当触摸屏的可触摸操作区域的两侧边缘均划分有第一虚拟边框分区C区时,对于位于左右两侧的两个第一虚拟边框分区C区部分,可分别设定其触控事件对应不同的特殊处理方式,以提高操作方便性。为此,需要对触控事件发生在左侧C区部分还是右侧C区部分进行识别。
请参阅图13,本发明提出了一种不同虚拟边框区域内触控事件的识别方法,包括步骤:
步骤1301、当C区感测有与触点并发的触控事件时,通过虚拟输入设备input0周期性地向上层上报该触点的坐标位置(currentX,currentY);上层记录下该坐标信息作为后续的判断依据。
步骤1302、框架层或者应用层根据触点的X轴坐标值、左侧C区部分和右侧C区部分的位置判断触点所处区域。
具体判断方式为:若触点的X轴坐标满足0<currentX<CW1,则判定与触点并发的触控事件发生在左侧C区部分;若触点的X轴坐标满足(W-CW2)<currentX<W,则判定与触点并发的触控事件发生在右侧C区部分。其中,如图14所示,W是屏幕的宽度,CW1是左侧C区的宽度,CW2是右侧C区的宽度,CW1与CW2可以相同也可以不同。
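上述基于X轴坐标的区域判断可用如下Python草图示意(W、CW1、CW2的取值为假定示例值，并非本发明的实际实现)：

```python
def locate_c_area(current_x, screen_w, cw1, cw2):
    """根据触点X轴坐标判断滑动事件发生于左侧C区、右侧C区还是A区。

    screen_w为屏幕宽度W，cw1、cw2分别为左、右C区宽度。
    """
    if 0 < current_x < cw1:
        return "left"                        # 0 < currentX < CW1：左侧C区
    if (screen_w - cw2) < current_x < screen_w:
        return "right"                       # (W-CW2) < currentX < W：右侧C区
    return "a_area"                          # 其余触点落入A区
```

以假定的W=1080、CW1=CW2=40为例，currentX=10判定为左侧C区，currentX=1060判定为右侧C区，currentX=500则落入A区。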
请参阅图15,该图示出了本发明利用虚拟边框区域实现功能调节的方法,适用于触摸屏两侧边缘均划分有虚拟边框区域的情况下(为区别描述,以下称为左侧C区和右侧C区),包括步骤:
步骤1501、设定左侧C区为功能切换区、右侧C区为功能参数调整区;预先分别设置各种应用场景下,左侧C区可切换显示的适用于当前应用场景的功能项,以及每种功能项处于活跃状态时在右侧C区对应显示的功能参数调整控件。
本步骤中,左侧C区内可切换显示适用于当前应用场景的多种功能项;对于当前显示于左侧C区的功能项,设定该功能项处于活跃状态,此时右侧C区对应显示该功能项的功能参数调整控件;对于当前未显示于左侧C区的其他功能项,设定其处于隐身状态,此时右侧C区不显示其对应的功能参数调整控件。
步骤1502、接收与触点并发的触控事件。
步骤1503、判断所述触控事件是否为滑动事件,若是,则继续执行下一步。具体判断方法如图11所述。
步骤1504、判断该滑动事件所处区域，若位于左侧C区，则执行步骤1505；如位于右侧C区，则执行步骤1506。具体判断方法如图13所述。
步骤1505、判断该滑动事件的方向属性,若为向上滑动,则将左侧C区内当前显示的功能项切换为上一个功能项,与此同时,右侧C区内更新显示为对应的功能参数调整控件;若为向下滑动,则将左侧C区内当前显示的功能项切换为下一个功能项,与此同时,右侧C区内更新显示为对应的功能参数调整控件。之后,跳转至步骤1507执行。
本步骤中,通过在左侧C区滑动,可实现功能项的切换。
步骤1506、判断该滑动事件的方向属性,若为向上滑动,则将当前功能项对应的功能参数由初始参数向高/低调整;若为向下滑动,则将当前的功能参数由初始参数向低/高调整。
在退出当前应用后再次进入该应用时，优选地将左侧C区和右侧C区显示内容恢复到默认项，即不记录上一次应用结束时的显示内容。
通过图15所示的方法流程,用户可以通过在左侧C区上下滑动以切换当前应用场景下的多种功能项,以选择需要调整的功能项;并通过在右侧C区上下滑动以调整当前所选功能项的功能参数,方便快捷。下面将通过一个实例来对该方法进行说明。
例如:在图片处理应用场景下,其图片处理功能项可包括:缩放处理功能、对比度调整功能、模糊处理功能、亮度调节功能等。基于此,利用C区来实现图片处理功能的方法为:
1)在查看单张照片页面,左侧C区显示为默认的功能项,如缩放处理功能。
2)左手长按图片,右手在右侧C区从下往上滑动,则可以对当前图片进行放大;从上往下滑动则可以对当前图片进行缩小;
3)如果需要对对比度、模糊度或者亮度等其他功能项的功能参数进行调整，则可以在左侧C区滑动一次或者多次直至切换至所需操作的功能项，滑动一次切换一次。例如：从下往上滑动，可由当前的缩放处理功能项依次切换到对比度调整功能项、模糊处理功能项；从上往下滑动，可由当前的缩放处理功能项依次切换到亮度调节功能项及其他。为符合不同用户的使用习惯，可选择设置为可循环切换或者不可循环切换。
4)当选择了新的功能项后,比如模糊处理功能,则一只手长按图片,另一只手在右侧C区从下往上滑动时增大模糊度参数,在右侧C区从上往下滑动时减小模糊度参数。
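图15所示"左侧C区切换功能项、右侧C区调节参数"的流程可用如下Python草图示意(功能项列表、参数初始值与调节步长均为假定值，切换方式假定为可循环，并非本发明的实际实现)：

```python
class CAreaController:
    """示意：左侧C区上下滑切换功能项，右侧C区上下滑调节当前功能项参数。"""

    def __init__(self, items):
        self.items = items                        # 左侧C区可切换显示的功能项
        self.index = 0                            # 当前处于活跃状态的功能项下标
        self.params = {i: 50 for i in items}      # 各功能项参数的假定初始值

    def swipe_left_c(self, direction):
        """向上滑切换到上一个功能项，向下滑切换到下一个(循环切换)。"""
        step = -1 if direction == "up" else 1
        self.index = (self.index + step) % len(self.items)
        return self.items[self.index]

    def swipe_right_c(self, direction, step=5):
        """向上滑调高当前功能项参数，向下滑调低。"""
        item = self.items[self.index]
        self.params[item] += step if direction == "up" else -step
        return self.params[item]


ctrl = CAreaController(["缩放", "对比度", "模糊", "亮度"])
ctrl.swipe_left_c("down")                          # 由"缩放"切换到"对比度"
ctrl.swipe_right_c("up")                           # 对比度参数由50调高到55
```

与上述图片处理实例对应：左侧C区每滑动一次切换一次功能项，右侧C区滑动则只作用于当前活跃的功能项。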
相应地,如图16所述,本实施例还提供了一种移动终端,包括:
C区划分单元1610,用于在触摸屏的两侧边缘划分出左侧C区和右侧C区。具体地,该单元包括:C区固定划分单元1611,用于在驱动初始化时,定义所述虚拟边框区域的位置及尺寸,实现固定划分方式;C区设置接口1612,用于创建及修改虚拟边框区域的数量、位置及大小,由上层调用实现自定义划分方式。
功能设置单元1620,用于在初始化时,针对当前应用场景,预先设置左侧C区内可切换显示的各种功能项,以及在每种功能项处于显示状态时右侧C区内对应显示的功能参数调节控件。
底层上报单元1630,用于在感测到与触点并发的触控事件时,实时上报该触点的坐标位置信息。
滑动识别单元1640，用于根据底层上报单元1630所上报的触点的坐标位置信息，判断触控事件是否属于滑动事件，若是则进一步判断该滑动事件的方向属性及所处区域位置。
功能切换单元1650,用于在判定滑动事件发生于左侧C区时,根据其方向属性将左侧C区内当前显示的功能项按序切换至上一个或者下一个,同时控制右侧C区更新显示为对应的功能参数调节控件。
参数调节单元1660,用于在判定滑动事件发生于右侧C区时,根据其方向属性将当前的功能参数由初始值调高或者调低。
具体地,该滑动识别单元1640包括:
记录模块1641,用于记录触点的坐标位置信息;
滑动事件判断模块1642，用于根据触点的初始坐标位置和当前坐标位置计算触点的移动距离，通过比较该移动距离与预设的阈值以判断触控事件是否属于滑动事件；
滑动方向判断模块1643,用于通过比较触点在初始坐标位置和当前坐标位置的竖直方向的坐标值判定滑动事件的方向属性;
事件区域判断模块1644，用于根据触点的水平方向的坐标值、左侧C区和右侧C区的位置及尺寸信息，判断滑动事件发生于左侧C区还是右侧C区。
在上述实施例中,仅对C区的一种应用方法作了描述。在实际应用中,还可以利用C区实现其他各种各样的功能,来提升用户的使用体验。下面将举例描述。
若仅在触摸屏的一侧设置一条宽度较窄的C区,采用单边上下滑设置方式,可实现如下应用:
1)多任务切换功能:通过在C区进行向上滑动操作,A区的显示界面由当前应用界面切换至上一个应用界面;通过在C区进行向下滑动操作,A区的显示界面由当前应用界面切换至下一个应用界面,这样方便用户切换应用。
2)打开指定应用功能:预先为不同滑动方向分别设定对应的应用类型;当C区感测到向上/向下滑动事件时,打开预设的对应的一种应用。通过此功能,用户可根据自身的使用习惯,对使用频率最高的一种或两种应用进行设置,克服传统方式(从桌面上众多应用中找寻指定应用)所存在的耗时费力的缺陷。
3)返回功能:当C区感测到向上/向下滑动事件时,上层立即执行返回操作。通过此功能,用户无需再按显示屏底部的物理按键或者触摸屏上的虚拟返回按键,大大增加了操作的方便性。
4)多任务缩略图切换功能:在C区上下滑动时,于C区按序显示后台当前正在运行的各种应用的缩略图,抬起时启动对应的应用。
5)快速翻页功能:通过定义C区的触摸事件对应的处理方式为翻页,用户只需在C区轻轻滑动,即可实现前翻、后翻,大大方便了广大读者。
若在触摸屏的两侧分别设置一条宽度较窄的C区,采用双边上下滑方式, 可实现如下功能:
1)亮度调节功能:通过在一边C区内向上滑动操作来调高屏幕亮度,通过在另一边C区内向下滑动操作来调低屏幕亮度,用户不需在通过物理按键来调节,也无需进入设置界面来调节。
2)音量调节功能:通过在一边C区内向上滑动操作来调高音量,通过在另一边C区内向下滑动操作来调低音量。
3)显示隐藏内容:向上滑动操作时显示隐藏内容(如应用、图片、文件、短信等),向下滑动操作时隐藏相关内容。
若在触摸屏的单侧设置一条宽度较窄的C区,采用单边往返滑方式,可实现如下功能:
手机加速功能:当上下滑动达到两个来回的临界值时启动,用以清理后台应用以释放内存,并于处理结束时通知用户处理结果,适用于游戏玩家等。
需要说明的是,在本文中,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者装置不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者装置所固有的要素。在没有更多限制的情况下,由语句“包括一个......”限定的要素,并不排除在包括该要素的过程、方法、物品或者装置中还存在另外的相同要素。
上述本发明实施例序号仅仅为了描述,不代表实施例的优劣。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本发明的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质(如ROM/RAM、磁碟、光盘)中,包括若干指令用以使得一台终端设备(可以是手机,计算机,服务器,空调器,或者网络设备等)执行本发明各个实施例所述的方法。
以上仅为本发明的优选实施例,并非因此限制本发明的专利范围,凡是利用本发明说明书及附图内容所作的等效结构或等效流程变换,或直接或间接运用在其他相关的技术领域,均同理包括在本发明的专利保护范围内。

Claims (20)

  1. 一种利用虚拟边框区域实现功能调节的方法,所述虚拟边框区域包括第一虚拟边框分区和第二虚拟边框分区,所述方法包括步骤:
    感测与触点并发的触控事件,判断所述触控事件是否属于滑动事件;若是,则进一步判断该滑动事件的方向属性以及所处区域位置;
    若所述滑动事件发生于第一虚拟边框分区内,则根据其方向属性将第一虚拟边框分区内当前显示的功能项进行切换;若所述滑动事件发生于第二虚拟边框分区内,则根据其方向属性调节当前的功能项的参数。
  2. 如权利要求1所述利用虚拟边框区域实现功能调节的方法,其中,判断所述触控事件是否属于滑动事件的方法具体为:
    根据所述触点的初始坐标位置和当前坐标位置计算触点的移动距离;若该移动距离超过预设阈值,则判定所述触控事件属于滑动事件,否则,判定所述触控事件不属于滑动事件。
  3. 如权利要求1所述利用虚拟边框区域实现功能调节的方法,其中,判断所述滑动事件的方向属性的方法具体为:
    通过比较触点在初始坐标位置和当前坐标位置的竖直方向的坐标值判定所述滑动事件的方向属性。
  4. 如权利要求1所述利用虚拟边框区域实现功能调节的方法,其中,所述方法中,判断所述滑动事件所位区域位置的方法为:
    若该滑动事件的触点的X轴坐标值currentX满足0<currentX<CW1的条件,则判定该滑动事件发生在位于触摸屏左侧边缘的第一虚拟边框分区内;
    若触点的X轴坐标值currentX满足(W-CW2)<currentX<W,则判定该滑动事件发生在位于触摸屏右侧边缘的第二虚拟分区;
    其中,所述W为屏幕的宽度、CW1为第一虚拟边框分区的宽度,CW2是第二虚拟分区的宽度。
  5. 如权利要求1-4任一项所述利用虚拟边框区域实现功能调节的方法,其中,所述方法还包括:
    将移动终端的触摸屏的可触摸区域划分为第一虚拟边框分区和第二虚拟边框分区。
  6. 如权利要求5所述利用虚拟边框区域实现功能调节的方法,其中,
    所述第一虚拟边框分区和第二虚拟边框分区分别为:位于触摸屏一侧或者两侧边缘位置的第一虚拟边框分区,以及触摸屏上除第一虚拟边框分区以外的剩余可触摸操作区域为第二虚拟边框。
  7. 如权利要求6所述利用虚拟边框区域实现功能调节的方法,其中,所述方法还包括:
    当触摸屏的可触摸操作区域的两侧边缘均划分有第一虚拟边框分区时,对于位于左右两侧的两个第一虚拟边框分区,分别设定其触控事件对应不同的特殊处理方式。
  8. 如权利要求5所述利用虚拟边框区域实现功能调节的方法,其中,所述方法还包括:
    采用固定划分方式于触摸屏上划分所述虚拟边框区域,具体为:
    在驱动初始化时,定义所述虚拟边框区域的位置及尺寸。
  9. 如权利要求5所述利用虚拟边框区域实现功能调节的方法,其中,所述方法还包括:
    采用自由设定方式于触摸屏上划分所述虚拟边框区域,具体为:
    设置虚拟边框区域设置接口;
    通过调用所述虚拟边框区域设置接口以创建或修改所述虚拟边框区域的数量、位置及大小。
  10. 如权利要求1-4任一项所述利用虚拟边框区域实现功能调节的方法,其中,所述方法还包括:
    在注册好该第一虚拟边框分区或第二虚拟边框分区对应的两个虚拟输入设备后；
    基于不同的触摸分区,确定执行不同的处理方式。
  11. 一种移动终端,其触摸屏上划分有虚拟边框区域,该虚拟边框区域包括第一虚拟边框分区和第二虚拟边框分区,所述移动终端包括:
    底层上报单元,配置为在感测到与触点并发的触控事件时,实时上报该触点的坐标位置信息;
    滑动识别单元,配置为根据底层上报单元所上报的触点的坐标位置信息,判断所述触控事件是否属于滑动事件,若是则进一步判断该滑动事件的方向属性及所处区域位置;
    功能切换单元,配置为在判定所述滑动事件发生于第一虚拟边框分区时,根据其方向属性将第一虚拟边框分区内当前显示的功能项进行切换,同时控制第二虚拟边框分区更新显示为对应的功能参数调节控件;
    参数调节单元,配置为在判定所述滑动事件发生于第二虚拟边框分区时,根据其方向属性调节当前的功能项的功能参数。
  12. 如权利要求11所述的移动终端,其中,所述滑动识别单元进一步包括:
    记录模块,配置为记录所述底层上报单元上报的触点的坐标位置信息;
    滑动事件判断模块,配置为根据触点的初始坐标位置和当前坐标位置计算触点的移动距离,通过比较该移动距离与预设的阈值以判断所述触控事件是否属于滑动事件。
  13. 如权利要求12所述的移动终端,其中,所述滑动识别单元进一步包括:
    滑动方向判断模块,配置为通过比较触点在初始坐标位置和当前坐标位置的竖直方向的坐标值判定所述滑动事件的方向属性。
  14. 如权利要求13所述的移动终端,其中,所述滑动识别单元进一步包括:
    事件区域判断模块，配置为根据所述触点的水平方向的坐标值、所述第一虚拟边框分区和第二虚拟边框分区的位置及尺寸信息，判断所述滑动事件发生于第一虚拟边框分区还是第二虚拟边框分区。
  15. 如权利要求11-14任一项所述移动终端,其中,所述移动终端还包括:
    虚拟边框区域固定划分单元,配置为将移动终端的触摸屏的可触摸区域划分为第一虚拟边框分区和第二虚拟边框分区。
  16. 如权利要求15所述移动终端,其中,
    所述第一虚拟边框分区和第二虚拟边框分区分别为:位于触摸屏一侧或者两侧边缘位置的第一虚拟边框分区,以及触摸屏上除第一虚拟边框分区以外的剩余可触摸操作区域为第二虚拟边框。
  17. 如权利要求16所述移动终端,其中,
    虚拟边框区域固定划分单元,配置为当触摸屏的可触摸操作区域的两侧边缘均划分有第一虚拟边框分区时,对于位于左右两侧的两个第一虚拟边框分区,分别设定其触控事件对应不同的特殊处理方式。
  18. 如权利要求15所述的移动终端,其中,所述虚拟边框区域固定划分单元,配置为在驱动初始化时,定义所述虚拟边框区域的位置及尺寸。
  19. 如权利要求15所述的移动终端,其中,所述移动终端还包括:
    虚拟边框区域设置接口,配置为创建及修改虚拟边框区域的数量、位置及大小。
  20. 如权利要求15所述移动终端,其中,
    虚拟边框区域固定划分单元,配置为在注册好该第一虚拟边框分区或第二虚拟边框分区对应的两个虚拟输入设备;
    滑动识别单元,配置为根据虚拟输入设备的命名,以识别出当前用户触摸区域是第一虚拟边框分区或第二虚拟边框分区;
    参数调节单元,配置为基于不同的触摸分区,确定执行不同的处理方式。
PCT/CN2016/079794 2015-04-23 2016-04-20 移动终端及其利用虚拟边框区域实现功能调节的方法 WO2016169483A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/567,569 US20180113591A1 (en) 2015-04-23 2016-04-20 Method for realizing function adjustment by using a virtual frame region and mobile terminal thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510197042.3 2015-04-23
CN201510197042.3A CN104935725B (zh) 2015-04-23 2015-04-23 移动终端及其利用虚拟边框区域实现功能调节的方法

Publications (1)

Publication Number Publication Date
WO2016169483A1 true WO2016169483A1 (zh) 2016-10-27

Family

ID=54122685

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/079794 WO2016169483A1 (zh) 2015-04-23 2016-04-20 移动终端及其利用虚拟边框区域实现功能调节的方法

Country Status (3)

Country Link
US (1) US20180113591A1 (zh)
CN (1) CN104935725B (zh)
WO (1) WO2016169483A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107402634A (zh) * 2017-07-28 2017-11-28 歌尔科技有限公司 虚拟现实设备的参数调节方法及装置

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104935725B (zh) * 2015-04-23 2016-07-27 努比亚技术有限公司 移动终端及其利用虚拟边框区域实现功能调节的方法
CN105242852A (zh) * 2015-10-23 2016-01-13 努比亚技术有限公司 一种移动终端和控制光标移动的方法
CN105224210A (zh) * 2015-10-30 2016-01-06 努比亚技术有限公司 一种移动终端及其控制屏幕显示方向的方法
CN105653027B (zh) * 2015-12-24 2019-08-02 小米科技有限责任公司 页面缩放方法及装置
CN105786353A (zh) * 2016-02-19 2016-07-20 努比亚技术有限公司 锁屏界面调整方法和装置
CN105700709B (zh) * 2016-02-25 2019-03-01 努比亚技术有限公司 一种移动终端及控制移动终端不可触控区域的方法
CN105975192B (zh) * 2016-04-28 2019-04-19 努比亚技术有限公司 一种图像信息处理方法及移动终端
CN106896997B (zh) 2016-06-28 2020-11-10 创新先进技术有限公司 滑动控件控制方法及装置、滑块选择器
CN106201309B (zh) * 2016-06-29 2019-08-20 维沃移动通信有限公司 一种状态栏处理方法和移动终端
CN106201267A (zh) * 2016-07-09 2016-12-07 王静 一种设置多参数的方法
CN107665694B (zh) * 2016-07-29 2020-06-30 上海和辉光电有限公司 显示装置的亮度调整方法及系统
CN106325683A (zh) * 2016-08-31 2017-01-11 瓦戈科技(上海)有限公司 移动终端侧边功能栏的使用方法
CN106502569A (zh) * 2016-10-31 2017-03-15 珠海市魅族科技有限公司 一种功能调节方法及装置
CN106850984B (zh) * 2017-01-20 2020-09-01 努比亚技术有限公司 一种移动终端及其控制方法
US10691329B2 (en) * 2017-06-19 2020-06-23 Simple Design Ltd. User interface of media player application for controlling media content display
EP3680760A4 (en) * 2017-09-08 2020-10-14 Guangdong Oppo Mobile Telecommunications Corp., Ltd. KEY DISPLAY PROCEDURE, DEVICE AND TERMINAL
CN108650412B (zh) * 2018-04-25 2021-05-18 北京小米移动软件有限公司 一种显示方法、显示装置和计算机可读存储介质
CN108744494A (zh) * 2018-05-17 2018-11-06 Oppo广东移动通信有限公司 游戏应用控制方法、装置、存储介质及电子设备
CN110647259A (zh) * 2018-06-26 2020-01-03 青岛海信移动通信技术股份有限公司 一种触控显示装置及其震动方法
CN109085987B (zh) * 2018-07-10 2020-08-04 Oppo广东移动通信有限公司 设备控制方法、装置、存储介质及电子设备
CN110430296A (zh) * 2019-08-01 2019-11-08 深圳市闻耀电子科技有限公司 一种实现虚拟按键功能的方法、装置、终端及存储介质
CN111610912B (zh) * 2020-04-24 2023-10-10 北京小米移动软件有限公司 应用显示方法、应用显示装置及存储介质
CN111672115B (zh) * 2020-06-05 2022-09-23 腾讯科技(深圳)有限公司 虚拟对象控制方法、装置、计算机设备及存储介质
CN115312009A (zh) * 2021-05-07 2022-11-08 海信视像科技股份有限公司 图像显示方法和装置

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040012572A1 (en) * 2002-03-16 2004-01-22 Anthony Sowden Display and touch screen method and apparatus
CN101432677A (zh) * 2005-03-04 2009-05-13 苹果公司 具有显示器和用于用户界面及控制的周围触摸敏感边框的电子设备
CN102316194A (zh) * 2011-09-09 2012-01-11 深圳桑菲消费通信有限公司 手机、手机的用户交互方法及装置
CN104935725A (zh) * 2015-04-23 2015-09-23 努比亚技术有限公司 移动终端及其利用虚拟边框区域实现功能调节的方法

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103092494A (zh) * 2011-10-28 2013-05-08 腾讯科技(深圳)有限公司 触摸屏终端的应用程序切换方法和装置
KR20150019165A (ko) * 2013-08-12 2015-02-25 엘지전자 주식회사 이동 단말기 및 이의 제어방법
CN103458122B (zh) * 2013-08-30 2015-08-12 广东欧珀移动通信有限公司 一种快速便捷的手势截屏方法和装置
CN103558970A (zh) * 2013-10-29 2014-02-05 广东欧珀移动通信有限公司 音量调节方法
CN203883901U (zh) * 2013-12-23 2014-10-15 上海斐讯数据通信技术有限公司 基于侧边触摸屏模块调节屏幕亮度和显示比例的手机
KR102298972B1 (ko) * 2014-10-21 2021-09-07 삼성전자 주식회사 전자 장치의 엣지에서 수행되는 제스처를 기반으로 하는 동작 수행
CN104348978A (zh) * 2014-11-12 2015-02-11 天津三星通信技术研究有限公司 移动终端的通话处理方法和装置
US10140013B2 (en) * 2015-02-13 2018-11-27 Here Global B.V. Method, apparatus and computer program product for calculating a virtual touch position

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040012572A1 (en) * 2002-03-16 2004-01-22 Anthony Sowden Display and touch screen method and apparatus
CN101432677A (zh) * 2005-03-04 2009-05-13 苹果公司 具有显示器和用于用户界面及控制的周围触摸敏感边框的电子设备
CN102316194A (zh) * 2011-09-09 2012-01-11 深圳桑菲消费通信有限公司 手机、手机的用户交互方法及装置
CN104935725A (zh) * 2015-04-23 2015-09-23 努比亚技术有限公司 移动终端及其利用虚拟边框区域实现功能调节的方法

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107402634A (zh) * 2017-07-28 2017-11-28 歌尔科技有限公司 虚拟现实设备的参数调节方法及装置

Also Published As

Publication number Publication date
US20180113591A1 (en) 2018-04-26
CN104935725A (zh) 2015-09-23
CN104935725B (zh) 2016-07-27

Similar Documents

Publication Publication Date Title
WO2016169483A1 (zh) 移动终端及其利用虚拟边框区域实现功能调节的方法
WO2016155454A1 (zh) 移动终端及其虚拟边框区域的滑动识别方法
WO2016155550A1 (zh) 无边框终端的应用切换方法及无边框终端
WO2017071424A1 (zh) 一种移动终端和分享文件的方法
WO2016169524A1 (zh) 快速调节屏幕亮度的方法和装置、移动终端、存储介质
WO2017143847A1 (zh) 关联应用分屏显示装置、方法及终端
WO2016029766A1 (zh) 移动终端及其操作方法和计算机存储介质
WO2016173414A1 (zh) 移动终端及其应用程序的快速启动方法和装置
WO2016169480A1 (zh) 移动终端控制方法、装置及计算机存储介质
WO2016155424A1 (zh) 移动终端的应用切换方法及移动终端及计算机存储介质
WO2016155423A1 (zh) 调节设置参数的方法及终端、计算机存储介质
WO2016034055A1 (zh) 移动终端及其操作方法、计算机存储介质
WO2016173468A1 (zh) 组合操作方法和装置、触摸屏操作方法及电子设备
WO2016155597A1 (zh) 基于无边框终端的应用控制方法及装置
WO2016155509A1 (zh) 移动终端的握持方式判断方法及装置
WO2016119648A1 (zh) 移动终端防误触控方法及装置
WO2016119635A1 (zh) 移动终端防误触控方法及装置
WO2017143855A1 (zh) 具有截屏功能的装置和截屏方法
WO2017071599A1 (zh) 一种分屏大小实时调整方法、分屏装置及计算机存储介质
WO2016161986A1 (zh) 操作识别方法、装置、移动终端及计算机存储介质
WO2016155434A1 (zh) 移动终端的握持识别方法及装置、存储介质和终端
WO2017012385A1 (zh) 快速启动应用的方法、装置和终端
WO2016155431A1 (zh) 终端及其触控操作方法和装置、存储介质
WO2017071456A1 (zh) 一种终端的处理方法、终端及存储介质
CN106445352B (zh) 移动终端的边缘触控装置及方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16782630

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15567569

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 160418)

122 Ep: pct application non-entry in european phase

Ref document number: 16782630

Country of ref document: EP

Kind code of ref document: A1