WO2016155454A1 - Mobile terminal and sliding recognition method for a virtual bezel area of a mobile terminal - Google Patents

Mobile terminal and sliding recognition method for a virtual bezel area of a mobile terminal

Info

Publication number
WO2016155454A1
Authority
WO
WIPO (PCT)
Prior art keywords
sliding
contact
area
virtual
coordinate position
Prior art date
Application number
PCT/CN2016/075349
Other languages
English (en)
Chinese (zh)
Inventor
迟建华
陈伟韬
Original Assignee
努比亚技术有限公司 (Nubia Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 努比亚技术有限公司 (Nubia Technology Co., Ltd.)
Publication of WO2016155454A1 publication Critical patent/WO2016155454A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning

Definitions

  • the present invention relates to the field of communications technologies, and in particular, to a mobile terminal and a sliding identification method thereof.
  • a narrow bezel or bezel-less design gives the user visual impact, but it also poses a problem: when holding the phone, it is easy to accidentally touch the edge of the screen, resulting in many erroneous operations.
  • an anti-mistouch design scheme is therefore proposed for bezel-less or narrow-bezel mobile terminals: an area at the edge of the touch screen is divided off and virtualized as a border area, and the system applies special handling (such as ignoring them) to the contact operations reported in the virtual border area, so as to prevent false touches. Accordingly, the related problems of the virtual border area need to be solved.
  • the main purpose of the present invention is to provide a mobile terminal and a sliding recognition method for its virtual bezel area, which effectively recognize a sliding operation and its sliding direction in the virtual bezel area.
  • the present invention provides a sliding identification method for a virtual bezel area, including the steps of:
  • the method for determining whether the touch event belongs to a sliding event is specifically:
  • the method for determining the direction attribute of the sliding event is specifically:
  • the direction attribute of the sliding event is determined by comparing coordinate values of the contact in the vertical direction of the initial coordinate position and the current coordinate position.
  • the method includes the steps of dividing a virtual border area on the touch screen by using a fixed dividing manner:
  • the position and width of the virtual border area are defined.
  • the method includes the steps of dividing a virtual border area on the touch screen by using a free setting manner:
  • the method further includes: calling the virtual border area setting interface to create or modify the number, location, and size of the virtual border area applicable to the current application scene, respectively, for different application scenarios.
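Taken together, the steps above amount to a threshold test on the contact's moving distance followed by a comparison of vertical coordinates. The following Python sketch illustrates the idea; the threshold value, function names, and coordinate convention (y increasing downward, as on a touch screen) are assumptions, not the patent's implementation.

```python
# Illustrative sketch (not the patent's implementation): recognizing a
# sliding event in the virtual border area from reported contact positions.
import math

SLIDE_THRESHOLD = 20.0  # assumed minimum moving distance, in pixels


def is_sliding_event(initial, current, threshold=SLIDE_THRESHOLD):
    """Compare the contact's moving distance against a preset threshold."""
    dx = current[0] - initial[0]
    dy = current[1] - initial[1]
    return math.hypot(dx, dy) >= threshold


def sliding_direction(initial, current):
    """Determine the direction attribute by comparing the vertical (y-axis)
    coordinate values of the initial and current positions. Screen
    coordinates are assumed to grow downward."""
    return "down" if current[1] > initial[1] else "up"


# Example: a contact starting at (5, 100) moving to (6, 160)
start, now = (5, 100), (6, 160)
if is_sliding_event(start, now):
    print(sliding_direction(start, now))  # a downward slide
```

A plain tap barely moves, so its distance stays below the threshold and it is never classified as a slide; only sustained movement produces a direction attribute.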
  • the present invention also provides a mobile terminal, including a touch screen, and further comprising:
  • the bottom layer reporting unit is configured to report an initial coordinate position of the contact when a touch event generated by the contact is sensed in the virtual border area, and to report the current coordinate position of the contact in real time while the contact moves;
  • the sliding recognition unit is configured to determine whether the touch event belongs to a sliding event according to an initial coordinate position and a current coordinate position information of the contact reported by the bottom reporting unit, and if yes, further determine a direction attribute of the sliding event.
  • the sliding recognition unit further includes:
  • a recording module configured to record initial coordinate position information of the contact reported by the bottom reporting unit
  • a sliding event judging module configured to calculate a moving distance of the contact according to the initial coordinate position of the contact and the current coordinate position, and compare the moving distance with a preset threshold to determine whether the touch event belongs to a sliding event;
  • a sliding direction judging module configured to determine the direction attribute of the touch event by comparing the coordinate values of the contact in the vertical direction at the initial coordinate position and the current coordinate position.
  • the mobile terminal further includes:
  • the virtual bezel area fixed dividing unit is configured to define a position and a width of the virtual bezel area when the driver is initialized.
  • the mobile terminal further includes:
  • the virtual border area setting interface is used to create and modify the number, position and size of the virtual border area.
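As a rough illustration of such a setting interface, the sketch below keeps a per-application-scene list of border areas that can be created or replaced; all names, types, and dimensions are hypothetical, not taken from the patent.

```python
# Hypothetical sketch of a "virtual border area setting interface" that can
# create or modify the number, position, and size of C areas per application
# scene. All names and values are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class BorderArea:
    x: int       # left edge of the strip, in pixels
    y: int       # top edge of the strip, in pixels
    width: int
    height: int


@dataclass
class BorderAreaSettings:
    # one list of C areas per application scene name
    scenes: dict = field(default_factory=dict)

    def set_areas(self, scene, areas):
        """Create or replace the C areas for an application scene."""
        self.scenes[scene] = list(areas)

    def areas_for(self, scene):
        return self.scenes.get(scene, [])


settings = BorderAreaSettings()
# Desktop scene: narrow strips on both side edges of a 1080-wide screen
settings.set_areas("desktop", [BorderArea(0, 0, 40, 1920),
                               BorderArea(1040, 0, 40, 1920)])
# Camera scene: only the right strip is treated as a border
settings.set_areas("camera", [BorderArea(1040, 0, 40, 1920)])
print(len(settings.areas_for("desktop")))  # 2
```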
  • the invention also provides a sliding recognition method for a virtual border area, comprising:
  • if the touch event belongs to a sliding event, the direction attribute of the sliding event is further determined.
  • the present invention also provides a mobile terminal, including a touch screen, and further comprising:
  • the bottom layer reporting unit is configured to report the initial coordinate position of the contact when a touch event generated by the contact in the virtual border area is sensed, and to report the current coordinate position of the contact in real time during the movement of the contact;
  • the sliding recognition unit is configured to determine whether the touch event belongs to a sliding event according to an initial coordinate position and a current coordinate position information of the contact reported by the bottom reporting unit, and if the touch event belongs to a sliding event, further determining The direction attribute of the sliding event.
  • the solution proposed by the invention can accurately recognize the sliding action and sliding direction of a contact; by setting a threshold value, only sliding whose moving distance exceeds the threshold is recognized as an effective sliding action, which improves recognition accuracy and avoids recognition errors caused by misoperation.
  • it is applicable to various mobile terminals that set a virtual border area using either the fixed division mode or the free setting mode.
  • FIG. 1 is a schematic structural diagram of hardware of a mobile terminal that implements various embodiments of the present invention
  • FIG. 2 is a schematic diagram of a wireless communication system of the mobile terminal shown in FIG. 1;
  • FIG. 3 is a schematic diagram of a conventional touch screen division manner of a mobile terminal
  • FIG. 4 is a flowchart of a method for judging slippage of a C zone according to a first embodiment of the present invention
  • FIG. 5 is a schematic diagram of dividing a C area by a fixed manner in a second embodiment of the present invention.
  • FIG. 6 is a flowchart of a method for judging slippage of a C zone according to a second embodiment of the present invention.
  • FIG. 7 is a schematic diagram showing a display effect of a touch screen according to a second embodiment of the present invention.
  • Figure 8 is a schematic view showing the movement of the contact of the C zone in the second embodiment of the present invention.
  • FIG. 9 is a block diagram of a C-zone event processing system in a second embodiment of the present invention.
  • FIG. 10 is a schematic structural diagram of a mobile terminal according to a second embodiment of the present invention.
  • FIG. 11 is a schematic diagram of dividing a C area by a free setting method in a third embodiment of the present invention.
  • FIG. 12 is a schematic diagram showing a display effect of a touch screen under a desktop of a system according to a third embodiment of the present invention.
  • FIG. 13 is a schematic diagram showing a display effect of a touch screen in a camera application scenario according to a third embodiment of the present invention.
  • the mobile terminal can be implemented in various forms.
  • the terminal described in the present invention may include mobile terminals such as mobile phones, smartphones, notebook computers, digital broadcast receivers, personal digital assistants (PDAs), tablet computers (PADs), portable multimedia players (PMPs), navigation devices, and the like, as well as fixed terminals such as digital TVs and desktop computers.
  • FIG. 1 is a schematic diagram showing the hardware structure of a mobile terminal that implements various embodiments of the present invention.
  • the mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like.
  • Figure 1 illustrates a mobile terminal having various components, but it should be understood that not all illustrated components are required to be implemented. More or fewer components can be implemented instead. The elements of the mobile terminal will be described in detail below.
  • Wireless communication unit 110 typically includes one or more components that permit radio communication between mobile terminal 100 and a wireless communication system or network.
  • the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
  • the broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel.
  • the broadcast channel can include a satellite channel and/or a terrestrial channel.
  • the broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to the terminal.
  • the broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like.
  • the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal.
  • the broadcast associated information may also be provided via a mobile communication network, and in this case, the broadcast associated information may be received by the mobile communication module 112.
  • the broadcast signal may exist in various forms, for example, it may exist in the form of Digital Multimedia Broadcasting (DMB) Electronic Program Guide (EPG), Digital Video Broadcasting Handheld (DVB-H) Electronic Service Guide (ESG), and the like.
  • the broadcast receiving module 111 can receive a signal broadcast by using various types of broadcast systems.
  • the broadcast receiving module 111 can receive digital broadcasts by using digital broadcast systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), the data broadcast system of Media Forward Link Only (MediaFLO), Integrated Services Digital Broadcasting-Terrestrial (ISDB-T), and the like.
  • the broadcast receiving module 111 can be constructed as various broadcast systems suitable for providing broadcast signals as well as the above-described digital broadcast system.
  • the broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
  • the mobile communication module 112 transmits the radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server.
  • Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received in accordance with text and/or multimedia messages.
  • the wireless internet module 113 supports wireless internet access of the mobile terminal.
  • the module can be internally or externally coupled to the terminal.
  • the wireless Internet access technologies involved in the module may include WLAN (Wireless LAN, Wi-Fi), Wibro (Wireless Broadband), WiMAX (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like.
  • the short range communication module 114 is a module for supporting short range communication.
  • Some examples of short-range communication technology include Bluetooth™, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee™, and the like.
  • the location information module 115 is a module for checking or acquiring location information of the mobile terminal.
  • a typical example of a location information module is GPS (Global Positioning System).
  • the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information to accurately calculate three-dimensional current position information based on longitude, latitude, and altitude.
  • the method for calculating position and time information uses three satellites and corrects the calculated position and time information errors by using another satellite.
  • the GPS module 115 is capable of calculating speed information by continuously calculating current position information in real time.
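As a worked illustration of deriving speed from successively calculated positions, the sketch below estimates average speed from two consecutive latitude/longitude fixes via the haversine great-circle distance. This is not the GPS module's actual algorithm; the constants and names are assumptions.

```python
# Illustrative sketch: estimating speed from two consecutive GPS fixes by
# great-circle (haversine) distance over elapsed time.
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, metres


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))


def speed_mps(fix_a, fix_b):
    """Each fix is (lat, lon, timestamp_s); returns average speed in m/s."""
    d = haversine_m(fix_a[0], fix_a[1], fix_b[0], fix_b[1])
    dt = fix_b[2] - fix_a[2]
    return d / dt if dt > 0 else 0.0
```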
  • the A/V input unit 120 is for receiving an audio or video signal.
  • the A/V input unit 120 may include a camera 121 and a microphone 122; the camera 121 processes image data of still pictures or video obtained by an image capture device in a video capturing mode or an image capturing mode.
  • the processed image frame can be displayed on the display unit 151.
  • the image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the configuration of the mobile terminal.
  • the microphone 122 can receive sound (audio data) via a microphone in an operation mode of a telephone call mode, a recording mode, a voice recognition mode, and the like, and can process such sound as audio data.
  • in the case of the phone call mode, the processed audio (voice) data can be converted into a format that can be transmitted to a mobile communication base station via the mobile communication module 112 and output.
  • the microphone 122 can implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated during the process of receiving and transmitting audio signals.
  • the user input unit 130 may generate key input data according to a command input by the user to control various operations of the mobile terminal.
  • the user input unit 130 allows the user to input various types of information, and may include a keypad, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. due to contact), a jog wheel, a jog switch, and the like.
  • in particular, when the touch pad is superposed on the display unit 151 in the form of a layer, a touch screen can be formed.
  • the sensing unit 140 detects the current state of the mobile terminal 100 (e.g., the open or closed state of the mobile terminal 100), the location of the mobile terminal 100, the presence or absence of contact (i.e., touch input) by the user with the mobile terminal 100, the orientation of the mobile terminal 100, and the acceleration or deceleration movement and direction of the mobile terminal 100, and generates commands or signals for controlling the operation of the mobile terminal 100.
  • the sensing unit 140 can sense whether the slide type phone is turned on or off.
  • the sensing unit 140 can detect whether the power supply unit 190 provides power or whether the interface unit 170 is coupled to an external device.
  • Sensing unit 140 may include proximity sensor 1410 which will be described below in connection with a touch screen.
  • the interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, a headphone port, and the like.
  • the identification module may store various information for verifying a user's use of the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like.
  • the device having the identification module may take the form of a smart card, and thus the identification device may be connected to the mobile terminal 100 via a port or other connection device.
  • the interface unit 170 can be configured to receive input from an external device (e.g., data information, power, etc.) and transmit the received input to one or more components within the mobile terminal 100, or can be used to transfer data between the mobile terminal and an external device.
  • the interface unit 170 may serve as a path through which power is supplied from a base to the mobile terminal 100, or as a path through which various command signals input from the base are transmitted to the mobile terminal. Various command signals or power input from the base can serve as signals for recognizing whether the mobile terminal is accurately mounted on the base.
  • Output unit 150 is configured to provide an output signal (eg, an audio signal, a video signal, an alarm signal, a vibration signal, etc.) in a visual, audio, and/or tactile manner.
  • the output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
  • the display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 can display a user interface (UI) or a graphical user interface (GUI) related to a call or other communication (eg, text messaging, multimedia file download, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or image and related functions, and the like.
  • the display unit 151 can function as an input device and an output device.
  • the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like.
  • Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as a transparent display, and a typical transparent display may be, for example, a TOLED (Transparent Organic Light Emitting Diode) display or the like.
  • the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown).
  • the touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
  • the audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 when the mobile terminal is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, and the like.
  • the audio signal is output as sound.
  • the audio output module 152 can provide audio output (eg, call signal reception sound, message reception sound, etc.) associated with a particular function performed by the mobile terminal 100.
  • the audio output module 152 can include a speaker, a buzzer, and the like.
  • the alarm unit 153 can provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to an audio or video output, the alarm unit 153 can provide an output in a different manner to notify of the occurrence of an event. For example, the alarm unit 153 can provide an output in the form of vibration; when a call, message, or some other incoming communication is received, the alarm unit 153 can provide a tactile output (i.e., vibration) to notify the user of it. By providing such a tactile output, the user is able to recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide an output notifying of the event occurrence via the display unit 151 or the audio output module 152.
  • the memory 160 may store a software program or the like for processing and control operations performed by the controller 180, or may temporarily store data (for example, a phone book, a message, a still image, a video, etc.) that has been output or is to be output. Moreover, the memory 160 can store data regarding vibrations and audio signals of various manners that are output when a touch is applied to the touch screen.
  • the memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like.
  • the mobile terminal 100 can cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
  • the controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing associated with voice calls, data communications, video calls, and the like. Additionally, the controller 180 can include a multimedia module 1810 for reproducing (or playing back) multimedia data, which can be constructed within the controller 180 or can be configured to be separate from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
  • the power supply unit 190 receives external power or internal power under the control of the controller 180 and provides appropriate power required to operate the various components and components.
  • the various embodiments described herein can be implemented in a computer readable medium using, for example, computer software, hardware, or any combination thereof.
  • the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and electronic units designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180.
  • implementations such as procedures or functions may be implemented with separate software modules that permit the execution of at least one function or operation.
  • the software code can be implemented by a software application (or program) written in any suitable programming language, and may be stored in the memory 160 and executed by the controller 180.
  • the mobile terminal has been described in terms of its function.
  • a slide type mobile terminal among various types of mobile terminals such as folding type, bar type, swing type, and slide type mobile terminals will be described as an example. However, the present invention can be applied to any type of mobile terminal, and is not limited to a slide type mobile terminal.
  • the mobile terminal 100 as shown in FIG. 1 may be configured to operate using a communication system such as a wired and wireless communication system and a satellite-based communication system that transmits data via frames or packets.
  • Such communication systems may use different air interfaces and/or physical layers.
  • air interfaces used by communication systems include, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), Global System for Mobile Communications (GSM), and the like.
  • the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
  • a CDMA wireless communication system can include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, a base station controller (BSC) 275, and a mobile switching center (MSC) 280.
  • the MSC 280 is configured to interface with a public switched telephone network (PSTN) 290.
  • the MSC 280 is also configured to interface with a BSC 275 that can be coupled to the base station 270 via a backhaul line.
  • the backhaul line can be constructed in accordance with any of a number of well known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be appreciated that the system as shown in FIG. 2 can include multiple BSCs 275.
  • Each BS 270 can serve one or more partitions (or regions), each of which is covered by a multi-directional antenna or an antenna directed to a particular direction radially away from the BS 270. Alternatively, each partition may be covered by two or more antennas for diversity reception. Each BS 270 can be configured to support multiple frequency allocations, and each frequency allocation has a particular frequency spectrum (eg, 1.25 MHz, 5 MHz, etc.).
  • BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or other equivalent terminology.
  • the term "base station” can be used to generally refer to a single BSC 275 and at least one BS 270.
  • a base station can also be referred to as a "cell station".
  • each partition of a particular BS 270 may be referred to as a plurality of cellular stations.
  • a broadcast transmitter (BT) 295 transmits a broadcast signal to the mobile terminal 100 operating within the system.
  • a broadcast receiving module 111 as shown in FIG. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295.
  • the satellite 300 helps locate at least one of the plurality of mobile terminals 100.
  • a plurality of satellites 300 are depicted, but it is understood that useful positioning information can be obtained using any number of satellites.
  • the GPS module 115 as shown in Figure 1 is typically configured to cooperate with the satellite 300 to obtain desired positioning information. Instead of GPS tracking technology or in addition to GPS tracking technology, other techniques that can track the location of the mobile terminal can be used. Additionally, at least one GPS satellite 300 can selectively or additionally process satellite DMB transmissions.
  • BS 270 receives reverse link signals from various mobile terminals 100.
  • Mobile terminal 100 typically participates in calls, messaging, and other types of communications.
  • Each reverse link signal received by a particular base station 270 is processed within a particular BS 270.
  • the obtained data is forwarded to the relevant BSC 275.
  • the BSC provides call resource allocation and coordinated mobility management functions including a soft handoff procedure between the BSs 270.
  • the BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290.
  • PSTN 290 interfaces with MSC 280, which forms an interface with BSC 275, and BSC 275 controls BS 270 accordingly to transmit forward link signals to mobile terminal 100.
  • the touch screen of the conventional mobile terminal is divided into a touchable operation area (hereinafter referred to as A' area) and a key area (hereinafter referred to as B area).
  • the A' area is a touchable operation area for detecting touch point coordinates
  • the B area is a key area for detecting a menu key, a Home key, a return key, and the like.
  • the present invention proposes two anti-mistouch schemes: one divides the virtual border area (hereinafter referred to as the C area) within the A' area in a fixed manner, and the other divides the C area within the A' area in a freely configurable manner.
  • touch operations in the A area that remains after the C area is divided off are still handled in the existing normal manner; touch operations in the divided C area can be given a special processing mode, such as being ignored or receiving special-effect processing. This will be described in detail below.
  • the sliding identification method of the virtual frame area in this embodiment is applicable to the two division modes of the C area, and specifically includes the following steps:
  • The C area is divided in a fixed manner as follows: a partial region at the edge of the original A′ area of the touch screen is set aside as the virtual border area (i.e., the C area), and the remaining region of the A′ area is referred to as the A area.
  • The C area includes a partial region on the left side and a partial region on the right side, fixedly disposed at the two side edges of the A′ area, as shown in FIG.
  • The sliding recognition method for the C area includes the following steps:
  • After the two virtual input devices are registered, the upper layer identifies whether the current user touch is in the C area or the A area according to the name of the virtual input device reported by the driver layer, and applies a different processing mode to each; the upper-layer processing is introduced below.
  • The upper layer in the present invention generally refers to the framework layer, the application layer, and the like.
  • A customized system such as Android or iOS usually includes a bottom layer (physical layer, driver layer) and an upper layer (framework layer, application layer).
  • The signal flow direction is: the physical layer (touch panel) receives the user's touch operation and converts the physical press into an electrical signal (TP signal), which is transmitted to the driver layer; the driver layer analyzes the pressed position to obtain the specific coordinates, duration, pressure, and other parameters of the point, and uploads them to the framework layer.
  • Communication between the framework layer and the driver layer can be realized through corresponding interfaces.
  • The framework layer receives events from the driver layer's input devices and parses them, choosing whether or not to respond to each input device, and passes valid input up to the specific application concerned, so that the application layer can perform different operations according to different events.
  • When a contact is pressed in the C area, its initial coordinate position (downX, downY) and initial press time (downTime) are reported to the upper layer through the virtual input device input0, and the upper layer (the system framework layer or application layer) records this information as the basis for the subsequent sliding judgment.
  • As shown in the figure, the middle portion of the touch screen is the A area, the narrow strips on the left and right sides are the C area, and the gray dots represent contacts in the C area.
  • A touch operation is usually a tap, a slide, or the like, and each touch operation consists of one or more contacts, so the mobile terminal can determine whether a touch operation occurred in the C area or the A area by detecting which area its contacts fall in.
  • After the framework layer receives a reported event (which includes the input device, the parameters of the touch point, and so on), it first identifies which region the event belongs to from the name of the input device. For example, if the driver layer (kernel) determines that the contact is in the C area, the input device reported to the framework layer is input0 rather than input1; that is, the framework layer does not need to determine which partition the current contact is in, nor the size and position of the partitions.
  • That judgment is completed in the driver layer, which reports not only which input device the event belongs to but also the various parameters of the contact, such as press time, position coordinates, and pressure.
  • The driver layer determines whether an event occurs in the A area or the C area and reports it to the framework layer through the corresponding input device; the framework layer determines whether it is an A-area event or a C-area event from the name of the input device: for example, input0 carries C-area events and input1 carries A-area events.
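The reporting scheme above can be sketched as a name-based dispatch. The following minimal Python illustration is a sketch under stated assumptions, not the actual Android framework code; only the device names input0 and input1 come from the text, and all function names are invented for illustration.

```python
# Sketch: the driver layer has already decided which area a contact
# belongs to, so the framework layer only inspects the reporting
# device's name and never recomputes partition geometry.

def dispatch_event(device_name, event, on_c_area, on_a_area):
    """Route a reported touch event by input-device name.

    input0 carries C-area (virtual border) events,
    input1 carries ordinary A-area events.
    """
    if device_name == "input0":
        return on_c_area(event)   # special processing (ignore, effects, ...)
    elif device_name == "input1":
        return on_a_area(event)   # existing normal processing
    raise ValueError("unknown input device: " + device_name)

# The framework never determines which partition the contact is in:
handled = dispatch_event("input0",
                         {"x": 5, "y": 300},
                         on_c_area=lambda e: ("C", e),
                         on_a_area=lambda e: ("A", e))
```

The design point is visible here: the area judgment lives entirely in the driver layer, so the framework's dispatch is a constant-time name comparison.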
  • During the movement of the contact, the virtual input device input0 reports the current coordinate position (currentX, currentY) of the contact to the upper layer in real time at a preset period; meanwhile, the upper layer determines from the contact's initial coordinate position and current coordinate position whether the touch event is a sliding event and, if so, further determines its direction attribute and performs the preset special processing accordingly, such as ignoring the event or generating a special effect.
  • The reporting period of the virtual input device input0 can be set to a short time value, such as 1/85 second.
  • In step 604, whether the touch event generated by the contact is a sliding event is determined as follows: calculate the moving distance between the current position and the initial position of the contact; if the moving distance exceeds a preset threshold, the touch event is determined to be a sliding event; otherwise, it is determined not to be a sliding event.
  • In step 604, the sliding direction of the contact is determined by comparing the Y-axis coordinate of the current position with that of the initial position: if currentY > downY, the sliding direction is determined to be downward; otherwise, it is determined to be upward.
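The threshold and direction tests of step 604 can be sketched as follows. This is a hedged Python illustration: the text does not specify how the moving distance is computed or what the threshold value is, so the Euclidean distance and the 20-pixel figure below are assumptions.

```python
import math

SLIDE_THRESHOLD = 20.0  # pixels; illustrative value, not specified in the text

def is_sliding(down_x, down_y, current_x, current_y, threshold=SLIDE_THRESHOLD):
    """A touch event becomes a sliding event once the contact has moved
    farther than the preset threshold from its initial position."""
    moved = math.hypot(current_x - down_x, current_y - down_y)
    return moved > threshold

def slide_direction(down_y, current_y):
    """Screen Y grows downward, so currentY > downY means a downward slide."""
    return "down" if current_y > down_y else "up"

# A contact pressed at (10, 100) and now at (12, 160):
assert is_sliding(10, 100, 12, 160)          # moved ~60 px, above threshold
assert slide_direction(100, 160) == "down"
assert not is_sliding(10, 100, 12, 103)      # jitter below threshold is ignored
```

The threshold is what filters out accidental grazes along the border: small wobbles of a resting grip never cross it, so they are never classified as effective slides.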
  • The framework layer reports to the application layer through a single-channel to multi-channel mechanism. Specifically, a channel is first registered; the reported event is transmitted through the channel and detected by a listener, and is then delivered through different channels to the corresponding application modules to produce different application operations.
  • The application modules include common applications such as the camera and contacts, and different application operations are produced; for example, in the camera application, a tap in a special zone may produce operations such as focusing, shooting, or adjusting camera parameters.
  • Before the reported event is delivered to the listener, it travels on a single channel; after the listener detects it, the event is distributed over multiple channels that exist simultaneously. The advantage is that the event can be delivered to different application modules at the same time, with each module producing its own response operation.
  • A specific implementation of the foregoing steps is: in an object-oriented manner, define the categories and implementations of the C area and the A area; after determining that the area is the C area, convert the touch-point coordinates for different resolutions using the EventHub function.
  • The function of EventHub is, after receiving a reported event, to pass the event through a channel to the event manager (TouchEventManager) and, via the listeners, to deliver the event over multiple channels, simultaneously or one by one, to the multiple responding application modules, or to only one of them (such as the camera or gallery), with each application module producing the corresponding operation.
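The single-channel-in, multi-channel-out delivery described above is essentially an observer pattern. A minimal sketch follows; apart from the name TouchEventManager, which appears in the text, every name and signature here is an assumption rather than the patent's actual implementation.

```python
class TouchEventManager:
    """Receives each reported event on a single channel and fans it out
    to every registered application-module listener (multi-channel)."""

    def __init__(self):
        self._listeners = []  # e.g. camera, gallery, contacts modules

    def register(self, listener):
        """A module registers a callable to open its channel."""
        self._listeners.append(listener)

    def deliver(self, event):
        # Single channel in: one event arrives here ...
        results = []
        for listener in self._listeners:  # ... multiple channels out.
            results.append(listener(event))
        return results

manager = TouchEventManager()
manager.register(lambda e: "camera handled " + e)
manager.register(lambda e: "gallery handled " + e)
out = manager.deliver("C-area slide")
```

Because every registered module sees the same event, a single C-area slide can simultaneously drive different responses in different applications, which is the stated advantage of the multi-channel stage.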
  • The foregoing steps may also be implemented in other manners, which is not limited by the embodiments of the present invention.
  • The touch operation flow of the present invention is further described below in another manner, where the virtual border area is abbreviated as the C area, the other area as the A area, and the touch event reporting process is as follows:
  • The driver layer receives the touch event through physical hardware such as the touch screen, determines whether the touch operation occurred in the A area or the C area, and reports the event through the device file node of the A area or the C area.
  • The Native layer reads events from the device files of the A and C areas and processes them, for example performing coordinate calculations. The A-area and C-area devices are distinguished by device ID, and A-area and C-area events are finally distributed separately.
  • A-area events follow the original flow and are processed in the usual way; C-area events are distributed through a dedicated C-area channel registered in advance in the Native layer, entering at the Native port and exiting at the system port to the C-area system service, which then reports C-area events to each application through an external interface.
  • This embodiment further provides a mobile terminal, including:
  • The C-area fixed dividing unit 1010, configured to define the width of the C area and divide the entire touchable area of the touch screen into C areas located at the two side edges and an A area located in the middle; the position and size of the C areas are fixed and cannot be adjusted.
  • The bottom-layer reporting unit 1020, configured to allocate and register two virtual input devices when the touch screen driver is initialized, corresponding to the C area and the A area respectively; when a contact is pressed in the C area, the C-area virtual input device reports the initial coordinate position and initial time information of the contact, and reports the current coordinate position in real time at the preset period while the contact moves.
  • The sliding recognition unit 1030, configured to record the initial coordinate position and current coordinate position information of the contact reported by the bottom-layer reporting unit 1020, to determine in real time whether the contact has slid and its sliding direction.
  • The sliding recognition unit 1030 further includes:
  • The recording module 1031, configured to record the initial coordinate position and initial time information of the contact reported by the bottom-layer reporting unit 1020, as the basis for the subsequent sliding judgment;
  • The sliding event judging module 1032, configured to calculate the moving distance of the contact from its initial coordinate position and current coordinate position, and to determine whether the contact has slid by comparing the moving distance with a preset threshold;
  • The sliding direction determining module 1033, configured to determine the sliding direction of the contact by comparing the initial Y-axis coordinate value of the contact with the current Y-axis coordinate value.
  • By specially processing touch operations in the virtual border area, the problem that narrow-border and borderless terminals are prone to accidental touches can be effectively solved; moreover, whether a contact in the virtual border area has slid, and its sliding direction, can be recognized quickly and accurately, laying the foundation for subsequently executing the corresponding special processing operations.
  • The C area above adopts a fixed division manner, that is, its size is not variable, which affects the user experience. In the third embodiment, therefore, the division manner of the C area is improved so that the number, position, and size of C areas can be set freely, improving the user experience.
  • The present embodiment replaces the C-area fixed dividing unit 1010 in the mobile terminal with a C-area setting unit, which provides an interface for creating and modifying the number, location, and size of C areas.
  • The interface is implemented in the driver layer, and the upper layer can call it at any time to adjust the number, location, and size of the C areas according to its own needs.
  • The basic shape of a C area is a rectangle, and the position and size of the C area can be determined by inputting the coordinates of two diagonal vertices of the rectangle.
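Because a freely configured C area is a rectangle given by two diagonal vertices, the containment test reduces to coordinate comparisons. A small Python sketch follows, assuming nothing about the order in which the two vertices are supplied; all coordinates used below are purely illustrative.

```python
def make_c_area(x1, y1, x2, y2):
    """Normalize two diagonal vertices into (left, top, right, bottom),
    regardless of which corner was entered first."""
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))

def in_c_area(area, x, y):
    """True if contact (x, y) falls inside the rectangular C area."""
    left, top, right, bottom = area
    return left <= x <= right and top <= y <= bottom

# Two C areas at the lower-left and lower-right of a 1080x1920 screen
# (sizes and positions are illustrative only):
areas = [make_c_area(0, 1400, 60, 1920), make_c_area(1080, 1920, 1020, 1400)]
assert any(in_c_area(a, 30, 1500) for a in areas)      # lower-left border hit
assert not any(in_c_area(a, 540, 960) for a in areas)  # screen center: A area
```

Normalizing the vertices first is what lets the setting interface accept any two opposite corners, which matches the text's claim that only the two diagonal vertex coordinates need to be input.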
  • In the figure, two C areas are created at the lower-left and lower-right of the touch screen. In a specific implementation, all C areas may correspond to one virtual input device.
  • The sliding recognition method for the C area is the same as in the second embodiment and is not described again here.
  • The C area can be used to implement various functions to enhance the user experience. Examples are described below.
  • Multi-task switching function: by sliding up in the C area, the display interface of the A area switches from the current application interface to the previous application interface; by sliding down in the C area, it switches from the current application interface to the next application interface, making it convenient for the user to switch applications.
  • Multi-task thumbnail switching function: while sliding up and down in the C area, thumbnails of the applications currently running in the background are displayed in sequence in the C area, and the corresponding application is started when the finger lifts.
  • Brightness adjustment function: increase the screen brightness by sliding up in the C area on one side, and lower the screen brightness by sliding down in the C area on the other side; the user does not need to adjust brightness with physical buttons or by entering the settings interface.
  • Volume adjustment function: increase the volume by sliding up in the C area on one side, and lower the volume by sliding down in the C area on the other side.
  • Show-hidden-content function: show hidden content (such as applications, images, files, and text messages) when sliding up, and hide the related content when sliding down.
  • Mobile phone acceleration function: activated when back-and-forth up-and-down sliding reaches a preset count threshold, it cleans up background applications to release memory and notifies the user of the result when processing ends; this is suitable for game players and the like.
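The feature list above amounts to a mapping from (C-area side, slide direction) to an action. A minimal Python sketch follows; the side assignments and action names are illustrative assumptions, since the text describes the brightness and volume gestures only loosely.

```python
# Illustrative mapping of recognized C-area slides to actions,
# loosely following the brightness/volume examples in the text.
GESTURE_ACTIONS = {
    ("left",  "up"):   "brightness_up",
    ("left",  "down"): "brightness_down",
    ("right", "up"):   "volume_up",
    ("right", "down"): "volume_down",
}

def handle_c_area_slide(side, direction):
    """Look up the action for a recognized slide; unknown gestures
    fall through to the 'ignore' special processing."""
    return GESTURE_ACTIONS.get((side, direction), "ignore")

assert handle_c_area_slide("left", "up") == "brightness_up"
assert handle_c_area_slide("right", "down") == "volume_down"
```

Keeping the mapping in a table rather than in branching code is one plausible way to let the freely configurable C areas of the third embodiment be rebound to different functions without touching the recognition logic.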
  • The methods of the foregoing embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation.
  • Based on such understanding, the technical solution of the present invention, in essence or in the part that contributes over the prior art, may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and including a number of instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the various embodiments of the present invention.
  • The solution proposed by the invention can accurately recognize the sliding action and sliding direction of a contact; by setting a threshold, only a slide whose moving distance exceeds the threshold is recognized as an effective sliding action, which improves recognition accuracy and avoids recognition errors caused by misoperation, and the solution is applicable to various mobile terminals that set the virtual border area in either the fixed division mode or the free setting mode.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)

Abstract

The invention relates to a mobile terminal and a sliding recognition method for a virtual border area of the mobile terminal. The method comprises the steps of: detecting a touch event generated by a contact in the virtual border area (401); reporting the current coordinate position of the contact in real time and, simultaneously with the contact, evaluating the touch event according to the initial coordinate position and the current coordinate position of the contact (402); and determining whether the touch event is a sliding event and, if so, further determining the direction attribute of the sliding event (403). By means of the method, the sliding action and sliding direction of a contact can be recognized accurately, and a slide whose moving distance exceeds a threshold is recognized as an effective sliding action by setting the threshold, so that recognition accuracy can be improved and misoperations avoided; the method is applicable to a variety of mobile terminals on which the virtual border area is set in a fixed division manner or an automatic setting manner.
PCT/CN2016/075349 2015-03-27 2016-03-02 Terminal mobile et procédé de reconnaissance de défilement pour une zone de trame virtuelle d'un terminal mobile WO2016155454A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510142649.1A CN104793867A (zh) 2015-03-27 2015-03-27 移动终端及其虚拟边框区域的滑动识别方法
CN201510142649.1 2015-03-27

Publications (1)

Publication Number Publication Date
WO2016155454A1 true WO2016155454A1 (fr) 2016-10-06

Family

ID=53558698

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/075349 WO2016155454A1 (fr) 2015-03-27 2016-03-02 Terminal mobile et procédé de reconnaissance de défilement pour une zone de trame virtuelle d'un terminal mobile

Country Status (2)

Country Link
CN (1) CN104793867A (fr)
WO (1) WO2016155454A1 (fr)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104793867A (zh) * 2015-03-27 2015-07-22 努比亚技术有限公司 移动终端及其虚拟边框区域的滑动识别方法
CN105183343A (zh) * 2015-08-25 2015-12-23 努比亚技术有限公司 一种处理报点信息的装置和方法
CN105068736B (zh) * 2015-09-11 2019-03-05 努比亚技术有限公司 一种打开应用的移动终端和方法
CN105681594B (zh) * 2016-03-29 2019-03-01 努比亚技术有限公司 一种终端的边缘交互系统和方法
CN106066766A (zh) * 2016-05-26 2016-11-02 努比亚技术有限公司 一种移动终端及其控制方法
CN106020706A (zh) * 2016-05-30 2016-10-12 努比亚技术有限公司 一种触控操作处理方法及移动终端
CN106527933B (zh) * 2016-10-31 2020-09-01 努比亚技术有限公司 移动终端边缘手势的控制方法及装置
CN107037971A (zh) * 2017-03-27 2017-08-11 努比亚技术有限公司 应用管理装置、移动终端及方法
CN107045420B (zh) * 2017-04-27 2021-01-05 南通易通网络科技有限公司 应用程序的切换方法及移动终端、存储介质
CN108744494A (zh) * 2018-05-17 2018-11-06 Oppo广东移动通信有限公司 游戏应用控制方法、装置、存储介质及电子设备
CN109635542B (zh) * 2018-11-30 2023-02-03 华为技术有限公司 一种生物识别交互方法、图形交互界面及相关装置
CN112394859A (zh) * 2020-11-12 2021-02-23 青岛海信商用显示股份有限公司 表格动态调整方法和终端设备

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130067397A1 (en) * 2011-09-12 2013-03-14 Microsoft Corporation Control area for a touch screen
CN103914243A (zh) * 2013-01-08 2014-07-09 联想(北京)有限公司 一种信息处理的方法及电子设备
CN104156073A (zh) * 2014-08-29 2014-11-19 深圳市中兴移动通信有限公司 移动终端及其操作方法
CN104793867A (zh) * 2015-03-27 2015-07-22 努比亚技术有限公司 移动终端及其虚拟边框区域的滑动识别方法

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI608407B (zh) * 2013-11-27 2017-12-11 緯創資通股份有限公司 觸控裝置及其控制方法
CN104580741A (zh) * 2015-01-27 2015-04-29 深圳市中兴移动通信有限公司 基于无边框手机的界面显示方法及装置
CN104714691B (zh) * 2015-01-30 2017-02-15 努比亚技术有限公司 移动终端防误触控方法及装置


Also Published As

Publication number Publication date
CN104793867A (zh) 2015-07-22

Similar Documents

Publication Publication Date Title
WO2016169483A1 (fr) Terminal mobile et procédé d'ajustement de fonction à l'aide de la région de trame virtuelle associée
WO2016155454A1 (fr) Terminal mobile et procédé de reconnaissance de défilement pour une zone de trame virtuelle d'un terminal mobile
WO2016169524A1 (fr) Procédé et dispositif de réglage rapide de luminosité d'écran, terminal mobile et support d'informations
WO2016155550A1 (fr) Procédé de commutation d'application pour terminal sans cadre et terminal sans cadre
WO2017143847A1 (fr) Dispositif et procédé d'affichage à écrans multiples d'application associée, et terminal
WO2017071424A1 (fr) Terminal mobile et procédé de partage de fichier
WO2016155597A1 (fr) Procédé et dispositif de commande d'application basée sur un terminal sans cadre
WO2016173414A1 (fr) Terminal mobile et procédé de démarrage rapide et dispositif pour programme d'application associé
WO2016169480A1 (fr) Procédé et dispositif de commande de terminal de mobile, et support de stockage informatique
WO2016155423A1 (fr) Procédé et terminal pour ajuster des paramètres de réglage, et support de stockage informatique
WO2016155424A1 (fr) Procédé de commutation d'application pour terminal mobile, terminal mobile et support de stockage informatique
US11928312B2 (en) Method for displaying different application shortcuts on different screens
WO2016034055A1 (fr) Terminal mobile et son procédé de fonctionnement et support de stockage informatique
WO2016029766A1 (fr) Terminal mobile et son procede de fonctionnement et support de stockage informatique
WO2016119635A1 (fr) Procédé et dispositif de prévention de toucher par erreur pour un terminal mobile
WO2016119648A1 (fr) Procédé et appareil pour empêcher un effleurement accidentel d'un terminal mobile
WO2016173468A1 (fr) Procédé et dispositif d'opération combinée, procédé de fonctionnement d'écran tactile et dispositif électronique
WO2016155434A1 (fr) Procédé et dispositif pour reconnaître la tenue d'un terminal mobile, support de stockage et terminal
WO2017020771A1 (fr) Dispositif et procédé de commande de terminal
WO2016155509A1 (fr) Procédé et dispositif permettant de déterminer un mode de tenue d'un terminal mobile
WO2016173498A1 (fr) Procédé et dispositif permettant d'entrer un nombre ou un symbole au moyen d'un cadre d'écran tactile, terminal et support
WO2016161986A1 (fr) Procédé et appareil de reconnaissance d'opération, terminal mobile et support de stockage informatique
WO2017071599A1 (fr) Procédé d'ajustement en temps réel de taille d'écran divisé, dispositif d'écran divisé et support de stockage informatique
WO2017071456A1 (fr) Procédé de traitement de terminal, terminal, et support de stockage
WO2017012385A1 (fr) Procédé et appareil de démarrage rapide d'application, et terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16771209

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16771209

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 160418)
