CN108008859B - Screen control method and mobile terminal - Google Patents


Info

Publication number
CN108008859B
CN108008859B (application number CN201711341157.0A)
Authority
CN
China
Prior art keywords
touch
area
determining
screen
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711341157.0A
Other languages
Chinese (zh)
Other versions
CN108008859A (en)
Inventor
刘林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201711341157.0A
Publication of CN108008859A
Application granted
Publication of CN108008859B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F 3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/02 Constructional features of telephone sets
    • H04M 1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M 1/026 Details of the structure or mounting of specific components
    • H04M 1/0266 Details of the structure or mounting of specific components for a display module assembly

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a screen control method and a mobile terminal. The method comprises: acquiring report point data on a screen; determining a touch area on the screen according to the report point data; and, when the touch area is a preset edge area, adjusting the display area of the current user interface according to the touch area and/or performing invalidation processing on the report point data. The method and the device can effectively avoid misoperation caused by mistaken touches on the screen edge, improve the accuracy of false-touch recognition, and improve the user experience.

Description

Screen control method and mobile terminal
Technical Field
The invention relates to the technical field of electronics, in particular to a screen control method and a mobile terminal.
Background
With the continuous development of electronic technology, smart mobile devices have become essential communication and entertainment tools in daily life, and users' requirements for smart mobile terminals keep rising. To meet these requirements, the screens of mobile terminals are becoming larger and larger. However, as the screen grows, the user's hand often unintentionally contacts the screen while holding the mobile terminal, so that positions at the edge of the screen are touched by mistake, causing misoperation and degrading the user experience.
Disclosure of Invention
Embodiments of the present invention provide a screen control method and a mobile terminal, aiming to solve the prior-art problem that misoperation is often caused by mistaken touches on the edge of the screen when a user uses a mobile terminal.
In order to solve the technical problem, the invention is realized as follows: a screen control method, comprising:
acquiring point reporting data on a screen;
determining a touch area on the screen according to the report point data;
and when the touch area is a preset edge area, adjusting the display area of the current user interface according to the touch area, and/or performing invalidation processing on the report point data.
In a first aspect, an embodiment of the present invention further provides a mobile terminal, including:
the acquisition module is used for acquiring point reporting data on a screen;
the determining module is used for determining a touch area on the screen according to the report point data;
and the adjusting module is used for adjusting the display area of the current user interface according to the touch area and/or carrying out invalidation processing on the report point data when the touch area is a preset edge area.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the screen control method according to any one of the above.
In a third aspect, the embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the screen control method according to any one of the above.
In the embodiment of the invention, misoperation caused by mistaken touch on the edge of the screen can be effectively avoided, the accuracy of identifying the mistaken touch is improved, and the user experience is improved.
Drawings
FIG. 1 is a flowchart of a screen control method according to an embodiment of the present invention;
FIG. 2a is a schematic diagram of a user interface reduction of a screen according to an embodiment of the present invention;
FIG. 2b is a second schematic diagram illustrating a user interface reduction of a screen according to an embodiment of the present invention;
FIG. 2c is a third schematic diagram illustrating a user interface reduction of a screen according to an embodiment of the present invention;
FIG. 3 is another flowchart of a screen control method according to an embodiment of the present invention;
FIG. 4 is another flowchart of a screen control method according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
FIG. 6a is another schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
FIG. 6b is another schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a hardware structure of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In some embodiments of the present invention, there is provided a screen control method, as shown in fig. 1, including:
step 101, point reporting data on a screen is acquired.
Here, by acquiring report point data on the screen, it becomes possible to identify, based on the report point data, whether a touch operation is a false touch.
The touch panel (TP) on the screen is the device that receives touch input. Touch screens are generally classified as resistive or capacitive, and mobile terminals today almost universally use capacitive touch screens. When a user touches a capacitive touch screen, the electric field of the human body causes the finger and the working surface to form a coupling capacitor. Because the working surface carries a high-frequency signal, a very small current is drawn away through the finger; this current flows out through the electrodes at the four corners of the touch screen, and in theory the current through each electrode is proportional to the distance from the finger to that corner. The position of the touch point can therefore be obtained by precisely calculating the ratio of the four currents.
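As an illustration only (the patent does not give the controller's algorithm), the following sketch recovers a touch position from four corner currents under the simplified proportional model just described; the screen dimensions, the least-squares formulation, and all names such as `locate_touch` are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

WIDTH, HEIGHT = 1080.0, 2160.0  # screen size in pixels (assumed values)
CORNERS = np.array([[0, 0], [WIDTH, 0], [0, HEIGHT], [WIDTH, HEIGHT]])

def locate_touch(currents):
    """Estimate (x, y) of the touch point from the four corner currents,
    assuming I_k = c * d_k with d_k the distance to corner k and c unknown."""
    def residual(params):
        x, y, c = params
        dists = np.linalg.norm(CORNERS - np.array([x, y]), axis=1)
        return c * dists - currents  # model mismatch for each corner
    sol = least_squares(residual, x0=[WIDTH / 2, HEIGHT / 2, 1.0])
    x, y, _ = sol.x
    return x, y

# Example: currents synthesised for a touch near the lower-left corner
true_xy = np.array([100.0, 2000.0])
fake_currents = 0.8 * np.linalg.norm(CORNERS - true_xy, axis=1)
print(locate_touch(fake_currents))  # approximately (100.0, 2000.0)
```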
The touch screen integrated circuit (TP IC) and the application processor (AP) of the mobile terminal are typically connected via an Inter-Integrated Circuit (I2C) bus or a Serial Peripheral Interface (SPI) bus. When the user's finger touches the touch screen, the TP IC calculates the position of the touch point, and the AP reads the position of the touch point over the I2C or SPI bus, completing the touch-screen point-reporting process.
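A minimal sketch of this report-point path is shown below: the AP polls the TP IC over I2C and unpacks touch coordinates into report points. The I2C address, the register layout, and the 6-byte record format are hypothetical (a real TP IC defines its own protocol, and production drivers are interrupt-driven kernel code); only the `smbus2` calls are a real userspace API.

```python
import struct
import time

from smbus2 import SMBus  # real userspace I2C library, used here for illustration

TP_IC_ADDR = 0x38      # hypothetical 7-bit I2C address of the touch controller
REG_TOUCH_DATA = 0x00  # hypothetical register holding the latest report point

def poll_report_points(bus_id=1, period_s=0.01):
    """Yield report points as dicts {'x', 'y', 't'} read from the TP IC."""
    with SMBus(bus_id) as bus:
        while True:
            raw = bus.read_i2c_block_data(TP_IC_ADDR, REG_TOUCH_DATA, 6)
            if raw[0]:  # hypothetical layout: byte 0 = number of active touches
                # bytes 1-2 = x, bytes 3-4 = y, big-endian (assumed format)
                x, y = struct.unpack(">HH", bytes(raw[1:5]))
                yield {"x": x, "y": y, "t": time.monotonic()}
            time.sleep(period_s)
```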
Step 102, a touch area on the screen is determined according to the report point data.
Here, the position of the touch point can be obtained from the report point data, so the touch area can be accurately determined on the screen and then used to identify whether the touch operation is a false touch.
Step 103, when the touch area is a preset edge area, the display area of the current user interface is adjusted according to the touch area, and/or invalidation processing is performed on the report point data.
By adjusting the display area of the current user interface (UI) according to the touch area and/or invalidating the report point data (that is, suppressing report points in the touch area so that the report point data is not responded to), the response area of the touch screen can be reduced and misoperation caused by mistaken touches on the screen edge can be avoided.
Here, the false-touch area at the edge of the screen is determined in advance according to users' operating habits. When the touch area falls inside this preset edge area, the touch operation can be determined to be a false touch, and the display area of the current user interface is adjusted according to the touch area and/or the report point data is invalidated. This effectively avoids misoperation caused by mistaken touches on the edge area of the screen, recognizes false touches with high accuracy, and improves the user experience.
The preset false-touch edge area can be determined by machine learning from previously collected statistics (big data) on user operations, so that the user's operation area is normalized. If the user mistakenly touches this area again, the display area of the current user interface is adjusted and/or the report point data is invalidated, that is, the report point data is not responded to, effectively avoiding misoperation caused by the false touch.
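The patent does not spell out the learning procedure, so the following is only an assumed sketch of one way to derive a preset false-touch edge area from logged touch data: the screen is divided into a coarse grid, touches later classified as false touches vote for their cell, and cells whose false-touch ratio exceeds a threshold form the preset edge area checked in step 103.

```python
from collections import defaultdict

CELL = 30  # cell size in pixels (assumed)

def cell_of(x, y):
    """Map a screen coordinate to its grid cell."""
    return (int(x) // CELL, int(y) // CELL)

def learn_edge_area(samples, min_ratio=0.8, min_count=20):
    """samples: iterable of (x, y, was_false_touch) records from logged usage data.
    Returns the set of grid cells treated as the preset false-touch edge area."""
    false_counts = defaultdict(int)
    total_counts = defaultdict(int)
    for x, y, was_false in samples:
        c = cell_of(x, y)
        total_counts[c] += 1
        false_counts[c] += int(was_false)
    return {
        c for c, n in total_counts.items()
        if n >= min_count and false_counts[c] / n >= min_ratio
    }

def in_preset_edge_area(x, y, edge_cells):
    """True if the touch coordinate falls inside a learned false-touch cell."""
    return cell_of(x, y) in edge_cells
```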
Further, it may first be judged whether the touch area belongs to the edge area of the screen; only when it does is it further judged whether the touch area belongs to the preset false-touch edge area.
The criterion for belonging to the edge area of the screen may be, for example, that the distance between the touch area and the screen boundary is less than or equal to 30 px. The criterion is not limited to this and may be chosen according to the actual screen size or other parameters.
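A small sketch of this two-stage check is given below, using the example 30 px threshold; `in_preset_edge_area` stands for any predicate built from the learned edge cells (for instance the helper in the previous sketch), and the function names are illustrative.

```python
EDGE_MARGIN = 30  # px, the example threshold mentioned above

def near_screen_border(x, y, width, height, margin=EDGE_MARGIN):
    """Stage 1: is the touch close to any screen border at all?"""
    return (x <= margin or y <= margin or
            x >= width - margin or y >= height - margin)

def is_false_touch_region(x, y, width, height, in_preset_edge_area):
    """Stage 2: the touch is a false-touch candidate only if it is near the border
    and also falls inside the learned preset false-touch edge area."""
    return near_screen_border(x, y, width, height) and in_preset_edge_area(x, y)
```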
The screen control method provided by the embodiment of the invention can effectively avoid misoperation caused by mistaken touch on the edge of the screen, improves the accuracy of mistaken touch recognition, and improves user experience.
Optionally, in step 103, the step of adjusting the display area of the current user interface includes:
and step 1031, narrowing the display area of the current user interface to a preset size.
Here, the purpose of reducing the response area of the touch screen is achieved by reducing the display area of the current UI to a preset size, so that misoperation caused by mistaken touch on the edge of the screen is avoided.
As shown in FIG. 2a, the entire peripheral edge of the display area of the current UI 10 can be retracted, reducing the display area of the UI 10 to a preset size. By fine-tuning the display area of the UI 10 in this way, the response area of the touch screen is reduced and misoperation caused by mistaken edge touches is avoided, while the UI 10 is not seriously deformed and the user's viewing experience is preserved.
Or, step 1032, reducing the display area of the current user interface so that it does not include the touch area.
Here, the display area of the current UI is reduced so that the reduced display area avoids the touch area, which prevents a mistaken edge touch from causing misoperation.
As shown in FIG. 2b, assuming the touch area is at the lower left corner of the screen, the display area of the current UI 10 is reduced so that it no longer includes the lower-left touch area. The reduced display area of the UI 10 thus avoids the touch area; by fine-tuning the display area of the UI 10, the response area of the touch screen is reduced, misoperation caused by mistaken edge touches is avoided, the UI 10 is not seriously deformed, and the user's viewing experience is preserved.
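The two adjustment options of steps 1031 and 1032 can be sketched with a plain rectangle type as follows; the `Rect` type, the inset amounts, and the assumption that the touch area hugs one border are illustrative, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

def shrink_to_preset_size(ui: Rect, inset: float = 30.0) -> Rect:
    """Step 1031: retract all four edges of the UI display area by a preset inset."""
    return Rect(ui.left + inset, ui.top + inset, ui.right - inset, ui.bottom - inset)

def shrink_to_exclude(ui: Rect, touch: Rect) -> Rect:
    """Step 1032: retract only the edges needed so that the UI display area no
    longer overlaps the touch area (assumed to hug one border or corner)."""
    new = Rect(ui.left, ui.top, ui.right, ui.bottom)
    if touch.left <= ui.left:      # touch hugs the left border: move the left edge in
        new.left = max(new.left, touch.right)
    if touch.right >= ui.right:    # touch hugs the right border
        new.right = min(new.right, touch.left)
    if touch.top <= ui.top:        # touch hugs the top border (y grows downward)
        new.top = max(new.top, touch.bottom)
    if touch.bottom >= ui.bottom:  # touch hugs the bottom border
        new.bottom = min(new.bottom, touch.top)
    return new
```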
Optionally, the step 102 includes:
Step 1021, determining the touch time of the touch operation according to the report point data.
Here, a false touch usually occurs because the user's hand contacts the edge area of the screen while holding the mobile terminal; in that case the contact on the touch screen is a long press rather than a click, and the report point data consists of continuous report points.
In this step, the touch time of the touch operation is determined according to the report point data so as to identify whether the touch operation is a long press.
Step 1022, when the touch time exceeds a preset time, determining a touch area on the screen according to the report point data.
When the touch time exceeds the preset time, the touch operation can be determined to be a long press and the report point data to be continuous report points; only then is the subsequent flow started and the touch area determined on the screen according to the report point data. Otherwise the touch operation is determined not to be a false touch, the subsequent flow is not started, and power consumption is saved.
The preset time can be obtained through experiments according to historical data, and any reasonable time value can be taken as the preset time.
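A minimal sketch of the long-press test in steps 1021 and 1022 follows; the 0.5 s value stands in for the preset time, which, as noted above, should be chosen from experiments on historical data.

```python
PRESET_TIME_S = 0.5  # example preset time; to be tuned from historical data

def is_long_press(points, preset_time_s=PRESET_TIME_S):
    """points: consecutive report points of one contact, dicts with 'x', 'y', 't'
    (timestamp in seconds). A contact held longer than the preset time is a long press."""
    return len(points) >= 2 and points[-1]["t"] - points[0]["t"] >= preset_time_s
```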
The following illustrates a specific implementation procedure of the embodiment of the present invention.
As shown in fig. 3, the screen control method according to the embodiment of the present invention includes:
Step 301, report point data on the screen is acquired.
Step 302, the touch time of the touch operation is determined according to the report point data.
Step 303, when the touch time exceeds the preset time, a touch area is determined on the screen according to the report point data.
Step 304, it is judged whether the touch area is the preset edge area; if so, the flow proceeds to step 305, otherwise it proceeds to step 306.
Step 305, the display area of the current user interface is adjusted according to the touch area, or invalidation processing is performed on the report point data.
Step 306, end.
According to the screen control method of this embodiment of the invention, the response area of the touch screen is reduced by fine-tuning the UI or by suppressing report points. The recognition accuracy is high, misoperation caused by mistaken edge touches is avoided, the UI is not seriously deformed, and the user's viewing experience is preserved.
Optionally, the step 102 includes:
and 1023, determining the touch time of the touch operation according to the report point data, and determining whether the touch operation has a moving trend according to the report point data.
Here, the false touch is usually caused by touching an edge area of a screen when the user holds the mobile terminal, and the touch screen is a long-press operation instead of a click operation when the user holds the mobile terminal, and at this time, the point data is a continuous point. In addition, the mistaken touch when the user holds the mobile terminal is generally that the user needs to move the thumb on the screen when holding the mobile terminal with one hand, so that the connecting part of the thumb and the palm is mistakenly touched on the edge of the screen, and at the moment, the connecting part of the thumb and the palm has a moving trend along with the thumb on the screen.
In this step, the touch time of the touch operation is determined according to the point reporting data so as to identify whether the touch operation is a long-time press operation, and whether the touch operation has a moving trend is determined according to the point reporting data so as to further avoid identifying the normal operation error of the user as the error operation.
Step 1024, when the touch time exceeds a preset time and the touch operation has a moving trend, determining a touch area on the screen according to the report point data.
If the touch time exceeds the preset time, that is, the touch operation is a long press, and the touch operation also has a moving trend, the subsequent flow is started and the touch area is determined on the screen according to the report point data. Otherwise the touch is determined not to be a false touch and the subsequent flow is not started, which saves power and avoids misjudging a normal touch as a false touch.
As mentioned above, the preset time can be obtained through experiments according to historical data, and the preset time can be any reasonable time value.
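The moving-trend test of step 1023 can be sketched as follows; the 15 px drift threshold is an assumed value, not one specified in the description.

```python
def has_moving_trend(points, min_drift_px=15):
    """points: consecutive report points of one contact, dicts with 'x', 'y', 't'.
    The contact is considered to have a moving trend if its report points drift by
    more than a small threshold between the first and the latest point."""
    if len(points) < 2:
        return False
    dx = points[-1]["x"] - points[0]["x"]
    dy = points[-1]["y"] - points[0]["y"]
    return (dx * dx + dy * dy) ** 0.5 >= min_drift_px
```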
Optionally, in step 103, the step of adjusting the display area of the current user interface further includes:
and 1033, determining the moving direction of the touch operation according to the report point data.
Here, when the user holds the mobile terminal with one hand, the thumb is required to move on the screen, and the thumb is inconvenient to operate when clicking the edge area of the screen.
In this step, the moving direction of the touch operation is determined according to the report data, so that the UI is adjusted based on the moving direction, and the operation of the user is facilitated.
The touch operation method comprises the steps of continuously acquiring point reporting data for n times, determining a corresponding touch area on a screen according to the point reporting data acquired each time, and then determining the moving direction of the touch operation according to the touch area corresponding to the point reporting data acquired each time. Wherein n is an integer greater than 1.
Step 1034, retracting the area corresponding to the moving direction in the current user interface.
Specifically, by retracting the area of the current UI that corresponds to the moving direction, operation with the user's thumb is made easier, further improving the user experience.
The area corresponding to each moving direction may be preset; for example, when the touch moves toward the upper left/right corner of the screen, the corresponding area is the upper left/right corner of the current UI, and when the touch moves toward the lower left/right corner of the screen, the corresponding area is the lower left/right corner of the current UI.
As shown in FIG. 2c, when the user holds the mobile terminal with the left hand and needs the left thumb to tap the upper right corner of the screen, the moving direction of the touch operation can be determined from the report point data to be toward the upper right corner of the screen. The area of the current UI 10 corresponding to that direction, namely the upper-right corner, can then be retracted, making the thumb operation easier and further improving the user experience.
While the area corresponding to the moving direction in the current UI is retracted, the display area of the UI may also be narrowed according to step 1031 or step 1032 above, so as to avoid misoperation caused by the false touch.
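Steps 1033 and 1034 can be sketched as follows: the movement vector of the report points is mapped to the screen corner it points toward, and that corner of the UI is retracted by insetting the two edges that meet there. The `Rect` type is the one from the earlier sketch, and the corner names and the inset amount are illustrative.

```python
def movement_direction(points):
    """Return the corner ('upper_left', 'upper_right', 'lower_left', 'lower_right')
    the contact is drifting toward, in screen coordinates with y growing downward."""
    dx = points[-1]["x"] - points[0]["x"]
    dy = points[-1]["y"] - points[0]["y"]
    vertical = "upper" if dy < 0 else "lower"
    horizontal = "left" if dx < 0 else "right"
    return f"{vertical}_{horizontal}"

def retract_corner(ui, corner, inset=80.0):
    """Pull the named corner of the UI Rect inward by `inset` pixels."""
    if "left" in corner:
        ui.left += inset
    else:
        ui.right -= inset
    if "upper" in corner:
        ui.top += inset
    else:
        ui.bottom -= inset
    return ui
```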
Another specific implementation flow of the embodiment of the present invention is illustrated as follows.
As shown in fig. 4, the screen control method according to the embodiment of the present invention includes:
Step 401, report point data on the screen is acquired.
Step 402, the touch time of the touch operation is determined according to the report point data.
Step 403, when the touch time exceeds the preset time, whether the touch operation has a moving trend is determined according to the report point data.
Step 404, when the touch operation has a moving trend, a touch area is determined on the screen according to the report point data.
Step 405, it is judged whether the touch area is the preset edge area; if so, the flow proceeds to step 406, otherwise it proceeds to step 409.
Step 406, the moving direction of the touch operation is determined according to the report point data.
Step 407, the area corresponding to the moving direction in the current user interface is retracted.
Step 408, the display area of the current user interface is adjusted according to the touch area, or invalidation processing is performed on the report point data.
Step 409, end.
According to this screen control method, the response area of the touch screen is reduced by fine-tuning the UI or by suppressing report points. The recognition accuracy is high, misoperation caused by mistaken edge touches is avoided, the UI is not seriously deformed, the user's viewing experience is preserved, operation is made easier, and the user experience is improved.
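Pulling the pieces together, the FIG. 4 flow can be sketched as a single handler that reuses the helpers from the earlier sketches (`is_long_press`, `has_moving_trend`, `is_false_touch_region`, `movement_direction`, `retract_corner`, `shrink_to_preset_size`). This is illustrative only; a real implementation would live in the touch driver or window manager, not in application code.

```python
def handle_contact(points, ui, screen_w, screen_h, in_preset_edge_area):
    """points: report points of one contact; ui: current UI Rect.
    Returns (new_ui, suppress_report_points)."""
    # steps 402-404: only long presses with a moving trend are false-touch candidates
    if not (is_long_press(points) and has_moving_trend(points)):
        return ui, False
    x, y = points[-1]["x"], points[-1]["y"]
    # step 405: is the contact inside the preset false-touch edge area?
    if not is_false_touch_region(x, y, screen_w, screen_h, in_preset_edge_area):
        return ui, False
    # steps 406-407: retract the corner the contact is drifting toward
    ui = retract_corner(ui, movement_direction(points))
    # step 408: additionally shrink the display area (or only suppress report points)
    ui = shrink_to_preset_size(ui)
    return ui, True
```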
In some embodiments of the present invention, as illustrated with reference to fig. 5, a mobile terminal 500 is also provided. The mobile terminal 500 includes:
an obtaining module 501, configured to obtain point reporting data on a screen;
a determining module 502, configured to determine a touch area on the screen according to the report point data;
an adjusting module 503, configured to adjust a display area of a current user interface according to the touch area when the touch area is a preset edge area, and/or perform invalidation processing on the report point data.
The mobile terminal 500 of the embodiment of the invention can effectively avoid misoperation caused by mistaken touch on the edge of the screen, improve the accuracy of mistaken touch recognition and improve user experience.
Optionally, as shown in fig. 6a, the adjusting module 503 includes:
a first narrowing sub-module 5031 configured to narrow the display area of the current user interface to a preset size; or
A second narrowing sub-module 5032 for narrowing the display area of the current user interface to exclude the touch area.
Optionally, the determining module 502 includes:
the first determining submodule 5021 is used for determining the touch time of the touch operation according to the report point data;
the second determining submodule 5022 is used for determining a touch area on the screen according to the report point data when the touch time exceeds the preset time.
Optionally, as shown in fig. 6b, the determining module 502 includes:
the third determining submodule 5023 is used for determining the touch time of the touch operation according to the report point data and determining whether the touch operation has a moving trend or not according to the report point data;
the fourth determining submodule 5024 is used for determining a touch area on the screen according to the report point data when the touch time exceeds the preset time and the touch operation has a moving trend.
Optionally, the adjusting module 503 includes:
a fifth determining sub-module 5033, configured to determine, according to the report data, a moving direction of the touch operation;
a third narrowing sub-module 5034, configured to narrow the area corresponding to the moving direction in the current user interface.
The mobile terminal provided in the embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiments of fig. 1 to fig. 4, and is not described herein again in order to avoid repetition. The mobile terminal 500 of the embodiment of the invention can effectively avoid misoperation caused by mistaken touch on the edge of the screen, improve the accuracy of mistaken touch recognition and improve user experience.
Fig. 7 is a schematic diagram of a hardware structure of a mobile terminal implementing various embodiments of the present invention. The mobile terminal 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, a power supply 711, and the like. The display unit 706 includes a screen. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 7 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 710 is configured to: obtain report point data on the screen; determine a touch area on the screen according to the report point data; and, when the touch area is a preset edge area, adjust the display area of the current user interface according to the touch area and/or perform invalidation processing on the report point data.
The mobile terminal 700 can effectively avoid misoperation caused by mistaken touches on the screen edge, improve the accuracy of false-touch recognition, and improve the user experience.
Optionally, the processor 710 is further configured to: reduce the display area of the current user interface to a preset size; or reduce the display area of the current user interface so that it does not include the touch area.
Optionally, the processor 710 is further configured to: determine the touch time of the touch operation according to the report point data; and, when the touch time exceeds the preset time, determine a touch area on the screen according to the report point data.
Optionally, the processor 710 is further configured to: determine the touch time of the touch operation according to the report point data, and determine whether the touch operation has a moving trend according to the report point data; and, when the touch time exceeds the preset time and the touch operation has a moving trend, determine a touch area on the screen according to the report point data.
Optionally, the processor 710 is further configured to: determine the moving direction of the touch operation according to the report point data; and retract the area corresponding to the moving direction in the current user interface.
It should be understood that, in this embodiment of the present invention, the radio frequency unit 701 may be used for receiving and sending signals during message transmission/reception or a call; specifically, it receives downlink data from a base station and forwards it to the processor 710 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 701 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 701 may also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access via the network module 702, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 703 may convert audio data received by the radio frequency unit 701 or the network module 702 or stored in the memory 709 into an audio signal and output as sound. Also, the audio output unit 703 may also provide audio output related to a specific function performed by the mobile terminal 700 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 703 includes a speaker, a buzzer, a receiver, and the like.
The input unit 704 is used to receive audio or video signals. The input unit 704 may include a Graphics Processing Unit (GPU) 7041 and a microphone 7042. The graphics processor 7041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 706. The image frames processed by the graphics processor 7041 may be stored in the memory 709 (or another storage medium) or transmitted via the radio frequency unit 701 or the network module 702. The microphone 7042 may receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 701 and then output.
The mobile terminal 700 also includes at least one sensor 705, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 7061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 7061 and/or a backlight when the mobile terminal 700 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 705 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 706 is used to display information input by the user or information provided to the user. The Display unit 706 may include a Display panel 7061, and the Display panel 7061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 707 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 707 includes a touch panel 7071 and other input devices 7072. The touch panel 7071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 7071 (e.g., operations by a user on or near the touch panel 7071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 7071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 710, receives a command from the processor 710, and executes the command. In addition, the touch panel 7071 can be implemented by various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 707 may include other input devices 7072 in addition to the touch panel 7071. In particular, the other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 7071 may be overlaid on the display panel 7061, and when the touch panel 7071 detects a touch operation on or near the touch panel 7071, the touch operation is transmitted to the processor 710 to determine the type of the touch event, and then the processor 710 provides a corresponding visual output on the display panel 7061 according to the type of the touch event. Although the touch panel 7071 and the display panel 7061 are shown in fig. 7 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 7071 and the display panel 7061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 708 is an interface through which an external device is connected to the mobile terminal 700. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 708 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 700 or may be used to transmit data between the mobile terminal 700 and external devices.
The memory 709 may be used to store software programs as well as various data. The memory 709 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phonebook). Further, the memory 709 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 710 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 709 and calling data stored in the memory 709, thereby integrally monitoring the mobile terminal. Processor 710 may include one or more processing units; preferably, the processor 710 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 710.
The mobile terminal 700 may also include a power supply 711 (e.g., a battery) for powering the various components. The power supply 711 may be logically coupled to the processor 710 via a power management system, so that charging, discharging, and power consumption management are implemented through the power management system.
In addition, the mobile terminal 700 includes some functional modules that are not shown, and thus will not be described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 710, a memory 709, and a computer program stored in the memory 709 and capable of running on the processor 710, where the computer program is executed by the processor 710 to implement each process of the above-mentioned screen control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the screen control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (8)

1. A screen control method, comprising:
acquiring point reporting data on a screen;
determining a touch area on the screen according to the report point data;
when the touch area is a preset edge area, adjusting the display area of the current user interface according to the touch area, or adjusting the display area of the current user interface and performing invalidation processing on the report point data; wherein the preset edge area touched by mistake is determined by machine learning according to big data of user operations counted in advance;
the step of determining a touch area on the screen according to the hit data comprises:
determining the touch time of the touch operation according to the report point data, and determining whether the touch operation has a moving trend or not according to the report point data;
and when the touch time exceeds the preset time and the touch operation has a moving trend, determining a touch area on the screen according to the report point data.
2. The method of claim 1, wherein the step of adjusting the display area of the current user interface comprises:
reducing the display area of the current user interface to a preset size; or
Shrinking a display area of the current user interface to exclude the touch area.
3. The method of claim 1, wherein the step of adjusting the display area of the current user interface comprises:
determining the moving direction of the touch operation according to the report point data;
and retracting the area corresponding to the moving direction in the current user interface.
4. A mobile terminal, comprising:
the acquisition module is used for acquiring point reporting data on a screen;
the determining module is used for determining a touch area on the screen according to the report point data;
the adjusting module is used for adjusting the display area of the current user interface according to the touch area when the touch area is a preset edge area, or adjusting the display area of the current user interface and performing invalidation processing on the report point data; wherein the preset edge area touched by mistake is determined by machine learning according to big data of user operations counted in advance;
the determining module comprises:
the third determining submodule is used for determining the touch time of the touch operation according to the report point data and determining whether the touch operation has a moving trend or not according to the report point data;
and the fourth determining submodule is used for determining a touch area on the screen according to the report point data when the touch time exceeds the preset time and the touch operation has a moving trend.
5. The mobile terminal of claim 4, wherein the adjusting module comprises:
the first reduction submodule is used for reducing the display area of the current user interface to a preset size; or
A second reduction submodule for reducing the display area of the current user interface to not include the touch area.
6. The mobile terminal of claim 4, wherein the adjusting module comprises:
a fifth determining submodule, configured to determine a moving direction of the touch operation according to the report point data;
and the third reduction submodule is used for reducing the area corresponding to the moving direction in the current user interface.
7. A mobile terminal, characterized in that it comprises a processor, a memory and a computer program stored on said memory and executable on said processor, said computer program, when executed by said processor, implementing the steps of the screen control method according to any one of claims 1 to 3.
8. A computer-readable storage medium, characterized in that a computer program is stored thereon, which computer program, when being executed by a processor, realizes the steps of the screen control method according to any one of claims 1 to 3.
CN201711341157.0A 2017-12-14 2017-12-14 Screen control method and mobile terminal Active CN108008859B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711341157.0A CN108008859B (en) 2017-12-14 2017-12-14 Screen control method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711341157.0A CN108008859B (en) 2017-12-14 2017-12-14 Screen control method and mobile terminal

Publications (2)

Publication Number Publication Date
CN108008859A CN108008859A (en) 2018-05-08
CN108008859B true CN108008859B (en) 2020-09-15

Family

ID=62058812

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711341157.0A Active CN108008859B (en) 2017-12-14 2017-12-14 Screen control method and mobile terminal

Country Status (1)

Country Link
CN (1) CN108008859B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109663353B (en) * 2018-12-28 2023-08-11 努比亚技术有限公司 Game operation method, mobile terminal and computer readable storage medium
US20200310587A1 (en) * 2019-03-26 2020-10-01 Shenzhen Fugui Precision Ind. Co., Ltd. Touch-input computing device with optimized touch operation and method thereof
CN112445448B (en) * 2019-08-30 2022-07-22 华为技术有限公司 Flexible screen display method and electronic equipment
CN112134994A (en) * 2020-07-29 2020-12-25 深圳市修远文化创意有限公司 Processing method for reducing mistaken touch on mobile phone screen
CN112068730A (en) * 2020-08-27 2020-12-11 北京小米移动软件有限公司 Point output control method, point output control device, and storage medium
CN113485632A (en) * 2021-07-26 2021-10-08 深圳市柔宇科技股份有限公司 Touch control method for folding screen, terminal device and computer readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106527806A (en) * 2016-11-09 2017-03-22 努比亚技术有限公司 Method and device for realizing touch control processing
CN107316033A (en) * 2017-07-07 2017-11-03 广东欧珀移动通信有限公司 Fingerprint identification method, device and the storage medium and mobile terminal of mobile terminal
CN107390923A (en) * 2017-06-30 2017-11-24 广东欧珀移动通信有限公司 A kind of screen false-touch prevention method, apparatus, storage medium and terminal
CN107390932A (en) * 2017-07-27 2017-11-24 北京小米移动软件有限公司 Edge false-touch prevention method and device, computer-readable recording medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201443763A (en) * 2013-05-14 2014-11-16 Acer Inc Mistouch identification method and device using the same

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106527806A (en) * 2016-11-09 2017-03-22 努比亚技术有限公司 Method and device for realizing touch control processing
CN107390923A (en) * 2017-06-30 2017-11-24 广东欧珀移动通信有限公司 A kind of screen false-touch prevention method, apparatus, storage medium and terminal
CN107316033A (en) * 2017-07-07 2017-11-03 广东欧珀移动通信有限公司 Fingerprint identification method, device and the storage medium and mobile terminal of mobile terminal
CN107390932A (en) * 2017-07-27 2017-11-24 北京小米移动软件有限公司 Edge false-touch prevention method and device, computer-readable recording medium

Also Published As

Publication number Publication date
CN108008859A (en) 2018-05-08

Similar Documents

Publication Publication Date Title
CN108459797B (en) Control method of folding screen and mobile terminal
CN108008859B (en) Screen control method and mobile terminal
CN108182043B (en) Information display method and mobile terminal
CN108319390B (en) Control method of flexible screen and mobile terminal
CN109871174B (en) Virtual key display method and mobile terminal
CN110531915B (en) Screen operation method and terminal equipment
CN108108113B (en) Webpage switching method and device
CN111459330B (en) Electronic equipment and pressure key operation method
CN107728923B (en) Operation processing method and mobile terminal
CN111049510A (en) Touch key, control method and electronic equipment
CN111106821A (en) Touch control method and wearable device
CN109443261B (en) Method for acquiring folding angle of folding screen mobile terminal and mobile terminal
CN108984099B (en) Man-machine interaction method and terminal
CN111352559A (en) Electronic equipment and control method
CN109189514B (en) Terminal device control method and terminal device
CN111158548A (en) Screen folding method and electronic equipment
CN108170310B (en) Touch screen control method and mobile terminal
CN108388400B (en) Operation processing method and mobile terminal
CN111310165B (en) Account switching or registering method and electronic equipment
CN110471808B (en) Pressure key detection method and device and mobile terminal
CN110673761B (en) Touch key detection method and terminal equipment thereof
CN109947345B (en) Fingerprint identification method and terminal equipment
CN109327605B (en) Display control method and device and mobile terminal
CN110764650A (en) Key trigger detection method and electronic equipment
CN111475247A (en) Display method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant