CN115469787A - Processing method and device for clicking operation and storage medium - Google Patents

Processing method and device for clicking operation and storage medium

Info

Publication number
CN115469787A
CN115469787A (application CN202211163803.XA)
Authority
CN
China
Prior art keywords
control
click
volume
full
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211163803.XA
Other languages
Chinese (zh)
Inventor
赵伯豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202211163803.XA
Publication of CN115469787A

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 — Selection of displayed objects or displayed text elements
    • G06F 3/0481 — Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/16 — Sound input; Sound output
    • G06F 3/165 — Management of the audio stream, e.g. setting of volume, audio stream path

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The application provides a processing method, a device, and a storage medium for a click operation. According to the method, when the control targeted by a single click operation is not unique, that is, when controls coincide at the click position, a target control selection model trained with a decision-tree ranking algorithm analyzes the coincidence matching degree factor, importance degree factor, appearance duration factor, operation frequency factor, and injury degree factor of each coincident control, and determines the click priority and click injury degree of each control targeted by the current click operation. Whether a click on a given control is a mis-touch is thus accurately judged under different conditions. Finally, the control that the current click operation should act on is determined according to the click priority and click injury degree of each control, and the determined control responds to the current click operation, so that adverse consequences caused by mis-operation are effectively avoided and user experience is improved.

Description

Processing method and device for clicking operation and storage medium
The present application is a divisional application of application No. 202111308242.3, filed on November 5, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a method, a device, and a storage medium for processing a click operation.
Background
With the popularization of electronic devices such as mobile phones and the increasing complexity of applications, more and more controls are displayed on an electronic device at the same time, and multiple controls often overlap in some usage scenarios. For example, in the playing interface of some live-streaming and video applications, a full-screen control for full-screen playback may overlap a volume control for adjusting the volume.
In these usage scenarios, when a user clicks a control, a control coinciding with it may be touched by mistake, and the mis-touched control then responds to the click operation. For example, the user intends the full-screen control to play the current content in full screen, but the volume control mistakenly responds to the click and adjusts the volume to the maximum, which not only gives the user a poor experience but also causes unnecessary trouble.
Disclosure of Invention
In order to solve the above technical problem, the present application provides a method, a device, and a storage medium for processing a click operation, which aim to accurately determine whether a click on a given control is a mis-touch under different conditions, so as to avoid adverse consequences caused by mis-operation.
In a first aspect, the present application provides a method for processing a click operation. The method comprises the following steps: when a click operation is monitored, determining a click position corresponding to the click operation; when a first control and a second control coincide at the click position, respectively acquiring a first feature factor of the first control and a second feature factor of the second control, where the first feature factor and the second feature factor each comprise a coincidence matching degree factor, an importance degree factor, an appearance duration factor, an operation frequency factor, and an injury degree factor; processing the first feature factor and the second feature factor through a target control selection model to determine a first click priority and a first click injury degree corresponding to the first control, and a second click priority and a second click injury degree corresponding to the second control; and selecting one of the first control and the second control as a target control according to the first click priority, the first click injury degree, the second click priority, and the second click injury degree, and controlling the target control to respond to the click operation.
Therefore, according to the method provided by the embodiments of the application, when the control targeted by a click operation is not unique, that is, when controls coincide at the click position, a target control selection model trained with a decision-tree ranking algorithm analyzes the coincidence matching degree factor, importance degree factor, appearance duration factor, operation frequency factor, and injury degree factor of each coincident control, and determines the click priority and click injury degree of each control. Whether a click on a given control is a mis-touch under different conditions is thus accurately judged through multiple feature factors. Finally, the control that the click operation should act on is determined according to the click priority and click injury degree of each control, and the determined control responds to the click operation, so that adverse consequences caused by mis-operation are effectively avoided and user experience is improved.
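The selection flow of the first aspect can be sketched as follows. This is only an illustrative sketch, not the patented implementation: the `ControlScore` structure, the control names, the threshold value, and the priority and injury numbers are all hypothetical, and the scores are assumed to have already been produced by the target control selection model.

```python
from dataclasses import dataclass

@dataclass
class ControlScore:
    name: str
    click_priority: float  # higher means more likely the intended target
    click_injury: float    # higher means worse consequences if mis-touched

INJURY_THRESHOLD = 0.7  # hypothetical fixed value; the patent derives it dynamically

def select_target(first: ControlScore, second: ControlScore) -> ControlScore:
    """Pick the coincident control with the higher click priority."""
    return first if first.click_priority >= second.click_priority else second

def respond(target: ControlScore) -> str:
    """Respond directly when the injury degree is low; otherwise prompt the user."""
    if target.click_injury < INJURY_THRESHOLD:
        return f"respond:{target.name}"
    return f"prompt-user:{target.name}"

full_screen = ControlScore("full_screen", click_priority=0.9, click_injury=0.2)
volume = ControlScore("volume", click_priority=0.4, click_injury=0.8)

print(respond(select_target(full_screen, volume)))  # prints "respond:full_screen"
```

In this sketch the full-screen control wins on priority and, because its injury degree is low, responds to the click directly; had the volume control won, the user would have been prompted first.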
According to the first aspect, the selecting one of the first control and the second control as a target control according to the first click priority, the first click injury degree, the second click priority, and the second click injury degree, and controlling the target control to respond to the click operation includes: selecting one of the first control and the second control as the target control according to the first click priority and the second click priority; and controlling the target control to respond to the click operation according to the click injury degree corresponding to the target control. In this way, the control targeted by the current click operation is determined through the click priority and click injury degree derived from multiple feature factors, and the determined control is controlled to respond to the current click operation, so that the multi-dimensional reference factors ensure the accuracy of the judgment result.
According to the first aspect, or any implementation manner of the first aspect, the selecting one of the first control and the second control as the target control according to the first click priority and the second click priority includes: when the first click priority is higher than the second click priority, selecting the first control as the target control; and when the second click priority is higher than the first click priority, selecting the second control as the target control. Therefore, the control with high clicking priority is selected as the target control, so that the finally determined target control is the control which the user really wants to click, and the phenomenon of mistaken clicking is reduced.
According to the first aspect, or any implementation manner of the first aspect, the controlling the target control to respond to the click operation according to the click injury degree corresponding to the target control includes: when the click injury degree corresponding to the target control is lower than an injury degree threshold, controlling the target control to respond to the click operation; when the click injury degree corresponding to the target control is not lower than the injury degree threshold, prompting the user whether to perform the click operation on the target control; and controlling the target control to respond to the click operation when it is determined that the click operation is to be performed on the target control. In this technical scheme, the click injury degree decides whether the target control responds to the click operation directly, or the user is prompted first and then decides again whether the target control should respond. A click is thus directly triggered only on a control with a high click priority and a low click injury degree, avoiding the unnecessary trouble that mis-touching a control with a high click injury degree would bring to the user.
According to the first aspect, or any implementation manner of the first aspect, the target control is a volume control for adjusting volume; before the controlling the target control to respond to the click operation according to the click injury degree corresponding to the target control, the method further includes: acquiring a first volume value of the current environment; and determining the injury degree threshold according to the first volume value. Specifically, in the technical scheme provided by the embodiments of the application, the injury degree threshold is closely related to the volume of the environment in which the electronic device is currently located, so that when the target control responds to the click operation, the adjusted volume does not make the user or surrounding people uncomfortable, and user experience is further improved.
According to the first aspect, or any implementation manner of the first aspect, before the prompting the user whether to perform the click operation on the target control, the method includes: acquiring a current second volume value of the electronic device; determining a third volume value that the electronic device would have after the target control responds to the click operation; when the difference between the second volume value and the third volume value is smaller than a volume threshold, performing the step of controlling the target control to respond to the click operation; and when the difference between the second volume value and the third volume value is not smaller than the volume threshold, performing the step of prompting the user whether to perform the click operation on the target control. Specifically, in this technical scheme, when the click injury degree corresponding to the target control is not lower than the injury degree threshold, before the user is prompted, the difference between the current volume value of the electronic device and the volume value that would result from responding to the click operation is further compared with the volume threshold, and, according to the comparison result, the electronic device either responds to the click operation or prompts the user.
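The two volume-related checks above can be sketched as follows. The function names, the mapping from ambient volume to the injury degree threshold, and all numeric values are hypothetical stand-ins for the behavior the patent describes.

```python
def injury_threshold(ambient_volume: int) -> int:
    """Hypothetical mapping: a quieter environment yields a lower threshold,
    so volume jumps are more readily treated as harmful there."""
    return max(10, ambient_volume // 2)

def decide_volume_response(current: int, adjusted: int, volume_threshold: int) -> str:
    """Small volume jumps respond directly; large jumps prompt the user first."""
    return "respond" if abs(current - adjusted) < volume_threshold else "prompt"

# A click that would jump the volume from 30 to 100 exceeds the threshold,
# so the user is prompted instead of the volume control responding directly.
print(decide_volume_response(30, 100, 40))  # prints "prompt"
```

A modest adjustment, e.g. from 60 to 80 with the same threshold, would instead return `"respond"` and go through without a prompt.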
According to the first aspect, or any implementation manner of the first aspect, the prompting the user whether to perform the click operation on the target control includes: prompting the user, by means of a pop-up window, whether to perform the click operation on the target control; or prompting the user, by voice, whether to perform the click operation on the target control. In this way, without affecting the user's use as far as possible, the user learns that having the target control respond to the click operation carries a certain risk, and can then operate correctly again according to the prompt.
According to the first aspect, or any implementation manner of the first aspect, the target control selection model is obtained by a server based on a decision tree sorting algorithm; prior to the processing of the first and second feature factors by the target control selection model, the method further comprises: detecting whether the target control selection model exists locally; when the target control selection model does not exist locally, sending a request for acquiring the target control selection model to the server, wherein before user behavior data is sent to the server, the target control selection model does not exist on the server, the user behavior data being generated according to response information of the target control to the click operation; and receiving an initial control selection model sent by the server and taking the initial control selection model as the target control selection model, wherein the initial control selection model is obtained by the server through training on group sample data based on the decision tree sorting algorithm. Specifically, when the electronic device obtains the target control selection model from the server for the first time, user behavior data from the user's use of the model has not yet been fed back to the server, so the server does not have a target control selection model tailored to the user; instead, the initial control selection model, trained on the user behavior data of users in a big-data group, is pushed to the electronic device, so that the electronic device's processing of click operations according to the initial control selection model is basically suitable for the current user.
In addition, the control selection model for determining the target control is obtained by training of the server, so that whether the resources of the electronic equipment support the training model or not does not need to be considered, the performance requirement on the electronic equipment is reduced, and the technical scheme provided by the embodiment of the application can be suitable for more electronic equipment.
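The client-side retrieval flow just described can be sketched as follows. The function name, the JSON model representation, and the dictionary standing in for the server are all hypothetical; the point is only the fallback order: local cache first, then a per-device target model from the server, then the group-trained initial model.

```python
import json
import os
import tempfile

def fetch_model(local_path: str, server_models: dict, device_id: str) -> dict:
    """Use the locally cached model if present; otherwise request one from the
    'server'. On first contact the server has no per-device target model, so
    it returns the group-trained initial model, which is then cached locally."""
    if os.path.exists(local_path):
        with open(local_path) as f:
            return json.load(f)
    model = server_models.get(device_id, server_models["initial"])
    with open(local_path, "w") as f:
        json.dump(model, f)
    return model

server_models = {"initial": {"source": "group-training"}}
path = os.path.join(tempfile.mkdtemp(), "model.json")
first = fetch_model(path, server_models, "device-1")   # falls back to initial
second = fetch_model(path, server_models, "device-1")  # served from local cache
print(first["source"])  # prints "group-training"
```

Once the device later uploads behavior data and the server trains a per-device model, a lookup for `"device-1"` would hit the server-side entry instead of the initial model.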
According to the first aspect, or any implementation manner of the first aspect above, after the controlling the target control to respond to the click operation, the method further includes: generating user behavior data according to the response information of the target control to the click operation; sending the user behavior data to the server for the server to perform optimization training on the initial control selection model according to the user behavior data; and receiving the control selection model optimized by the server to replace the locally stored target control selection model. In this way, according to the method provided by the embodiments of the application, the electronic device feeds the user behavior data generated when the target control is operated back to the server providing the control selection model, so that the server can use the user behavior data fed back by each electronic device to perform optimization training on the control selection model (which may be the initial control selection model or the target control selection model) to be sent to that electronic device, and then push the optimized target control selection model to the corresponding electronic device for subsequent use.
In a second aspect, the present application provides a method for processing a click operation. The method is applied to the electronic equipment with the display interface, and comprises the following steps: displaying a multimedia picture and a full-screen control in the display interface, wherein the multimedia picture is played in a non-full-screen mode, and the multimedia picture is played in a full-screen mode when the full-screen control is clicked; after receiving a volume adjustment operation, displaying a volume control, wherein a first part of the volume control is overlapped with the full screen control; and in the process of simultaneously displaying the volume control and the full-screen control, after receiving the click operation on the first part, playing the multimedia picture in full screen. Therefore, when the full-screen control and the volume control are simultaneously displayed on the display interface, even if the user clicks the overlapping area of the two controls, the electronic equipment can play the multimedia picture in the full screen mode in response to the clicking operation instead of considering that the user needs to adjust the volume, and therefore error response is avoided.
According to a second aspect, the displaying a multimedia picture and a full screen control in the display interface includes: displaying multimedia options in the display interface when a click operation of an application providing the multimedia picture is received, wherein the multimedia options correspond to the multimedia picture, and the display interface plays the corresponding multimedia picture when the multimedia options are clicked; when the click operation of the multimedia options is received, displaying the multimedia pictures corresponding to the multimedia options in the display interface; and when the clicking operation on the multimedia picture is received, displaying the full-screen control in the display interface. Therefore, the change of the display interface from displaying the icons of the application to displaying the multimedia picture and the full-screen control is realized.
According to the second aspect, or any implementation manner of the second aspect, in the process of simultaneously displaying the volume control and the full-screen control, playing the multimedia picture in full screen when a click operation on the first portion is received includes: determining the click priority of the volume control and the click priority of the full-screen control; when the click priority of the volume control is higher than that of the full-screen control, adjusting the volume to the maximum volume value; and when the click priority of the full-screen control is higher than that of the volume control, playing the multimedia picture in full screen.
According to a second aspect, or any implementation form of the second aspect above, the method further comprises: when the click priority of the volume control is equal to the click priority of the full-screen control, popping up a first prompt window on the display interface, wherein the first prompt window comprises a first option and a second option, the first option corresponds to the volume control, and the second option corresponds to the full-screen control. In this way, when the electronic device cannot determine which control responds to the click operation, a prompt window pops up on the display interface to prompt the user so that the user can further select to determine which control is operated.
According to a second aspect, or any implementation manner of the second aspect, after the display interface pops up the first prompt window, the method further includes: when the clicking operation on the first option is received, adjusting the volume to the maximum volume value; and when the click operation on the second option is received, the multimedia picture is played in a full screen mode. Therefore, whether the user needs to adjust the volume or play the multimedia picture in a full screen mode is determined according to the clicking of the user on the options in the prompt window, so that the accuracy of each response is ensured, and the inconvenience brought to the user by false response is effectively avoided.
According to a second aspect, or any implementation manner of the second aspect above, when the click priority of the volume control is higher than the click priority of the full screen control, the method further includes, before adjusting the volume to a maximum volume value: determining the click injury degree of the volume control; when the click injury degree of the volume control is lower than an injury degree threshold value, executing the step of adjusting the volume to the maximum volume value; when the click injury degree of the volume control is not lower than the injury degree threshold value, popping up a second prompt window on the display interface, wherein the second prompt window comprises prompt information, a third option and a fourth option, the prompt information is used for prompting a user whether to execute the step of adjusting the volume to the maximum volume value, the third option corresponds to the volume control, and the fourth option is used for canceling the click operation.
According to a second aspect, or any implementation manner of the second aspect above, after the display interface pops up a second prompt window, the method further includes: when the clicking operation on the third option is received, adjusting the volume to the maximum volume value; and when the click operation on the fourth option is received, continuing playing the multimedia picture in the video area of the display interface.
According to a second aspect, or any implementation manner of the second aspect, the determining the click priority of the volume control and the click priority of the full screen control includes: respectively acquiring characteristic factors of the volume control and the full-screen control, wherein the characteristic factors comprise a coincidence matching degree factor, an importance degree factor, an appearance duration factor, an operation frequency factor and a damage degree factor; and processing the characteristic factors of the volume control and the full screen control through a target control selection model, and determining the click priority corresponding to the volume control and the click priority corresponding to the full screen control.
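The priority determination above can be illustrated with a linear score over the five feature factors. This is a deliberate simplification: the patent uses a trained decision-tree ranking model, not a weighted sum, and every weight and factor value below is hypothetical.

```python
# Hypothetical weights over the five feature factors; a negative weight on the
# injury factor means more harmful controls are less likely to be the target.
WEIGHTS = {"overlap_match": 0.30, "importance": 0.25, "duration": 0.15,
           "frequency": 0.20, "injury": -0.10}

def click_priority(factors: dict) -> float:
    """Linear stand-in for the decision-tree ranking model's priority score."""
    return sum(WEIGHTS[k] * factors[k] for k in WEIGHTS)

full_screen = {"overlap_match": 0.9, "importance": 0.8, "duration": 0.7,
               "frequency": 0.6, "injury": 0.2}
volume = {"overlap_match": 0.9, "importance": 0.5, "duration": 0.2,
          "frequency": 0.3, "injury": 0.8}

winner = ("full_screen" if click_priority(full_screen) > click_priority(volume)
          else "volume")
print(winner)  # prints "full_screen"
```

With these made-up values the full-screen control outranks the volume control, matching the scenario where a click on the overlap plays the picture full screen rather than maxing out the volume.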
In a third aspect, the present application provides an electronic device comprising a memory and a processor, the memory and the processor being coupled; the memory stores program instructions that, when executed by the processor, cause the electronic device to perform the method in the first aspect or any implementation manner of the first aspect above.
In a fourth aspect, the present application provides a server comprising a memory and a processor, the memory and the processor coupled; the memory stores program instructions that, when executed by the processor, cause the server to perform the steps of:
training, based on a decision tree sorting algorithm, on user behavior data generated when controls are operated by users in a big-data group, to obtain an initial control selection model;
when a request for acquiring a target control selection model sent by an electronic device is received, checking whether the target control selection model requested by the electronic device exists locally;
when the target control selection model exists, sending the target control selection model to the electronic equipment, wherein the target control selection model is obtained after carrying out optimization training on the initial control selection model based on user behavior data sent by the electronic equipment;
and when the target control selection model does not exist, sending the initial control selection model to the electronic equipment.
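The server-side dispatch in the steps above can be sketched as a simple lookup. The function name and the dictionary representation of the model store are hypothetical; the behavior mirrors the steps: a per-device target model if one has been trained from that device's behavior data, otherwise the initial model.

```python
def handle_model_request(device_id: str, target_models: dict,
                         initial_model: dict) -> dict:
    """Return the device-specific target model when one exists, i.e. when
    behavior data from that device has already driven optimization training;
    otherwise fall back to the group-trained initial model."""
    return target_models.get(device_id, initial_model)

initial = {"kind": "initial"}
targets = {"device-A": {"kind": "target"}}
print(handle_model_request("device-A", targets, initial)["kind"])  # prints "target"
print(handle_model_request("device-B", targets, initial)["kind"])  # prints "initial"
```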
According to a fourth aspect, the program instructions, when executed by the processor, cause the server to further perform the steps of:
receiving user behavior data sent by the electronic equipment;
and performing optimization training on the initial control selection model according to the user behavior data to obtain the target control selection model corresponding to the electronic equipment.
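The optimization-training step can be illustrated with a toy update rule. This is not the patent's decision-tree retraining: nudging a per-control weight up when the user accepted a response and down when the user cancelled it is only a stand-in, and the event format and step size are hypothetical.

```python
def optimize_model(initial_model: dict, behavior_data: list) -> dict:
    """Toy stand-in for optimization training: adjust each control's priority
    weight toward what the user's behavior data confirms."""
    model = dict(initial_model)
    for event in behavior_data:
        delta = 0.1 if event["accepted"] else -0.1
        model[event["control"]] = round(model.get(event["control"], 0.5) + delta, 2)
    return model

initial = {"full_screen": 0.5, "volume": 0.5}
behavior = [{"control": "full_screen", "accepted": True},
            {"control": "volume", "accepted": False}]
print(optimize_model(initial, behavior))  # prints {'full_screen': 0.6, 'volume': 0.4}
```

Repeating this loop as more behavior data arrives yields the per-device target control selection model that the server then pushes back to the electronic device.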
In a fifth aspect, the present application provides a computer-readable medium for storing a computer program which, when run on an electronic device, causes the electronic device to perform the method in the first aspect or any implementation manner of the first aspect above, or the method in the second aspect or any implementation manner of the second aspect above; and which, when run on a server, causes the server to perform the method in the fourth aspect or any implementation manner of the fourth aspect above.
In a sixth aspect, the present application provides a chip comprising a processing circuit and transceiver pins. The transceiver pins and the processing circuit communicate with each other via an internal connection path, and the processing circuit executes the instructions of the method in the first aspect or any implementation manner of the first aspect above, the second aspect or any implementation manner of the second aspect above, or the fourth aspect or any implementation manner of the fourth aspect above, to control the receiver pin to receive signals and control the transmitter pin to transmit signals.
In a seventh aspect, the present application provides a system for processing a click operation, where the system includes the electronic device according to the third aspect and the server according to the fourth aspect.
Drawings
Fig. 1 is a schematic diagram illustrating a hardware configuration of an electronic device;
FIG. 2 is a first schematic view of a scenario to be solved by the processing method of a click operation provided in an embodiment of the present application;
FIG. 3 is a second schematic view of a scenario to be solved by the processing method of a click operation provided in an embodiment of the present application;
FIG. 4 is a third schematic view of a scenario to be solved by the processing method of a click operation provided in an embodiment of the present application;
FIG. 5 is a fourth schematic view of a scenario to be solved by the processing method of a click operation provided in an embodiment of the present application;
FIG. 6 is a fifth schematic view of a scenario to be solved by the processing method of a click operation provided in an embodiment of the present application;
FIG. 7 is a sixth schematic view of a scenario to be solved by the processing method of a click operation provided in an embodiment of the present application;
FIG. 8 is a seventh schematic view of a scenario to be solved by the processing method of a click operation provided in an embodiment of the present application;
FIG. 9 is an eighth schematic view of a scenario to be solved by the processing method of a click operation provided in an embodiment of the present application;
FIG. 10 is a ninth schematic view of a scenario to be solved by the processing method of a click operation provided in an embodiment of the present application;
FIG. 11 is a tenth schematic view of a scenario to be solved by the processing method of a click operation provided in an embodiment of the present application;
FIG. 12 is an eleventh schematic view of a scenario to be solved by the processing method of a click operation provided in an embodiment of the present application;
FIG. 13 is a twelfth schematic view of a scenario to be solved by the processing method of a click operation provided in an embodiment of the present application;
FIG. 14 is a thirteenth schematic view of a scenario to be solved by the processing method of a click operation provided in an embodiment of the present application;
FIG. 15 is a fourteenth schematic view of a scenario to be solved by the processing method of a click operation provided in an embodiment of the present application;
FIG. 16 is a fifteenth schematic view of a scenario to be solved by the processing method of a click operation provided in an embodiment of the present application;
FIG. 17 is a schematic flowchart illustrating a processing method of a click operation according to an embodiment of the present application;
FIG. 18 is a schematic diagram of an optimization of the control selection model used in the method for processing a click operation according to an embodiment of the present application;
FIG. 19 is another schematic diagram of an optimization of the control selection model used in the method for processing a click operation according to an embodiment of the present application;
FIG. 20 is a first schematic diagram of an interface for processing overlapping full-screen and volume controls based on the method for processing a click operation according to an embodiment of the present application;
FIG. 21 is a second schematic diagram of an interface for processing overlapping full-screen and volume controls based on the method for processing a click operation according to an embodiment of the present application;
FIG. 22 is a third schematic diagram of an interface for processing overlapping full-screen and volume controls based on the method for processing a click operation according to an embodiment of the present application;
FIG. 23 is a fourth schematic diagram of an interface for processing overlapping full-screen and volume controls based on the method for processing a click operation according to an embodiment of the present application;
FIG. 24 is a fifth schematic diagram of an interface for processing overlapping full-screen and volume controls based on the method for processing a click operation according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
The term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist. For example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone.
The terms "first", "second", and the like in the description and claims of the embodiments of the present application are used for distinguishing between different objects, rather than describing a particular order of the objects. For example, a first target object and a second target object are used to distinguish different target objects, not to describe a particular order of the target objects.
In the embodiments of the present application, the words "exemplary" or "such as" are used herein to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "such as" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the words "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present application, unless otherwise specified, "a plurality" means two or more. For example, a plurality of processing units refers to two or more processing units, and a plurality of systems refers to two or more systems.
In order to better understand the technical solutions provided by the embodiments of the present application, before describing the technical solutions of the embodiments of the present application, first, a hardware structure of an electronic device (for example, a mobile phone, a tablet device, a PC device, etc.) to which the embodiments of the present application are applied is described with reference to the drawings.
Referring to FIG. 1, the electronic device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like.
Illustratively, the audio module 170 may include a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, and the like.
For example, the sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
Specifically, in the technical solution provided by the embodiment of the present application, the click operation acts on a control displayed on the display screen 194. Detection of the click operation may be implemented by, for example, the touch sensor, and the click position corresponding to the click operation may be determined by, for example, the pressure sensor.
In addition, in some embodiments, for a scenario in which the click operation adjusts the volume, the volume currently played through the speaker or the earphone may be considered when the volume is adjusted. In other embodiments, for a scenario in which the click operation adjusts the screen brightness, the brightness of the current environment may be determined by the ambient light sensor during the adjustment, so that a reasonable response can be made according to the actual environment.
While the hardware architecture of electronic device 100 is described herein, it should be understood that electronic device 100 shown in FIG. 1 is merely an example, and that electronic device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration of components in a particular implementation. The various components shown in fig. 1 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
Based on the hardware structure shown in FIG. 1, in practical applications there are roughly the following scenarios in which click operations need to be processed using the technical solution provided by the embodiment of the present application in order to avoid false touches:
Scene 1:
For example, referring to FIG. 2, some video applications currently provide a teenager guard mode to prevent teenagers from watching for long periods. Taking a mobile phone as the electronic device on which such a video application is installed: when the user clicks the video application installed on the mobile phone in FIG. 2 (1), the mobile phone jumps to the main page of the video application in response to the click operation, for example as shown in FIG. 2 (2). For a video application providing the teenager guard mode, a pop-up window for turning on the mode, shown in FIG. 2 (3), may appear at a certain moment while the main page is displayed, for example 5 s after jumping to the main page shown in FIG. 2 (2). If the user clicks the control corresponding to video B displayed on the main page at exactly that moment, and the pop-up window happens to be displayed over that control, the user's click on the control corresponding to video B may be mistaken for a click on the control for turning on the teenager guard mode in the pop-up window. In this case, the mobile phone jumps to the details interface for turning on the teenager guard mode in response to the user's click operation, for example as shown in FIG. 2 (4).
Scene 2:
For example, referring to FIG. 3, in some application scenarios, in order to collect the user's rating of an application, an interface for jumping to rate the application may pop up on the main page while the user is using the application. Still taking the video application triggered by the user as an example, similar to scene 1 above: when the user clicks the video application installed on the mobile phone in FIG. 3 (1), the mobile phone jumps to the main page of the video application in response to the click operation, for example as shown in FIG. 3 (2), and the pop-up window shown in FIG. 3 (3) may appear at a certain moment, for example 5 s after jumping to the main page shown in FIG. 3 (2). If the user happens to click the control corresponding to video B displayed on the main page at that moment, the click operation may be mistaken for a click on the suggestion control in the pop-up window shown in FIG. 3 (3), which jumps to writing a comment. In this case, the mobile phone jumps to an interface for writing a comment in response to the user's click operation, for example as shown in FIG. 3 (4).
Scene 3:
For example, referring to FIG. 4, in some application scenarios, in order to let the user know in time that a new version has been released, when an application has a new version available, a pop-up window may be displayed directly on the user interface of the mobile phone. Still taking a clicked video application as an example: after the mobile phone jumps to the interface shown in FIG. 4 (2) in response to the user's click operation, suppose the user still wants to click the control corresponding to "video B", and at the instant of the click the pop-up window shown in FIG. 4 (3), prompting that a new version has been found, happens to appear, with the click position falling exactly on the "update now" control in the pop-up window. In an existing solution, the mobile phone jumps to the update page of the video application in response to the click operation, for example as shown in FIG. 4 (4).
Scene 4:
Illustratively, in the playback interfaces of some live-broadcast and video applications, a full-screen control for full-screen playback overlaps a volume control for adjusting the volume. When both the full-screen control and the volume control are present at the click position of a click operation triggered by the user, the response to the click operation can take two different forms. In one, the user actually wants to adjust the volume, but the mobile phone mistakenly assumes that full screen is intended, so after responding to the click operation it switches the currently playing picture to full-screen playback, for example as shown in FIG. 5. In the other, the user actually wants to set the current picture to full screen, but the mobile phone mistakenly assumes that the volume should be adjusted, so after responding to the click operation it adjusts the current volume to the maximum, for example as shown in FIG. 6.
It should be noted that, in practical applications, the volume control shown in FIG. 5 and FIG. 6 may be popped up by a physical button on the side of the mobile phone. When the user operates the popped-up volume control, each time the mobile phone recognizes a sliding gesture it may mistakenly treat the position where the user's interaction finally stops, or where the slide starts, as a click operation by the user. If the corresponding click position happens to fall within the area of the full-screen control, the mobile phone may mistakenly treat the current operation as an operation for entering full screen.
Correspondingly, when the user actually intends to click the full-screen control, if the user accidentally touches the volume control popped up by the physical button on the side of the mobile phone, the mobile phone may mistakenly treat the click on the full-screen control as an operation directed at the volume control.
Scene 5:
For example, in a multi-screen collaboration scenario, a user may want to open a memo application (installed on the Pad) from the operating interface that the mobile phone has projected onto the Pad, for example as shown in FIG. 7, but mistakenly touches the close button of that projected interface displayed close to the memo; as a result, the interface of the Pad becomes as shown in FIG. 8.
For example, in a multi-screen collaboration scenario, a user may want to close the operating interface that the mobile phone has projected onto the Pad, but the click operation is mistakenly determined to be directed at the memo application on the Pad; as a result, the interface of the Pad becomes as shown in FIG. 9.
Scene 6:
For example, referring to FIG. 10, in some implementation scenarios, the user wants to call the searched contact Zhang San, but just as the user clicks the phone control in Zhang San's row in FIG. 10 (1), a notification of a newly received mail pops up, and the click operation is mistaken for an operation on the notification bar that has just popped up, as shown in FIG. 10 (2). In this case, the mobile phone jumps to the mail interface in response to the click operation, as shown in FIG. 10 (3).
Illustratively, referring to FIG. 11, in some implementation scenarios the user wants to call the contact Wang Wu in the interface of FIG. 11 (1), but at this moment a notification of a newly received mail pops up, as shown in FIG. 11 (2). If the user tries to slide the notification bar upwards to dismiss its content for the time being, the control for dialing Zhang San may be touched by mistake during the operation. In this case, the mobile phone jumps to the call interface for calling Zhang San in response to the click operation, as shown in FIG. 11 (3).
Scene 7:
For example, in some application scenarios, a floating window (for example, the large window playing a video picture in FIG. 12 and FIG. 13, or the floating icon for opening a video playback picture in FIG. 14 and FIG. 15) is displayed on the upper layer of the mobile phone's display interface. The user may want to close the large window playing the video picture, but the click operation is mistaken for opening the memo application under the floating window, and after the mobile phone responds to the click operation its interface switches from the content shown in FIG. 12 (1) to the content shown in FIG. 12 (2). In another case, the user actually wants to open the memo application under the floating window while keeping the current interface so that the floating window continues playing the video picture, but the click operation is mistaken for closing the large window playing the video picture, and after the mobile phone responds to the click operation its interface switches from the content shown in FIG. 13 (1) to the content shown in FIG. 13 (2).
For example, in some application scenarios, a floating icon (for example, the floating icon for opening a video playback picture in FIG. 14 and FIG. 15) is displayed on the upper layer of the mobile phone's display interface. The user wants to open the memo application under the floating icon while keeping the current interface so that the floating icon continues to be displayed, but the click operation is mistaken for entering the playback interface corresponding to the floating icon, and after the mobile phone responds to the click operation its interface switches from the content shown in FIG. 14 (1) to the content shown in FIG. 14 (2). In another case, the user actually wants to enter the playback interface corresponding to the floating icon, but the click operation is mistaken for opening the memo application under the floating icon while keeping the current interface displaying the floating icon, and after the mobile phone responds to the click operation its interface switches from the content shown in FIG. 15 (1) to the content shown in FIG. 15 (2).
Scene 8:
for example, referring to fig. 16, in some application scenarios, a user wants to view a short message received in a score of 6.
It should be understood that the above description is only intended to provide a better understanding of the technical solution of this embodiment; the cited examples are scenarios to which the solution can be applied and are not the only limitation on this embodiment. The technical solution provided by the embodiment of the present application for solving the problems in the foregoing scenarios is described in detail below with reference to FIG. 17.
Referring to fig. 17, an implementation procedure of the technical solution provided in the embodiment of the present application specifically includes:
step S101, when the clicking operation is monitored, the clicking position corresponding to the clicking operation is determined.
It can be understood that, on current electronic devices such as mobile phones and Pads, operations on controls displayed on the screen are generally implemented by touching the screen; the above click operation is therefore, in essence, the user touching and clicking a certain position on the screen.
For example, the click operation is detected by a touch sensor disposed inside the electronic device, and the click position of the click operation on the screen, i.e., the specific click coordinates, is determined by a pressure sensor disposed inside the electronic device.
It can be understood that, in practical applications, the pressure sensor can determine not only the specific click position of a user-triggered click operation on the screen, but also information such as the pressure value exerted on the screen by the click operation.
Step S102: when an overlapping first control and second control exist at the click position, obtain a first feature factor of the first control and a second feature factor of the second control respectively.
Specifically, after the click position corresponding to the click operation is determined, the position information of each control can be determined from the layout corresponding to the picture currently displayed on the screen. The controls are then traversed to check whether their position information contains the click position, and all controls containing the click position are screened out; in this way it can be determined whether overlapping controls exist at the click position.
Correspondingly, when two or more controls are screened out, overlapping controls exist at the click position.
For convenience of illustration, this embodiment takes the case where two overlapping controls, namely a first control and a second control, exist at the click position as an example.
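As a concrete illustration of the screening step in step S102, the following sketch traverses a hypothetical layout and keeps every control whose bounding box contains the click position. All names, coordinates, and the `Control` record itself are invented for the example; they are not part of the patent's actual implementation.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Control:
    """Hypothetical control record: a name plus its bounding box in screen coordinates."""
    name: str
    left: int
    top: int
    right: int
    bottom: int

    def contains(self, x: int, y: int) -> bool:
        # The click falls on this control if it lies inside the bounding box.
        return self.left <= x <= self.right and self.top <= y <= self.bottom

def controls_at(layout: List[Control], click: Tuple[int, int]) -> List[Control]:
    """Traverse every control in the current layout and keep those whose
    position information contains the click position."""
    x, y = click
    return [c for c in layout if c.contains(x, y)]

layout = [
    Control("full_screen", 0, 0, 1080, 2340),  # full-screen playback control
    Control("volume", 900, 1000, 1060, 1600),  # overlapping volume control
    Control("back", 0, 0, 120, 120),
]
hits = controls_at(layout, (950, 1200))
# Two or more hits means overlapping controls exist at the click position.
overlapping = len(hits) >= 2
```

Here the full-screen and volume controls both contain the click, which is exactly the overlap condition that triggers the rest of the method.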
Accordingly, when it is determined that an overlapping first control and second control exist at the click position, in order to prevent the electronic device from mistaking a click operation intended for the first control as being directed at the second control, or vice versa, the technical solution provided by this embodiment analyzes multiple feature factors of the two overlapping controls; the first feature factor of the first control and the second feature factor of the second control therefore need to be obtained respectively.
In addition, it can be understood that, regarding the above overlap, in some embodiments the first control may overlap a partial area of the second control, or the first control may completely overlap the second control.
Step S103: process the first feature factor and the second feature factor through a target control selection model, and determine a first click priority and a first click injury degree corresponding to the first control, and a second click priority and a second click injury degree corresponding to the second control.
For example, in one implementation, the target control selection model used in this embodiment to determine the click priority and click injury degree of each control may be trained by a server based on a decision tree ranking algorithm. In this way, the burden of training the model is transferred to the server, which reduces the performance requirements on the electronic device, so that the technical solution provided by this embodiment can be applied to more electronic devices.
For example, in another implementation, the target control selection model may instead be trained by the electronic device itself based on a decision tree ranking algorithm. In this way, the electronic device does not need to interact with a server, and the solution can be implemented even without a network.
For convenience of illustration, this embodiment takes the case where the target control selection model is trained by the server based on a decision tree ranking algorithm.
It should be noted that, in order to accurately determine whether a click on different controls is a false touch under different conditions, the training data required for training the target control selection model includes feature factors of multiple dimensions.
Illustratively, these feature factors may be, for example, a coincidence matching degree factor, an importance factor, an occurrence duration factor, an operation frequency factor, and an injury degree factor.
Illustratively, the coincidence matching degree factor can be refined into the control area, the control overlap degree, the click position, and the matching degree between the click position and the control.
The control area represents the area of the control in units of N×N pixels, where N is a number greater than 0. The control overlap degree represents the degree of overlap between the areas of each group of controls participating in the ranking (of click priority and click injury degree). The click position is the position coordinate of the click operation within the N×N unit area. The matching degree between the click position and the control indicates whether the click operation falls within the area occupied by the control, and to what degree the click is located at the core of the control or at its periphery.
For example, the importance factor is specifically characterized by the control operation type; for example, the priority of the system operation type is set lower than the priority of the third-party application (APP) operation type.
Illustratively, the occurrence duration factor refers to the duration for which the control has appeared on the screen; specifically, assuming the click operation is not processed by the present algorithm, it is the historical duration for which the control that should have received the click operation has been displayed on the screen.
Illustratively, the operation frequency factor can be further refined into the number of clicks on the control per day and the number of clicks on the control in unit time.
The number of clicks on the control per day is obtained from big-data statistics: assuming the click operation is not processed by the present algorithm, it is the average number of times each user clicks per day the control that should have received the click operation. The number of clicks in unit time is the number of times the user attempts to click the control within a unit time, for example within 5 s; a larger value indicates a larger ranking error, with the user repeatedly retrying the click.
Illustratively, the injury degree factor is the severity of the harm caused to the user by the click operation. The harm caused to the user by a click operation includes, but is not limited to, physiological harm (e.g., the volume adjusted too loud, the screen brightness adjusted too high, etc.) and harm to the user experience (the mobile phone being reset, factory settings being restored, etc.).
The several feature factors listed above, together with the feature value thresholds corresponding to the specific feature information included in each feature factor, can be shown, for example, in Table 1.
Table 1: list of feature factors included in the training data for training the target control selection model
[Table 1 is published as an image in the original document.]
It should be understood that the above description is only an example for better understanding of the technical solution of the present embodiment, and is not to be taken as the only limitation of the present embodiment.
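The factor groups above can be pictured as a single per-control feature vector. The following sketch is only an illustrative assumption about how such a vector might be bundled and flattened for the model; the field names, units, ordering, and the `injury_level` scale are not specified by the original text.

```python
from dataclasses import dataclass

@dataclass
class FeatureFactors:
    """Hypothetical per-control feature vector covering the five factor
    groups of Table 1 (names and units are illustrative assumptions)."""
    # Coincidence matching degree
    control_area: float          # area of the control in N x N pixel units
    overlap_degree: float        # overlap ratio with the other ranked control, 0..1
    click_x: float               # click position inside the N x N unit area
    click_y: float
    position_match: float        # how central the click is within the control, 0..1
    # Importance
    is_third_party: bool         # third-party APP operation type ranks above system type
    # Occurrence duration
    on_screen_seconds: float     # how long the control has been displayed
    # Operation frequency
    clicks_per_day: float        # big-data average daily clicks on this control
    clicks_per_unit_time: float  # e.g. click attempts within 5 s
    # Injury degree
    injury_level: int            # 0 = none .. 3 = severe (volume, brightness, reset)

def as_model_input(f: FeatureFactors) -> list:
    """Flatten the factors into the numeric vector fed to the control selection model."""
    return [f.control_area, f.overlap_degree, f.click_x, f.click_y,
            f.position_match, float(f.is_third_party), f.on_screen_seconds,
            f.clicks_per_day, f.clicks_per_unit_time, float(f.injury_level)]
```

One such vector would be extracted for each control participating in the ranking, e.g. one for the full-screen control and one for the volume control.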
Based on the dimensions given in Table 1, when the server trains on the big-data user-group data using the decision tree ranking algorithm, the training process is as follows:
1. Preprocessing the training data
For example, the screen may be divided according to N×N pixel units; control samples of various system applications and common third-party applications in each unit are then obtained and click training is performed, after which the controls are divided and preprocessed according to the feature factors in Table 1.
It can be understood that the N×N pixel units according to which the screen is divided can be set according to actual service requirements, for example the required training precision, which is not limited in this embodiment.
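The N×N segmentation can be sketched as a simple coordinate-to-cell mapping. The screen size, the value of N, and the return format below are arbitrary example choices, not the patent's actual preprocessing.

```python
def unit_cell(x, y, n, screen_w, screen_h):
    """Map a screen coordinate to its N x N unit cell, plus the click
    position inside that cell (a sketch of the preprocessing step)."""
    cell_w = screen_w / n
    cell_h = screen_h / n
    col = min(int(x // cell_w), n - 1)
    row = min(int(y // cell_h), n - 1)
    # Position of the click inside the small N x N unit area:
    local = (x - col * cell_w, y - row * cell_h)
    return (row, col), local
```

Every training click can then be attributed both to a grid cell and to a within-cell position, matching the "click position within the N×N unit area" feature described above.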
In addition, the click training of the controls may include, for example, injurious clicks, mis-touch clicks, and normal clicks.
A so-called injurious click may be, for example, a forceful click on a control in the screen. A mis-touch click may be, for example: after the first control at the overlapping position responds to the click operation, the user immediately exits the interface that responded and clicks the overlapping position again until the second control responds, which indicates that the previous click was a mis-touch. A normal click is one where, after each click operation is responded to, the user does not return to the previous interface to click again.
It should be understood that the above description is only an example for better understanding of the technical solution of the present embodiment, and is not to be taken as the only limitation of the present embodiment.
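A toy labeling rule in the spirit of the three click categories might look as follows. The pressure threshold and the exit-and-reclick signal are illustrative assumptions, not the patent's actual labeling criteria.

```python
def label_click(pressure, exited_and_reclicked, hard_press_threshold=5.0):
    """Assign one of the three click-training labels described above
    (thresholds and rules are invented for illustration)."""
    if pressure >= hard_press_threshold:
        return "injurious"   # forceful click on a control in the screen
    if exited_and_reclicked:
        return "mis-touch"   # user backed out and re-clicked the same spot
    return "normal"          # response accepted; no return to re-click
```

In a real pipeline the `exited_and_reclicked` signal would be derived from the event log, i.e. whether the user left the responding interface and clicked the overlapping position again.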
2. Training process
Further, after the division according to the 5 types of feature factors in Table 1, the data corresponding to the 5 types of feature factors can be used as input data for training a control selection model (specifically, an initial control selection model) with a decision tree ranking (FBDT) algorithm. Iterative training is performed according to a preset number of iterations and a weight reduction (shrinkage) coefficient, and parameter tuning is performed continuously during each training round until the output control selection model meets the corresponding service requirement.
Then, for each control involved in a click operation, the feature factors corresponding to the click operation, organized according to Table 1, are input into the trained control selection model, so that the click priority and click injury degree corresponding to each control can be obtained.
Illustratively, a higher click priority indicates a stronger user intention to click the control, and a higher click injury degree indicates a higher risk to the user in clicking the control.
Therefore, which control the user intends to click can be determined according to the click priority and the click injury degree, as well as whether clicking the control exposes the user to a certain degree of risk.
As can be seen from the above description, the control selection model is trained by the server. Therefore, to ensure that the first feature factor of the first control and the second feature factor of the second control obtained in step S102 can be analyzed and processed by the target control selection model to obtain the first click priority and first click injury degree of the first control, and the second click priority and second click injury degree of the second control, the electronic device needs to first detect whether the target control selection model exists locally. This detection may occur before the click operation is detected, or after the first and second feature factors are obtained.
Correspondingly, if the model exists locally, step S103 can be executed directly; otherwise, the electronic device needs to obtain the target control selection model from the server.
Specifically, in this embodiment, in order to reduce the training pressure on the server as much as possible while making the trained control selection model suitable for most user groups, the server may first train an initial control selection model based on user behavior data provided by a large population of users, that is, data generated when users operate controls on the screen. The server then pushes the trained initial control selection model to each electronic device that provided user behavior data, such as mobile phone 1, mobile phone 2, and mobile phone 3 in FIG. 18, or pushes it to the corresponding electronic device after receiving a request from that device.
Further, in order to account for individual differences between users, the server may subsequently collect the user behavior data generated by each electronic device while using the initial control selection model, perform optimization training on the initial control selection model according to the different user behavior data, thereby obtaining target control selection models for the different electronic devices, and push each target control selection model to the corresponding electronic device for use, for example as shown in FIG. 19.
That is to say, when the target control selection model does not exist locally on the electronic device, a request for obtaining it needs to be sent to the server. If, before the electronic device sends this request, it has not yet sent the server any user behavior data generated from responses to click operations by target controls determined using the initial control selection model, then the server does not yet hold a target control selection model optimized from the initial model based on that user behavior data; what the server pushes to the electronic device, trained on the behavior data of the large user population, is still the initial control selection model. Therefore, after receiving the initial control selection model sent by the server, the electronic device can only use the initial control selection model as the target control selection model until it obtains a target control selection model pushed by the server.
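The fetch-or-fallback logic described above might be sketched as follows. The cache structure, device identifier, and server lookup are all illustrative assumptions standing in for the real local storage and network request.

```python
def get_selection_model(local_cache, server_models, device_id, initial_model):
    """Sketch of the model acquisition flow: use a locally cached target
    model if present; otherwise request the device-specific target model
    from the server, falling back to the pushed initial model when the
    server has not yet produced an optimized one for this device."""
    if "target" in local_cache:
        return local_cache["target"]
    # No local target model: ask the server. If the server has not yet
    # optimized the initial model with this device's behavior data, the
    # initial model serves as the target model for now.
    model = server_models.get(device_id, initial_model)
    local_cache["target"] = model
    return model
```

On a cache miss the chosen model is stored locally, so subsequent click operations can run step S103 without contacting the server again.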
And step S104, selecting one of the first control and the second control as a target control according to the first click priority, the first click injury degree, the second click priority and the second click injury degree, and controlling the target control to respond to the click operation.
For example, in this embodiment, one of the first control and the second control is specifically selected as the target control according to the first click priority and the second click priority, and the target control is then controlled to respond to the click operation according to the click injury degree corresponding to the target control.
As can be seen from the above description, a higher click priority indicates a stronger click intention of the user. Therefore, in this embodiment, selecting one of the first control and the second control as the target control according to the first click priority and the second click priority is specifically: when the first click priority is higher than the second click priority, selecting the first control as the target control; and when the second click priority is higher than the first click priority, selecting the second control as the target control. That is, the control with the higher click priority is selected as the target control.
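The priority comparison just described is a simple maximum selection; the minimal sketch below illustrates it, with a tie signalled separately (the embodiment later hands ties to the user via a prompt window). All names are illustrative.

```python
def select_target_control(first, second, first_priority, second_priority):
    """Return the control with the higher click priority; None signals a tie,
    which is handled elsewhere by prompting the user."""
    if first_priority > second_priority:
        return first
    if second_priority > first_priority:
        return second
    return None
```
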
In addition, it should be noted that, in an actual application scenario, there may be click operations with a high click injury degree where the control targeted by the click operation is indeed the control that the user wants to click and the current click is a normal click. Even so, in order to avoid the injury that could result from directly responding to the click operation, a prompt may be given to the user according to the service requirement.
That is to say, when the target control is controlled to respond to the click operation according to the click injury degree corresponding to the target control, it may be determined whether the click injury degree corresponding to the target control is lower than an injury degree threshold value.
Correspondingly, when the click injury degree corresponding to the target control is lower than an injury degree threshold value, the target control can be directly controlled to respond to the click operation; otherwise, when the click injury degree corresponding to the target control is not lower than the injury degree threshold value, a prompt can be made to a user to prompt the user whether to execute the click operation on the target control, and when the click operation needs to be executed on the target control, the target control is controlled to respond to the click operation.
For example, the prompt to the user may be a pop-up window to prompt the user whether to perform the click operation on the target control, or a voice to prompt the user whether to perform the click operation on the target control.
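The respond-or-prompt decision described in the preceding paragraphs can be sketched as below, assuming a hypothetical `confirm` callback that stands in for the pop-up window or voice prompt; names and return values are illustrative.

```python
def respond_or_prompt(click_injury, injury_threshold, confirm):
    """Respond directly when the click injury degree is below the threshold;
    otherwise ask the user via `confirm`, a callable returning True/False."""
    if click_injury < injury_threshold:
        return "respond"          # low injury: respond to the click directly
    if confirm():                 # high injury: let the user decide
        return "respond"
    return "cancel"
```
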
This processing mode can be applied to various scenarios. In each scenario, the target control selection model is optimized according to the different controls clicked under different conditions, and the target control, as well as whether the target control responds to the click operation, is then determined according to the optimized target control selection model.
For example, when the method is applied to the scenario shown in fig. 2, the information of the user account currently logged in to the video software may be acquired to determine the age of the user, or the user's portrait may be captured by a front-facing camera to determine the user's age range. If the user is a teenager, the click operation is responded to by starting the teenager protection mode; otherwise, video B is opened.
For example, when the method is applied to the scenario shown in fig. 3, it may be determined, according to the user's evaluation habits with other applications during use of the electronic device, whether the user likes to evaluate applications. If not, the "video B" control is controlled to respond to the click operation, that is, video B is opened; otherwise, the interface jumps to the comment-writing interface.
For example, when the method is applied to the scenario shown in fig. 4, the user's habit of updating applications may be used as a basis for calculating the click priority. For example, if the user updates promptly whenever the video application releases a new version, the click priority of clicking "update immediately" is higher than the click priority of the "video B" control; conversely, the click priority of clicking the "video B" control is higher than that of "update immediately".
For example, when applied to the scenarios shown in fig. 7 to 9, the click priority may be calculated according to the historical usage record of the user and the current mode of the electronic device.
For example, when the electronic device is currently in the meeting mode, the user is likely in a meeting at this time, and therefore there is a high probability that the meeting content needs to be recorded using the memo; that is, the click priority corresponding to the control of the memo application is high.
For another example, according to the historical usage record, the user may habitually turn on the multi-screen collaboration mode at 8:00 and not use it at later times; therefore, when the trigger time of the current click operation is 12:00, the click priority corresponding to the multi-screen collaboration control is low.
For example, when the method is applied to the scenarios shown in fig. 10 and fig. 11, the historical usage habits of the user and the importance of the pop-up content may also be referred to when calculating the click priority, which is not limited in this embodiment.
For example, when the method is applied to the scenarios shown in fig. 12 to fig. 15, the processing manner of the click operation may refer to the scenarios shown in fig. 7 to fig. 9, and is not described herein again.
For example, when applied to the scenario shown in fig. 16, the click priority for clicking different messages may be determined according to the message at the click position and the messages adjacent to the click position, the time corresponding to the click operation, and which user's message the user preferentially views each time; it is then determined whether the message to be clicked is, for example, the message from the user "777" in fig. 16, whose receiving time is 8:00.
It should be understood that the above description is only an example for better understanding of the technical solution of the present embodiment, and is not to be taken as the only limitation of the present embodiment.
Therefore, according to the method provided by this embodiment of the application, when the control acted on by one click operation is not unique, that is, when controls overlap at the click position corresponding to the click operation, the coincidence matching degree factor, importance degree factor, appearance duration factor, operation frequency factor, and injury degree factor of each overlapped control are analyzed and processed by the target control selection model obtained by training with a decision-tree ranking algorithm, so as to determine the click priority and click injury degree of each control acted on by the current click operation. Through these multi-feature factors, accurate judgment of which of the different controls is intended to be clicked under different conditions is realized. The control that the current click operation actually needs to act on is finally determined according to the click priority and click injury degree of each control, and the determined control responds to the current click operation, effectively avoiding the adverse effect caused by misoperation and improving the user experience.
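The end-to-end flow summarized above — score each overlapped control, pick the highest click priority, then gate the response on the click injury degree — can be sketched as follows. `handle_click`, the `model` callable, and all other names are illustrative assumptions, not the patent's implementation.

```python
def handle_click(controls, features, model, injury_threshold, confirm):
    """Decide which overlapped control responds to the current click.

    `model` maps a control's feature factors to (click_priority, click_injury);
    `confirm` stands in for the user prompt shown when the injury is high.
    """
    # 1. Score every overlapped control with the selection model.
    scored = [(name, *model(features[name])) for name in controls]
    # 2. Choose the control with the highest click priority.
    target, priority, injury = max(scored, key=lambda item: item[1])
    # 3. Gate the response on the click injury degree.
    if injury < injury_threshold or confirm(target):
        return target    # the target control responds to the click
    return None          # the user cancelled the operation
```
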
For better understanding, the following describes, with reference to fig. 20 to 24, processing of a click operation performed by a user by taking the first control as a volume control and the second control as a full-screen control as an example.
For example, referring to fig. 20 (1), icons of a plurality of applications, such as a clock icon, a calendar icon, a gallery icon, a memo icon, a file management icon, an email icon, a music icon, a calculator icon, a video icon, an exercise health icon, a weather icon, a browser icon, a setting icon, a recorder icon, and the like, are displayed in a display interface of an electronic device, such as a mobile phone, and this is not limited herein.
For example, when the user clicks the video icon in fig. 20 (1), the mobile phone responds to the user's click operation and switches the content displayed in the display interface to that shown in fig. 20 (2).
For example, referring to fig. 20 (2), multimedia options that can be selected by the user, such as a video a option, a video B option, a video C option, a video D option, a video E option, a video F option, a video G option, etc., are displayed in the display interface of the mobile phone, which is not illustrated here one by one, and this embodiment is not limited thereto.
For example, when the user clicks the video G option in fig. 20 (2), the mobile phone responds to the click operation of the user, and the display interface displays a multimedia picture corresponding to the video G option, for example, as shown in fig. 20 (3).
Illustratively, referring to fig. 20 (3), after the user clicks on the multimedia screen 10, a full screen control 20 is also displayed in the display interface, for example, as shown in fig. 20 (4).
It can be understood that, in an actual application scenario, after a user clicks a multimedia screen, a play progress bar control, a pause play control, a return control, and the like may also be displayed in the display interface, which is not listed one by one, but this embodiment is not limited thereto.
That is, when receiving a click operation of a user on an application providing a multimedia screen, the mobile phone displays a multimedia option in the display interface. And because the displayed multimedia options correspond to the multimedia pictures, the display interface plays the corresponding multimedia pictures when the multimedia options are clicked. That is, when the click operation on the multimedia option is received, the multimedia picture corresponding to the multimedia option is displayed in the display interface, and when the click operation on the multimedia picture is received, the full-screen control is displayed in the display interface.
Further, when the content displayed on the display interface of the mobile phone is as shown in fig. 20 (4), if the user presses a volume key on the side of the mobile phone, or slides up or down with a specific gesture, for example a finger pressed on a certain area of the display interface displaying the multimedia picture, a volume adjustment operation is triggered.
For example, referring to fig. 21, after the mobile phone receives a volume adjustment operation triggered by a user, a volume control 30 is displayed in the display interface.
Specifically, in the scenario to which the technical solution provided in this embodiment applies, the first portion 30-1 of the volume control 30 partially overlaps the full-screen control 20.
Correspondingly, while the volume control and the full-screen control are displayed simultaneously, when a click operation of the user is detected, the click position corresponding to the click operation needs to be determined. Then, when the click position is located in the first portion 30-1, that is, the overlapping area of the volume control 30 and the full-screen control 20, the click priority of the full-screen control 20 and the click priority of the volume control 30 are determined using the target control selection model in the click-operation processing scheme provided by this embodiment of the application, and the control with the higher click priority is selected to respond to the click operation.
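The first step above is a plain geometric hit test: compute the overlap rectangle of the two controls and check whether the click position falls inside it. The sketch below assumes rectangles given as `(left, top, right, bottom)` tuples; all names are illustrative.

```python
def intersection(a, b):
    """Overlap rectangle of two (left, top, right, bottom) rects, or None."""
    left, top = max(a[0], b[0]), max(a[1], b[1])
    right, bottom = min(a[2], b[2]), min(a[3], b[3])
    if left < right and top < bottom:
        return (left, top, right, bottom)
    return None

def click_in_overlap(x, y, volume_rect, fullscreen_rect):
    """True when the click position lies in the area where the volume control
    and the full-screen control coincide (the first portion 30-1)."""
    overlap = intersection(volume_rect, fullscreen_rect)
    return (overlap is not None
            and overlap[0] <= x < overlap[2]
            and overlap[1] <= y < overlap[3])
```
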
For example, referring to fig. 21, an overlapping volume control and full-screen control exist at the click position corresponding to the user's click operation. The feature factors corresponding to the volume control and the feature factors corresponding to the full-screen control are obtained according to the types of feature factors given in table 1, and each set of feature factors is then input into a control selection model (which may be the initial control selection model or the target control selection model), so as to obtain the click priority and click injury degree corresponding to the volume control, and the click priority and click injury degree corresponding to the full-screen control.
Illustratively, when the click priority of the volume control is higher than the click priority of the full-screen control, the volume value is adjusted to the volume value corresponding to the first portion, and the volume bar corresponding to the adjusted volume control 30 is shown in fig. 22 (1).
Illustratively, when the click priority of the full-screen control is higher than the click priority of the volume control, the multimedia picture is played in full screen on the display interface, that is, the display interface becomes as shown in fig. 22 (2).
Illustratively, when the click priority of the volume control is equal to the click priority of the full-screen control, a first prompt window pops up on the display interface, such as the window 40 popped up in the display interface in fig. 23.
Illustratively, the first prompt window 40 may include a first option and a second option. Wherein the first option corresponds to the volume control and the second option corresponds to the full screen control.
With continued reference to FIG. 23, the first option may be directly the volume option 40-1 and the second option may be directly the full screen option 40-2.
Further, after the first prompt window 40 is popped up from the display interface, the mobile phone needs to continue to monitor the click operation of the user.
Accordingly, when the clicking operation of the user on the first option, namely the volume option 40-1 in fig. 23, is monitored, the volume value is adjusted to the volume value corresponding to the first part, namely, the volume bar corresponding to the adjusted volume control 30 is as shown in fig. 22 (1); when the clicking operation of the user on the second option, namely the full screen option 40-2 in fig. 23, is monitored, the multimedia picture is played in the display interface in full screen, namely the display interface becomes as shown in fig. 22 (2).
In addition, it should be noted that if the control the user actually intends to click is determined, by comparing the click priorities corresponding to the two controls, to be the volume control, suddenly increasing the volume may cause discomfort to the user and affect the user experience. Therefore, it is also necessary to determine whether the click injury degree of the volume control is lower than the injury degree threshold.
Specifically, in the example of fig. 21, the click injury degree may be divided according to the volume. For example, the volume range supported by the electronic device, 0 to 100, is divided into 10 levels in units of 10: the injury degree corresponding to 0 to 10 is 1, 11 to 20 is 2, 21 to 30 is 3, 31 to 40 is 4, 41 to 50 is 5, 51 to 60 is 6, 61 to 70 is 7, 71 to 80 is 8, 81 to 90 is 9, and 91 to 100 is 10. The injury degree threshold is then the value of the interval that keeps the user's hearing comfortable in the current environment.
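The 10-unit banding above maps directly to a small function; this sketch assumes the same 0-100 volume range and 1-10 injury levels (the function name is illustrative).

```python
def injury_degree(volume):
    """Map a volume value in 0..100 to a click injury degree of 1..10,
    following the bands above: 0-10 -> 1, 11-20 -> 2, ..., 91-100 -> 10."""
    return max(1, min(10, (volume + 9) // 10))
```
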
That is to say, in the process of determining the injury degree threshold corresponding to the volume control, the volume value of the environment in which the electronic device is currently located (hereinafter referred to as the first volume value) needs to be obtained first, and the injury degree threshold is then determined according to the first volume value. For example, if the volume interval in which the user's hearing is comfortable in the current environment reaches 60, the corresponding injury degree is 6, that is, the injury degree threshold is 6.
With reference to fig. 21, in an actual application scenario, since the click position is located at the topmost end of the volume control, the electronic device would adjust the volume to the maximum value in response to the click operation; for an electronic device with a maximum volume value of 100, the corresponding click injury degree under the above division is 10. In this way, with an injury degree threshold of 6 and a click injury degree of 10, the click injury degree is significantly higher than the injury degree threshold, and in order to avoid causing injury to the user, a prompt is given to the user according to a preset prompt mode, for example a prompt window pops up in the current interface, such as the window in fig. 24.
That is, when the click injury degree of the volume control is not lower than the injury degree threshold, a second prompt window 50 pops up on the display interface.
Referring to fig. 24, the second prompt window 50 includes prompt information, such as "The current environment is not suitable for maximizing the volume. Continue?" in fig. 24, a third option, such as the continue option 50-1 in fig. 24, and a fourth option, such as the stop option 50-2 in fig. 24.
As shown in fig. 24, the prompt information is used to prompt the user whether to execute the step of adjusting the volume value to the volume value corresponding to the first portion, the third option corresponds to the volume control, and the fourth option is used to cancel the first click operation.
Accordingly, if the user has clicked the continuation option 50-1, the volume is adjusted to the highest, e.g., 100, in response to the user's click operation to adjust the volume; if the user clicks the stop option 50-2, the volume is not adjusted and the electronic device continues to play the video content at the current volume.
Further, in an actual application scenario, in order to balance user experience against minimizing user intervention, before prompting the user when the click injury degree is not lower than the injury degree threshold, the current volume value of the electronic device (hereinafter referred to as the second volume value) may first be obtained. The volume value the electronic device would have if the target control responded to the click operation (hereinafter referred to as the third volume value), that is, the adjusted volume value, is then determined, and it is determined whether the difference between the second volume value and the third volume value is smaller than a volume threshold.
Correspondingly, if the difference is smaller than the volume threshold, the adjustment range from the second volume value to the third volume value is not large and the influence on the user is small; in this case, the target control may be controlled to respond to the click operation directly, without prompting the user. Conversely, if the difference is not smaller than the volume threshold, the adjustment range is large and would greatly affect the user experience; in this case, a prompt needs to be given to the user, and the user decides whether to continue to adjust the volume.
That is, even if the click injury is not lower than the injury threshold, since the volume difference before and after adjustment is smaller than the volume threshold, the user may not be prompted, and the target control is directly controlled to respond to the current click operation.
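The combined gate described above — prompt only when the injury degree reaches the threshold *and* the volume jump is large — can be sketched as follows; names and the keyword interface are illustrative.

```python
def should_prompt(click_injury, injury_threshold,
                  current_volume, adjusted_volume, volume_threshold):
    """Decide whether to interrupt the user with a prompt window.

    Below the injury threshold the click is answered silently; at or above
    it, a small volume difference (second vs. third volume value) still
    lets the click through without a prompt.
    """
    if click_injury < injury_threshold:
        return False
    return abs(adjusted_volume - current_volume) >= volume_threshold
```
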
It should be understood that the above description is only an example for better understanding of the technical solution of the present embodiment, and is not intended to limit the present embodiment.
As can be seen from the above description, the processing method of the click operation provided in the present embodiment is suitable for the following scenarios.
Illustratively, when a first interface, such as the above-mentioned multimedia screen, is displayed in a non-full-screen playing manner, that is, in a small window, in the display interface of an electronic device such as the above-mentioned mobile phone, and a first operation, such as a click/touch operation, on the multimedia screen is received/detected, a full-screen control is displayed in the multimedia screen.
It can be understood that the full-screen control in this embodiment is a control for switching a multimedia frame played in a non-full-screen mode to a full-screen mode.
In addition, it can be understood that, regarding the display of the first interface, for example, after a user performs a click operation on an application provided in the display interface of the electronic device for playing the multimedia screen, the electronic device may display, for example, an interface shown in (2) in fig. 20 in response to the click operation, which may be referred to as a second interface for distinction.
For example, in some implementations, at least one multimedia option, such as the 7 multimedia options of video a through video G shown in (2) of fig. 20, can be displayed in the second interface.
Taking the example that the multimedia picture displayed in the first interface in the non-full-screen playing manner is the video G shown in (2) in fig. 20, after the user clicks the multimedia option corresponding to the video G in the second interface, the electronic device responds to the clicking operation, and displays the first interface in the display interface in the non-full-screen playing manner, such as the interface shown in (3) in fig. 20.
Illustratively, after displaying a full screen control, such as the full screen control 20 mentioned above, in the multimedia screen, when receiving a pressing operation of a volume key located on a side frame of the electronic device by a user, the electronic device may display a volume control, such as the volume control 30 mentioned above, in the current interface for adjusting the volume of the played multimedia in response to the pressing operation.
For example, in some implementations, the positions of the full-screen control and the volume control may overlap, such that the full-screen control is partially or fully obscured by the volume control. In this embodiment, the case in which the full-screen control is partially obscured by the volume control, as shown in fig. 21, is taken as an example.
For example, as shown in fig. 21, if, within the time the full-screen control is displayed, such as 2 seconds, the user clicks a first portion of the volume control, such as the 30-1 area shown in fig. 21 (this click is hereinafter referred to as a second operation, for distinction), the electronic device may play the multimedia screen in full screen in response to the operation applied to the 30-1 area, as shown in (2) in fig. 22.
Specifically, whether the electronic device plays the multimedia picture in full screen or adjusts the volume in response to the operation applied to the 30-1 area may be decided by comparing the click priority of the volume control with the click priority of the full-screen control.
Correspondingly, when the click priority of the full screen control is higher than the click priority of the volume control, the multimedia picture is played in full screen, as shown in (2) in fig. 22; when the click priority of the volume control is higher than that of the full screen control, the volume is adjusted to the maximum volume value, as shown in (1) in fig. 22.
The click priorities of the volume control and the full-screen control may be determined, for example, by respectively obtaining the feature factors of the volume control and the feature factors of the full-screen control, processing them through the target control selection model, and thereby determining the click priority corresponding to the volume control and the click priority corresponding to the full-screen control.
Illustratively, the feature factors mentioned in this embodiment include a coincidence matching degree factor, an importance degree factor, an appearance duration factor, an operation frequency factor, and an injury degree factor. For specific descriptions of these factors and implementation details of determining the click priority of a control from them, see above; details are not repeated here.
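As a loose illustration of how the five feature factors above might feed a scoring function, the sketch below combines them with hand-picked weights. The patent itself trains a decision-tree ranking model; this linear stand-in does not reproduce that model, and the feature names and weights are assumptions for illustration only.

```python
# Hypothetical feature-factor names, mirroring the five factors listed above.
FEATURES = ("coincidence_match", "importance", "appearance_duration",
            "operation_frequency", "injury")

def click_priority(factors, weights=None):
    """Score one control from its feature factors; higher means a stronger
    click intention. Defaults to equal weights when none are supplied."""
    weights = weights or {name: 1.0 for name in FEATURES}
    return sum(weights[name] * factors.get(name, 0.0) for name in FEATURES)
```
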
In addition, it should be noted that in other implementation manners, besides pressing a volume key on the side frame of the electronic device, a third operation may be performed on the multimedia screen to trigger the electronic device to display the volume control in the first interface.
Illustratively, in some implementations, the third operation is, for example, a sliding operation is performed on the first area of the multimedia screen, i.e., a specific area.
The sliding operation is, for example, the above-mentioned upward sliding or downward sliding, and the present embodiment does not limit this.
In addition, it should be further noted that, in other implementation manners, if the click priority of the volume control is equal to the click priority of the full-screen control after the click priorities of the volume control and the full-screen control are determined according to the foregoing, in order to ensure the user experience and avoid a response error, the decision may be handed to the user, for example, a first prompt window may pop up on the display interface, as shown in fig. 23 as 40.
Taking the first prompt window 40 shown in FIG. 23 as an example, the exemplary first prompt window may include a first option corresponding to a volume control, such as 40-1, and a second option corresponding to a full screen control, such as 40-2. After the user clicks the first option, responding to the clicking operation acted on the first option, and adjusting the volume to the maximum volume value; and after the user clicks the second option, responding to the click operation acted on the second option, and playing the multimedia picture in a full screen mode.
It should be understood that the above description is only an example for better understanding of the technical solution of the present embodiment, and is not to be taken as the only limitation of the present embodiment.
In addition, it should be noted that, considering the inconvenience that adjusting the volume to the maximum volume value may cause the user, the click injury degree of the volume control may be further determined before the volume is adjusted to the maximum volume value; for the manner of determining the click injury degree, reference may be made to the above, and details are not repeated here.
Correspondingly, when the click injury degree of the volume control is lower than the injury degree threshold value, the step of adjusting the volume to the maximum volume value is executed; when the click injury level of the volume control is not below the injury threshold, a second prompt window pops up in the display interface, as shown at 50 in fig. 24.
Taking the example second prompt window 50 shown in FIG. 24 as an example, the example second prompt window may include prompt information, a third option, such as 50-1, and a fourth option, such as 50-2. The prompt information is used for prompting the user whether to execute the step of adjusting the volume to the maximum volume value, the third option corresponds to the volume control, and the fourth option is used for canceling the click operation.
Illustratively, when the user clicks the third option, the volume is adjusted to the maximum volume value in response to a click operation applied to the third option; and after the user clicks the fourth option, responding to the click operation acted on the fourth option, and continuously displaying the multimedia picture in a non-full-screen playing mode on the display interface.
It should be understood that the above description is only an example for better understanding of the technical solution of the present embodiment, and is not to be taken as the only limitation of the present embodiment.
In addition, it should be noted that, within a preset time after the full-screen control is displayed, if the second operation that acts on the first portion of the volume control is not received, the displayed full-screen control is cancelled after the preset time, and the multimedia picture is continuously displayed on the display interface in a non-full-screen playing manner, that is, the interface shown in (1) in fig. 22 is restored to the interface shown in (3) in fig. 20.
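The timeout behaviour above — dismiss the full-screen control when no second operation arrives within the preset time, and revert to non-full-screen playback — can be sketched as follows. Timestamps are injected so the logic stays testable; the class and its return values are illustrative, not from the source.

```python
class FullScreenControl:
    """Hypothetical sketch of the full-screen control's display lifetime."""

    def __init__(self, shown_at, preset=2.0):
        self.shown_at = shown_at   # time the control was displayed (seconds)
        self.preset = preset       # preset display time, e.g. 2 seconds
        self.visible = True

    def tick(self, now, second_op_received=False):
        if second_op_received:
            return "play_full_screen"       # second operation within the window
        if now - self.shown_at >= self.preset:
            self.visible = False            # cancel the displayed control and
            return "dismissed"              # keep non-full-screen playback
        return "waiting"
```
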
Therefore, according to the processing method for the click operation provided by this embodiment, when the full-screen control and the volume control are displayed simultaneously on the display interface, even if the user clicks the overlapping area of the two controls, the electronic device responds to the click operation by acting on the control with the higher click priority, for example playing the multimedia picture in full screen, instead of assuming that the user wants to adjust the volume, thereby avoiding a false response.
In addition, an embodiment of the present application further provides a computer-readable storage medium, where a computer instruction is stored in the computer storage medium, and when the computer instruction runs on an electronic device, the electronic device is enabled to execute the relevant method steps to implement the processing method of the click operation in the foregoing embodiment; when the computer instructions are run on the server, the server is caused to execute the relevant method steps to implement the method for training and optimizing the control selection model in the above embodiment.
In addition, embodiments of the present application also provide a chip (which may also be a component or a module), which may include one or more processing circuits and one or more transceiver pins; the receiving pin and the processing circuit are communicated with each other through an internal connection path, and the processing circuit executes the relevant method steps to realize the processing method of the click operation in the embodiment or train and optimize the method of the control selection model so as to control the receiving pin to receive signals and control the sending pin to send signals.
In addition, an embodiment of the present application further provides a system for processing a click operation, where an electronic device in the system is configured to execute the above related steps to implement the method for processing a click operation in the above embodiment, and a server in the system is configured to execute the above related method steps to implement the method for training and optimizing a control selection model in the above embodiment.
In addition, as can be seen from the above description, the electronic device, the server, the computer readable storage medium, or the chip provided in the embodiments of the present application are all configured to execute the corresponding method provided above, and therefore, the beneficial effects achieved by the electronic device, the server, the computer readable storage medium, or the chip may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
Finally, it should be understood that the above embodiments are intended only to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and such modifications or replacements do not depart from the essence of the technical solutions of the embodiments of the present application.

Claims (14)

1. A method for processing a click operation, applied to an electronic device having a display interface, the method comprising:
displaying a first interface in the display interface in a non-full-screen playing mode, wherein the first interface comprises a multimedia picture;
in response to a first operation acting on the multimedia picture, displaying a full-screen control in the multimedia picture;
receiving a pressing operation on a volume key, wherein the volume key is located on a side frame of the electronic device;
displaying a volume control in response to the pressing operation, wherein the full-screen control is obscured by the volume control; and
in response to a second operation acting on a first portion of the volume control within a preset time after the full-screen control is displayed, playing the multimedia picture in full screen, wherein the first portion of the volume control is the portion that obscures the full-screen control.
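For illustration only, the dispatch logic of claim 1 can be sketched in code: a click landing on the region where the volume control overlaps the hidden full-screen control is routed to the full-screen action, but only within the preset window after that control appeared. The rectangle-based hit test, the 3-second window, and all names below are assumptions not found in the claim, which specifies only a "preset time" and a "first portion" of the volume control.

```python
import time

# Hypothetical sketch of claim 1's dispatch: a click on the overlap region is
# routed to the obscured full-screen control only within a preset window after
# that control was displayed. The window length and structures are assumed.

PRESET_WINDOW_S = 3.0  # assumed value; the claim only says "preset time"

def in_rect(point, rect):
    """True if point (x, y) lies inside rect (left, top, right, bottom)."""
    x, y = point
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom

def dispatch_click(point, volume_rect, full_screen_rect, shown_at, now=None):
    now = time.monotonic() if now is None else now
    overlap_hit = in_rect(point, volume_rect) and in_rect(point, full_screen_rect)
    if overlap_hit and (now - shown_at) <= PRESET_WINDOW_S:
        return "play_full_screen"   # second operation on the first portion
    if in_rect(point, volume_rect):
        return "adjust_volume"      # ordinary click on the volume control
    return "ignore"
```

After the window expires, the same click on the overlap region falls through to the visible volume control, which matches the timeout behavior of claim 12.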
2. The method according to claim 1, wherein displaying the first interface in the display interface in the non-full-screen playing mode comprises:
receiving, in the display interface, a click operation on an application that provides the multimedia picture;
in response to the click operation, displaying a second interface, wherein the second interface comprises at least one multimedia option corresponding to the multimedia picture; and
in response to a click operation acting on the at least one multimedia option, displaying the first interface in the display interface in the non-full-screen playing mode.
3. The method according to claim 1, wherein after displaying the full-screen control in the multimedia picture, the method further comprises:
receiving a third operation on the multimedia picture; and
in response to the third operation, displaying the volume control.
4. The method according to claim 3, wherein the third operation is a slide operation on a first region of the multimedia picture.
5. The method according to claim 1, wherein playing the multimedia picture in full screen in response to the second operation acting on the first portion of the volume control comprises:
determining a click priority of the volume control and a click priority of the full-screen control; and
when the click priority of the full-screen control is higher than the click priority of the volume control, playing the multimedia picture in full screen.
6. The method according to claim 5, further comprising:
when the click priority of the volume control is higher than the click priority of the full-screen control, adjusting the volume to a maximum volume value.
7. The method according to claim 6, further comprising:
when the click priority of the volume control is equal to the click priority of the full-screen control, popping up a first prompt window on the display interface, wherein the first prompt window comprises a first option and a second option, the first option corresponds to the volume control, and the second option corresponds to the full-screen control.
8. The method according to claim 7, wherein after the first prompt window pops up on the display interface, the method further comprises:
when a click operation on the first option is received, adjusting the volume to the maximum volume value; and
when a click operation on the second option is received, playing the multimedia picture in full screen.
9. The method according to claim 6, wherein when the click priority of the volume control is higher than the click priority of the full-screen control, before adjusting the volume to the maximum volume value, the method further comprises:
determining a click damage degree of the volume control;
when the click damage degree of the volume control is lower than a damage degree threshold, executing the step of adjusting the volume to the maximum volume value; and
when the click damage degree of the volume control is not lower than the damage degree threshold, popping up a second prompt window on the display interface, wherein the second prompt window comprises prompt information, a third option and a fourth option, the prompt information is used to ask the user whether to execute the step of adjusting the volume to the maximum volume value, the third option corresponds to the volume control, and the fourth option is used to cancel the click operation.
10. The method according to claim 9, wherein after the second prompt window pops up on the display interface, the method further comprises:
when a click operation on the third option is received, adjusting the volume to the maximum volume value; and
when a click operation on the fourth option is received, continuing to display the multimedia picture on the display interface in the non-full-screen playing mode.
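The gate described in claims 9 and 10 can be illustrated with a minimal sketch: the volume is only adjusted automatically when the click damage degree is below the threshold, and otherwise the user is prompted. The 0..1 damage scale, the 0.5 threshold, and the callback modeling the prompt window are all invented for illustration; the claims leave these unspecified.

```python
# Hypothetical sketch of claims 9-10: auto-adjust the volume only when the
# click damage degree of the volume control is below a threshold; otherwise a
# second prompt window asks the user. Scale and threshold are assumptions.

DAMAGE_THRESHOLD = 0.5  # assumed value for the claims' "damage degree threshold"

def handle_volume_click(damage_degree, confirm):
    """confirm() stands in for the second prompt window: True models a click
    on the third option (proceed), False a click on the fourth (cancel)."""
    if damage_degree < DAMAGE_THRESHOLD:    # claim 9: low damage, proceed
        return "set_max_volume"
    if confirm():                           # claim 10: third option chosen
        return "set_max_volume"
    return "keep_non_full_screen_display"   # claim 10: fourth option cancels
```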
11. The method according to claim 5, wherein determining the click priority of the volume control and the click priority of the full-screen control comprises:
obtaining feature factors of the volume control and feature factors of the full-screen control, respectively, wherein the feature factors comprise a coincidence matching degree factor, an importance degree factor, an appearance duration factor, an operation frequency factor and a damage degree factor; and
processing the feature factors of the volume control and of the full-screen control through a target control selection model, and determining the click priority corresponding to the volume control and the click priority corresponding to the full-screen control.
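Claim 11's feature-factor scoring can be sketched as follows: each overlapped control gets a click priority computed from its five factors, and claims 5 and 6 compare the two priorities. The linear model and every weight and factor value below are assumptions; the patent names the factors but does not disclose the form of the target control selection model.

```python
# Hypothetical sketch of claim 11: score each control from the five feature
# factors named in the claim and compare per claims 5-6. Weights, values, and
# the weighted-sum model itself are illustrative assumptions.

FACTORS = ("coincidence_matching", "importance", "appearance_duration",
           "operation_frequency", "damage")

def click_priority(factors, weights):
    """Weighted sum over the five feature factors named in claim 11."""
    return sum(weights[f] * factors[f] for f in FACTORS)

# Illustrative weights: a higher damage degree lowers a control's priority.
weights = {"coincidence_matching": 0.3, "importance": 0.3,
           "appearance_duration": 0.1, "operation_frequency": 0.2,
           "damage": -0.1}

volume = {"coincidence_matching": 0.8, "importance": 0.4,
          "appearance_duration": 0.9, "operation_frequency": 0.7,
          "damage": 0.6}
full_screen = {"coincidence_matching": 0.8, "importance": 0.9,
               "appearance_duration": 0.2, "operation_frequency": 0.5,
               "damage": 0.1}

# Claims 5-6: the higher-priority control responds to the click.
winner = ("full-screen" if click_priority(full_screen, weights)
          > click_priority(volume, weights) else "volume")
```

In practice the model could equally be a trained classifier (the description mentions training and optimizing a control selection model on a server); the weighted sum is only the simplest instance of mapping the five factors to a priority.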
12. The method according to claim 1, further comprising:
if no second operation acting on the first portion of the volume control is received within the preset time after the full-screen control is displayed, canceling the displayed full-screen control after the preset time elapses, and continuing to display the multimedia picture on the display interface in the non-full-screen playing mode.
13. An electronic device, comprising a memory and a processor coupled to each other, wherein the memory stores program instructions that, when executed by the processor, cause the electronic device to perform the method for processing a click operation according to any one of claims 1 to 12.
14. A computer-readable storage medium comprising a computer program, wherein, when the computer program runs on an electronic device, the electronic device is caused to execute the method for processing a click operation according to any one of claims 1 to 12.
CN202211163803.XA (published as CN115469787A, Pending); priority date 2021-11-05; filing date 2021-11-05; title: Processing method and device for clicking operation and storage medium

Priority Applications (1)

- CN202211163803.XA (CN115469787A); priority date 2021-11-05; filing date 2021-11-05; title: Processing method and device for clicking operation and storage medium

Applications Claiming Priority (2)

- CN202211163803.XA (CN115469787A); priority date 2021-11-05; filing date 2021-11-05; title: Processing method and device for clicking operation and storage medium
- CN202111308242.3A (CN114168046B); priority date 2021-11-05; filing date 2021-11-05; title: Click operation processing method, device and storage medium

Related Parent Applications (1)

- CN202111308242.3A (CN114168046B, division parent); priority date 2021-11-05; filing date 2021-11-05; title: Click operation processing method, device and storage medium

Publications (1)

- CN115469787A, published 2022-12-13

Family

ID=80478107

Family Applications (2)

- CN202111308242.3A (published as CN114168046B, Active); priority date 2021-11-05; filing date 2021-11-05; title: Click operation processing method, device and storage medium
- CN202211163803.XA (published as CN115469787A, Pending); priority date 2021-11-05; filing date 2021-11-05; title: Processing method and device for clicking operation and storage medium

Family Applications Before (1)

- CN202111308242.3A (published as CN114168046B); priority date 2021-11-05; filing date 2021-11-05; title: Click operation processing method, device and storage medium

Country Status (1)

- CN (2 publications): CN114168046B; CN115469787A

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115097980B (en) * 2022-08-24 2022-12-02 成都智暄科技有限责任公司 Small-area overlapping transparent control selection method

Citations (6)

Publication number Priority date Publication date Assignee Title
CN103179277A (en) * 2013-03-21 2013-06-26 广东欧珀移动通信有限公司 Switching method and device for audio/video play mode of mobile phone
CN105657564A (en) * 2015-12-30 2016-06-08 广东欧珀移动通信有限公司 Video processing method and video processing system for browser
CN106303655A (en) * 2016-08-17 2017-01-04 珠海市魅族科技有限公司 A kind of media content play cuing method and device
CN108379839A (en) * 2018-03-23 2018-08-10 网易(杭州)网络有限公司 Response method, device and the terminal of control
CN109814797A (en) * 2019-01-16 2019-05-28 努比亚技术有限公司 Touch-control control method and mobile terminal, computer readable storage medium
CN111338597A (en) * 2020-02-24 2020-06-26 维沃移动通信有限公司 Display method and electronic equipment

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
CN105653558A (en) * 2014-11-28 2016-06-08 阿里巴巴集团控股有限公司 Method for function recommendation of user terminal and device of same
CN107102802A (en) * 2017-04-19 2017-08-29 网易(杭州)网络有限公司 Overlay target system of selection and device, storage medium, electronic equipment
CN109002339A (en) * 2018-07-04 2018-12-14 Oppo广东移动通信有限公司 touch operation method, device, storage medium and electronic equipment
CN109388458A (en) * 2018-09-26 2019-02-26 深圳壹账通智能科技有限公司 Management method, terminal device and the computer readable storage medium of interface control
CN109495552A (en) * 2018-10-31 2019-03-19 北京字节跳动网络技术有限公司 Method and apparatus for updating clicking rate prediction model
CN111880704B (en) * 2020-07-20 2022-02-18 北京百度网讯科技有限公司 Application program processing method, device, equipment and medium
CN113032082B (en) * 2021-04-19 2023-08-11 北京新三优秀科技有限公司 Control display method, electronic equipment and computer readable storage medium


Also Published As

- CN114168046A, published 2022-03-11
- CN114168046B, published 2022-10-14


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination