CN108717325B - Operation gesture setting method and device and mobile terminal - Google Patents

Operation gesture setting method and device and mobile terminal

Info

Publication number
CN108717325B
Authority
CN
China
Prior art keywords
area
setting
pressing
user
operation gesture
Prior art date
Legal status
Active
Application number
CN201810350865.9A
Other languages
Chinese (zh)
Other versions
CN108717325A (en)
Inventor
黄千洋
段丽霞
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810350865.9A
Publication of CN108717325A
Priority to PCT/CN2019/081526
Application granted
Publication of CN108717325B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The embodiment of the invention discloses an operation gesture setting method and device and a mobile terminal. The method comprises the following steps: displaying a setting area and a response area on an operation gesture setting interface; acquiring a setting instruction input by a user in the setting area; and responding to the setting instruction by displaying, in the response area, first feedback information corresponding to the setting instruction, wherein different setting instructions correspond to different first feedback information. By giving the user real-time feedback during the setting process, the method, the device and the mobile terminal let the user see intuitively whether an operation gesture suits their own use, which markedly improves setting efficiency and user experience.

Description

Operation gesture setting method and device and mobile terminal
Technical Field
The invention relates to the technical field of mobile terminals, in particular to an operation gesture setting method and device and a mobile terminal.
Background
Mobile terminals, such as mobile phones, have become one of the most common consumer electronics products in daily life. Many gesture operations on a mobile terminal currently require the user to preset the operation gesture. However, the system gives no real-time feedback during setting, so the user cannot judge whether the setting is appropriate, which degrades the user experience.
Disclosure of Invention
In view of the above problems, the present invention provides an operation gesture setting method, an operation gesture setting device and a mobile terminal, so as to provide a more efficient operation gesture setting manner and improve user experience.
In a first aspect, an embodiment of the present invention provides an operation gesture setting method, where the method includes: displaying a setting area and a response area on an operation gesture setting interface; acquiring a setting instruction input by a user in the setting area; responding to the setting instruction, and displaying first feedback information corresponding to the setting instruction in the response area, wherein different setting instructions correspond to different first feedback information.
In a second aspect, an embodiment of the present invention provides an apparatus for setting an operation gesture, where the apparatus includes: the display module is used for displaying an operation gesture setting interface, and displaying a setting area and a response area on the operation gesture setting interface; the first acquisition module is used for acquiring a setting instruction input by a user in the setting area; and the processing module is used for responding to the setting instruction and displaying first feedback information corresponding to the setting instruction in the response area.
In a third aspect, an embodiment of the present invention provides a mobile terminal, where the mobile terminal includes: a touch screen, a memory, and a processor, the touch screen and the memory being coupled to the processor, and the memory storing instructions that, when executed by the processor, cause the processor to perform the method described above.
In a fourth aspect, the present invention also provides a computer readable storage medium having a program code executable by a processor, where the program code causes the processor to execute the above method.
Compared with the prior art, the operation gesture setting method, the operation gesture setting device and the mobile terminal provided by the invention give the user real-time feedback during the operation gesture setting process, so that the user can intuitively judge whether the operation gesture is appropriate, which markedly improves both setting efficiency and user experience.
These and other aspects of the invention are apparent from and will be elucidated with reference to the embodiments described hereinafter.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 shows a flowchart of an operation gesture setting method proposed by a first embodiment of the present invention;
FIG. 2 is a flow chart of an operation gesture setting method according to a second embodiment of the present invention;
FIG. 3 is a schematic diagram of an interface of an operation gesture setting method according to a second embodiment of the present invention;
FIG. 4 is a flowchart of an operation gesture setting method according to a third embodiment of the present invention;
FIG. 5 is an interface diagram illustrating an operation gesture setting method according to a third embodiment of the present invention;
fig. 6 is a block diagram showing a configuration of an operation gesture setting apparatus according to a fourth embodiment of the present invention;
fig. 7 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application;
fig. 8 is a block diagram of a mobile terminal for performing an operation gesture setting method according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
At present, in the process of presetting the operation gesture of the mobile terminal by a user, a system does not give real-time feedback, the user cannot judge whether the setting is proper, and the user experience is low. Therefore, the inventor provides an operation gesture setting method and device and a mobile terminal in the embodiment of the invention. Embodiments of the present invention will be described in detail below with reference to the drawings, wherein the terms "first," "second," and "third" are used merely for distinguishing between the descriptions and are not intended to imply relative importance.
First embodiment
Referring to fig. 1, a first embodiment of the present invention provides an operation gesture setting method, which is applied to a mobile terminal to perform the corresponding operation gesture setting. In the embodiment of the present invention, the operation gesture may include, for example, a large-area press, a continuous long press, a two-finger long press, a copy operation, a multi-finger continuous click, and the like. Large-area pressing means the pressed area exceeds a preset area; continuous long pressing means the continuous pressing duration exceeds a preset duration; two-finger long pressing means both fingers are pressed simultaneously for longer than a preset duration; the copy operation copies the selected content to the clipboard; and multi-finger continuous clicking means the number of consecutive clicks exceeds a preset number.
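For illustration only, and not as part of the claimed method, these distinctions can be sketched in Kotlin roughly as follows; every threshold value, type and name here is an assumption.

    // Minimal sketch of the gesture categories above; thresholds are illustrative.
    data class PressSample(
        val areaPx: Float,      // pressed area in pixels
        val durationMs: Long,   // continuous pressing duration
        val fingerCount: Int,   // number of fingers pressed simultaneously
        val tapCount: Int       // number of consecutive clicks
    )

    enum class GestureType {
        LARGE_AREA_PRESS, CONTINUOUS_LONG_PRESS, TWO_FINGER_LONG_PRESS,
        MULTI_FINGER_CONTINUOUS_CLICK, NONE
    }

    class GestureClassifier(
        private val areaThresholdPx: Float = 2500f, // preset area (assumed)
        private val longPressMs: Long = 800L,       // preset duration (assumed)
        private val clickThreshold: Int = 3         // preset click count (assumed)
    ) {
        fun classify(s: PressSample): GestureType = when {
            s.areaPx > areaThresholdPx -> GestureType.LARGE_AREA_PRESS
            s.fingerCount >= 2 && s.durationMs > longPressMs -> GestureType.TWO_FINGER_LONG_PRESS
            s.durationMs > longPressMs -> GestureType.CONTINUOUS_LONG_PRESS
            s.tapCount > clickThreshold -> GestureType.MULTI_FINGER_CONTINUOUS_CLICK
            else -> GestureType.NONE
        }
    }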
In a specific embodiment, the operation gesture setting method is applied to the apparatus 400 for setting an operation gesture as shown in fig. 6 and the mobile terminal 100 (fig. 7) configured with the apparatus 400 for setting an operation gesture, and is used for setting an operation gesture on the mobile terminal 100. The setting method of the operation gesture specifically includes the following steps:
step S110: and displaying a setting area and a response area on the operation gesture setting interface.
The operation gesture setting interface is the interactive interface through which the mobile terminal lets the user set operation gesture parameters according to personal needs.
The setting area is the input area in which the user sets the gesture parameters corresponding to an operation gesture; it may include a touch button, an adjustment control, a text input box, a voice input button, and the like.
The response area is the area that displays corresponding information in response to the user's operation. It may be all or part of the display screen outside the setting area, or it may fully or partially overlap the setting area; this is not specifically limited. The information displayed in the response area may include text, figures, patterns, voice, and the like.
Meanwhile, the setting area may be disposed above, below, to the left of, or to the right of the response area, or around it so as to surround it. Further, the setting area may be larger than, equal to, or smaller than the response area; optionally, in this embodiment, the setting area is smaller than the response area. Other positional and dimensional relationships between the setting area and the response area are also possible and are not limited herein.
In step S110, after the operation gesture setting interface is entered, the interface may display the setting area and the response area simultaneously, or it may bring up the response area and/or the setting area in response to a user instruction.
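As a non-authoritative sketch of step S110, the two areas could be built as a plain Android layout in code; the class names and proportions below are illustrative assumptions, not taken from the patent.

    import android.app.Activity
    import android.os.Bundle
    import android.widget.FrameLayout
    import android.widget.LinearLayout
    import android.widget.SeekBar

    // Hypothetical activity for step S110: a setting area (holding a slider)
    // stacked above a larger, initially empty response area.
    class GestureSettingActivity : Activity() {
        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)
            val root = LinearLayout(this).apply { orientation = LinearLayout.VERTICAL }

            // Setting area: contains the pressing area adjustment control.
            val settingArea = LinearLayout(this).apply {
                orientation = LinearLayout.VERTICAL
                addView(SeekBar(context), LinearLayout.LayoutParams(
                    LinearLayout.LayoutParams.MATCH_PARENT,
                    LinearLayout.LayoutParams.WRAP_CONTENT))
            }
            root.addView(settingArea, LinearLayout.LayoutParams(
                LinearLayout.LayoutParams.MATCH_PARENT, 0, 1f))

            // Response area: where feedback graphics will later be drawn;
            // given more vertical space than the setting area.
            val responseArea = FrameLayout(this)
            root.addView(responseArea, LinearLayout.LayoutParams(
                LinearLayout.LayoutParams.MATCH_PARENT, 0, 3f))

            setContentView(root)
        }
    }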
Step S120: and acquiring a setting instruction input by a user in the setting area.
After the user inputs a setting instruction in the setting area, the setting instruction is acquired. The setting instruction may be a touch operation on a control in the setting area, a voice instruction input in the setting area, or the like; this is not specifically limited herein.
Step S130: and responding to the setting instruction, and displaying first feedback information corresponding to the setting instruction in the response area.
The first feedback information may be text, graphics, patterns, voice, or any combination thereof, and is not particularly limited herein. It may occupy the whole response area or only part of it. The first feedback information corresponds to the operation gesture being set: if the user sets a gesture related to an area, it may feed back the corresponding area; if the user sets a gesture related to a duration, it may feed back the corresponding duration; and if the user sets a gesture related to a click count, it may feed back the corresponding count.
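A minimal sketch of how different setting instructions could map to different first feedback information; the type names and messages are assumptions made for illustration.

    // Different kinds of setting instruction produce different first feedback.
    sealed class SettingInstruction {
        data class PressArea(val areaPx: Float) : SettingInstruction()
        data class PressDuration(val durationMs: Long) : SettingInstruction()
        data class ClickCount(val count: Int) : SettingInstruction()
    }

    fun firstFeedbackFor(instruction: SettingInstruction): String = when (instruction) {
        is SettingInstruction.PressArea -> "Preview circle covering ${instruction.areaPx} px²"
        is SettingInstruction.PressDuration -> "Countdown of ${instruction.durationMs} ms"
        is SettingInstruction.ClickCount -> "Counter up to ${instruction.count} clicks"
    }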
Based on the first feedback information, the user judges whether the setting suits their needs; if it does not, the user can adjust it in the setting area again until satisfied. The operation gesture setting method thus gives the user a more intuitive setting experience, avoids the situation in which the user exits the setting interface after a first attempt, discovers on the operation interface that the parameters are unsuitable, and has to return to the setting interface to reset them, and therefore improves setting efficiency and user experience.
Second embodiment
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating an operation gesture setting method according to a second embodiment of the present application. With reference to the flow shown in fig. 2, taking a large-area press as an example, the operation gesture setting method of the present application will be described in detail below, where the method may specifically include the following steps:
step S210: and displaying a setting area and a response area on an operation gesture setting interface, wherein the setting area comprises a pressing area adjusting control.
In this embodiment, the setting area includes a pressing area adjustment control, which may be arranged as shown in fig. 3: the operation gesture setting interface displays a setting area 201 and a response area 202; the setting area 201 contains a sliding adjustment control 203 provided with a sliding button 204, and dragging the sliding button 204 adjusts the value of the pressing area parameter.
It should be understood that the pressing area adjustment control may take other forms. For example, it may be a text dialog box that displays the size of the pressing area together with corresponding increase and decrease buttons, such as "+" and "-" adjustment buttons; the user then adjusts the pressing area parameter by tapping "+" or "-". As another example, the control may take the form of a fan-shaped or circular dial, and the user adjusts the pressing area parameter by moving the pointer.
Before the setting instruction is acquired, the response region may display first feedback information corresponding to the initial pressing area parameter, or may not display the feedback information.
In this embodiment, the setting area may be located above the response area, with the pressing area adjustment control laid out horizontally so that the pressing area parameter is adjusted by horizontal dragging. In one implementation, the progress bar of the pressing area adjustment control has five gears from left to right (minimum, small, medium, large, and maximum), and sliding the progress bar to a different gear selects a different pressing area.
Optionally, the pressing area adjustment control displays text with a graphic overlaid on it: the text lets the user test the pressing area, and the graphic grows or shrinks as the progress bar is slid. Specifically, sliding the progress bar leftward shrinks the graphic and sliding it rightward enlarges it, so that the user can set and adjust the pressing area visually and conveniently.
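One possible (assumed, not specified by the patent) realization of the five-gear slider is a standard Android SeekBar whose progress indexes a table of pressing area values; the gear values below are illustrative.

    import android.widget.SeekBar

    // SeekBar progress 0..4 maps to an assumed pressing area per gear (px²).
    val gearAreas = floatArrayOf(800f, 1600f, 2500f, 3600f, 4900f)

    fun attachGearListener(slider: SeekBar, onAreaChanged: (Float) -> Unit) {
        slider.max = gearAreas.size - 1
        slider.setOnSeekBarChangeListener(object : SeekBar.OnSeekBarChangeListener {
            override fun onProgressChanged(seekBar: SeekBar, progress: Int, fromUser: Boolean) {
                // Dragging right selects a larger gear, hence a larger pressing area value.
                onAreaChanged(gearAreas[progress])
            }
            override fun onStartTrackingTouch(seekBar: SeekBar) {}
            override fun onStopTrackingTouch(seekBar: SeekBar) {}
        })
    }

In this sketch, the callback would be wired to whatever view renders the first feedback information in the response area.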
Step S220: responding to a setting instruction acted on the pressing area adjusting control by the user.
Depending on how the pressing area adjustment control is operated, the system responds to the corresponding operation once the user has performed it.
As an example, as shown in fig. 3, sliding the sliding button 204 to the right increases the pressing area parameter set value, and sliding it to the left decreases the set value.
Step S230: acquiring a pressing area parameter set value according to the operation gesture;
step S240: and displaying first feedback information corresponding to the set value of the pressing area parameter in the response area.
In this embodiment, the first feedback information is a first circular pattern corresponding to the pressing area parameter set value. While the user adjusts the pressing area parameter, the size of the first circular pattern is adapted to the parameter in real time; one way (but not the only way) to do this is to keep the center of the circle fixed and change its radius. The first circular pattern is preferably displayed directly below the pressing area adjustment control.
In some embodiments, the first feedback information may also be displayed in a rectangular pattern, an oval pattern, or other shaped pattern.
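A hypothetical response-area view that keeps the circle center fixed and recomputes the radius from the configured area could look like the following; the color, stroke style, and area-to-radius conversion are assumptions.

    import android.content.Context
    import android.graphics.Canvas
    import android.graphics.Color
    import android.graphics.Paint
    import android.view.View

    // Draws the first circular pattern; the radius tracks the pressing area
    // parameter in real time while the center stays fixed.
    class FirstFeedbackView(context: Context) : View(context) {
        private val paint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
            color = Color.GREEN
            style = Paint.Style.STROKE
            strokeWidth = 4f
        }
        private var radiusPx = 0f

        fun setPressAreaParameter(areaPx: Float) {
            // Circle area = πr², so the preview radius is sqrt(area / π).
            radiusPx = kotlin.math.sqrt(areaPx / Math.PI).toFloat()
            invalidate() // redraw with the new radius
        }

        override fun onDraw(canvas: Canvas) {
            super.onDraw(canvas)
            canvas.drawCircle(width / 2f, height / 2f, radiusPx, paint)
        }
    }

Calling setPressAreaParameter from the slider callback sketched earlier would give the real-time preview described above.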
During setting, there may be a gap between the pressing area the user expects to suit them and the pressing area that actually does. For example, a user may set a pressing area parameter value according to their own expectation and, judging from the feedback of the first circular pattern, consider it suitable, only to find it unsuitable in actual use. In that case the user would have to re-enter the operation gesture setting interface and set the value again, which is inconvenient; therefore, in this embodiment, further verification steps may optionally be performed after step S240, as described in the third embodiment below.
Third embodiment
Referring to fig. 4, fig. 4 is a schematic flowchart illustrating an operation gesture setting method according to a third embodiment of the present application. As described below with respect to the flow shown in fig. 4, the method may specifically include the following steps:
step S310: and displaying a setting area and a response area on the operation gesture setting interface.
Step S320: responding to an operation gesture acted on the pressing area adjusting control by a user.
Step S330: and acquiring a pressing area parameter set value according to the operation gesture.
Step S340: and displaying a first circular pattern corresponding to the set value of the pressing area parameter in the response area.
Step S350: responding to a pressing gesture of a user on the response area.
The pressing gesture may be performed at any location in the response area: over all or part of the first circular pattern, or at any other position in the response area.
Step S360: and acquiring the touch area of the pressing gesture.
Based on the user's pressing gesture, the pressing gesture parameters of the pressing operation are acquired, and the touch area of the pressing gesture is obtained from those parameters.
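The patent does not specify how the touch area is computed; one plausible estimate on Android uses the touch-major and touch-minor axes reported by MotionEvent, treating the contact as an ellipse. The helper name is hypothetical.

    import android.view.MotionEvent

    // Rough touch-area estimate for one pointer of a press gesture.
    fun estimateTouchAreaPx(event: MotionEvent, pointerIndex: Int = 0): Double {
        val major = event.getTouchMajor(pointerIndex) // major axis length, in pixels
        val minor = event.getTouchMinor(pointerIndex) // minor axis length, in pixels
        return Math.PI * (major / 2.0) * (minor / 2.0) // ellipse area = π · a · b
    }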
In this embodiment, step S370 is executed after step S360.
Step S370: and the response area displays second feedback information corresponding to the touch area.
Based on the obtained touch area, second feedback information corresponding to the touch area is displayed in the response area. In some embodiments, the second feedback information may be text, a graphic, a pattern, voice, or any combination thereof, which is not limited herein.
In this embodiment, the second feedback information is a second circular pattern whose area equals the touch area. The second circular pattern may be displayed overlapping the first circular pattern, for example concentrically, so that the user can visually compare the second circular pattern with the first. Of course, it should be understood that the second circular pattern may also be displayed in a part of the response area that does not show the first circular pattern, such as directly below it, for easy comparison.
In some embodiments, the second circular pattern may be filled in any one or more colors, and in some embodiments, the first circular pattern and the second circular pattern are displayed in different colors to create a distinction, such as: the first circular pattern is filled with green and the second circular pattern is filled with red.
In some embodiments, steps S350-S370 may be repeated two, three, or more times; that is, the system responds to multiple pressing gestures from the user, obtains the pressing area of each gesture, and displays it in the response area. This lets the user decide from several actual operations whether to change the pressing area parameter. While steps S350-S370 are repeated, the response area may display the second circular pattern for every press simultaneously, or only the second circular pattern for the most recent pressing gesture. The user can then compare the second circular patterns obtained over multiple presses with the first circular pattern.
As noted above, during setting there may be a gap between the expected and the actual pressing area suitable for the user: a user may set a pressing area parameter value that, judging from the first circular pattern, appears suitable, but proves unsuitable in actual use, and would then have to re-enter the operation gesture setting interface to reset it, which is inconvenient. Therefore, in this embodiment, step S380 is executed after step S370.
Step S380: and judging whether the area of the first circular pattern is smaller than or equal to the area of the second circular pattern.
When the area of the first circular pattern is smaller than or equal to the area of the second circular pattern, the user's current actual pressing area is at least as large as the area corresponding to the pressing area parameter set value, so a large-area pressing operation can be triggered and the set value is appropriate. Otherwise, the pressing area parameter set value chosen by the user may be too large and needs to be adjusted, and step S390 is executed.
Step S390: and outputting prompt information.
The prompt information may be vibration, voice, text, a pattern, a graphic, an optical or electrical signal, or the like, used to remind or warn the user. For example, when the area of the first circular pattern is larger than that of the second circular pattern, the text "pressing area too small" may be output in the response area as the prompt, and the mobile terminal may additionally vibrate or flash its flashlight to alert the user. In some embodiments, the prompt may be given by color-marking the outer edge of the second circular pattern, for example in red or another color.
As an example, as shown in fig. 5, the current setting value of the pressing area adjustment control 301 corresponds to a first circular pattern 302, and when the area of a second circular pattern 303 corresponding to the touch area is smaller than the area of the first circular pattern 302, a prompt message 304 of "the pressing area is too small" is output.
In this embodiment, when the area of the first circular pattern is smaller than or equal to that of the second circular pattern, no action is required. In some embodiments, a prompt is output in this case as well, for example the text "pressing area appropriate"; as before, this prompt may also take the form of vibration, voice, text, a pattern, a graphic, an optical or electrical signal, or the like.
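Putting steps S380 and S390 together, a hedged sketch of the comparison and prompt follows; the toast text, the vibration, and the function name are illustrative choices, not mandated by the patent.

    import android.content.Context
    import android.os.Vibrator
    import android.widget.Toast

    // Compares the configured area (first circle) with the measured touch area
    // (second circle) and prompts the user accordingly.
    @Suppress("DEPRECATION") // Vibrator.vibrate(long) is deprecated on newer APIs
    fun checkPressArea(context: Context, firstCircleArea: Double, secondCircleArea: Double) {
        if (firstCircleArea <= secondCircleArea) {
            // The actual press covers the configured area: the set value is usable.
            Toast.makeText(context, "Pressing area appropriate", Toast.LENGTH_SHORT).show()
        } else {
            // The actual press is smaller than the configured threshold: warn the user.
            Toast.makeText(context, "Pressing area too small", Toast.LENGTH_SHORT).show()
            val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as? Vibrator
            vibrator?.vibrate(200L) // short vibration as an extra prompt
        }
    }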
Based on the presented prompt information, the user either corrects the pressing area parameter set value or leaves it unchanged. This improves both the efficiency of setting the pressing area parameter and the user experience.
Fourth embodiment
Referring to fig. 6, fig. 6 shows a block diagram of an apparatus 400 for setting an operation gesture according to the present embodiment; the block diagram shown in fig. 6 is described below. The apparatus 400 for setting an operation gesture includes a display module 403, a first obtaining module 401, and a processing module 402. The display module 403 is configured to display an operation gesture setting interface, where the operation gesture setting interface displays a setting area and a response area.
Specifically, the display module 403 includes a display screen, and the display screen can display an operation gesture setting page.
The first obtaining module 401 is configured to obtain the setting instruction input by the user in the setting area. The first obtaining module 401 includes a pressing area parameter adjustment control, which adjusts the pressing area parameter through slide dragging. The first obtaining module 401 is presented in the setting area, and the user's operation there triggers the corresponding action.
And the processing module 402 is configured to respond to the setting instruction and display first feedback information corresponding to the setting instruction in the response area.
Optionally, the apparatus for setting an operation gesture further includes a second obtaining module 404, where the second obtaining module 404 is configured to respond to a pressing gesture of a user and obtain a touch area corresponding to the pressing gesture. And the processing module 402 displays second feedback information corresponding to the touch area in the response area according to the acquired touch area corresponding to the pressing gesture.
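For orientation only, the module split of apparatus 400 could be expressed as the following Kotlin interfaces; all names and signatures here are assumptions made to illustrate the architecture.

    // Rough sketch of the apparatus-400 module split described above.
    interface DisplayModule {                      // module 403
        fun showSettingInterface()                 // renders setting area and response area
        fun showFirstFeedback(areaParameterPx: Float)
        fun showSecondFeedback(touchAreaPx: Float)
    }

    interface FirstObtainingModule {               // module 401
        fun onSettingInstruction(callback: (areaParameterPx: Float) -> Unit)
    }

    interface SecondObtainingModule {              // module 404
        fun onPressGesture(callback: (touchAreaPx: Float) -> Unit)
    }

    class ProcessingModule(                        // module 402
        private val display: DisplayModule,
        first: FirstObtainingModule,
        second: SecondObtainingModule
    ) {
        init {
            first.onSettingInstruction { display.showFirstFeedback(it) }
            second.onPressGesture { display.showSecondFeedback(it) }
        }
    }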
In summary, the operation gesture setting apparatus 400 and the mobile terminal provided by the embodiments of the application display a setting interface comprising a setting area and a response area, display an operation gesture adjustment control in the setting area, receive the setting instruction triggered in the setting area, and, according to that instruction, display the corresponding first feedback information in the response area. The user thus receives intuitive feedback on the setting, which makes operation convenient, improves setting efficiency, and enhances the user experience.
It should be noted that the embodiments in this specification are described in a progressive manner: each embodiment focuses on its differences from the others, and the same or similar parts of the embodiments may be referred to one another. Since the apparatus embodiment is basically similar to the method embodiment, its description is brief, and the relevant points can be found in the corresponding parts of the method embodiment. Any processing manner described in the method embodiment can be implemented in the apparatus embodiment by the corresponding processing module 402 and is not described again there.
Referring to fig. 7 again, based on the operation gesture setting method and apparatus, the embodiment of the present application further provides a mobile terminal 100, which includes an electronic body 10, where the electronic body 10 includes a housing 12 and a main display 120 disposed on the housing 12. The housing 12 may be made of metal, such as steel or aluminum alloy. In this embodiment, the main display 120 generally includes a display panel 111, and may also include a circuit or the like for responding to a touch operation performed on the display panel 111. The Display panel 111 may be a Liquid Crystal Display (LCD) panel, and in some embodiments, the Display panel 111 is also a touch screen 109.
Referring to fig. 8, in a practical application scenario, the mobile terminal 100 can be used as a smart mobile phone terminal, in which case the electronic body 10 generally further includes one or more processors 102 (only one is shown), a memory 104, an RF (Radio Frequency) module 106, an audio circuit 110, a sensor 114, an input module 118, and a power module 122. It will be understood by those skilled in the art that the structure shown in fig. 8 is merely illustrative and is not intended to limit the structure of the electronic body 10. For example, the electronic body 10 may also include more or fewer components than shown in fig. 8, or have a different configuration than shown in fig. 8.
Those skilled in the art will appreciate that all other components are peripheral devices with respect to the processor 102, and the processor 102 is coupled to the peripheral devices through a plurality of peripheral interfaces 124. The peripheral interface 124 may be implemented based on the following standards: Universal Asynchronous Receiver/Transmitter (UART), General Purpose Input/Output (GPIO), Serial Peripheral Interface (SPI), and Inter-Integrated Circuit (I2C), but is not limited to these standards. In some examples, the peripheral interface 124 may include only a bus; in other examples, the peripheral interface 124 may also include other elements, such as one or more controllers, for example a display controller for interfacing with the display panel 111 or a memory controller for interfacing with a memory. These controllers may also be separate from the peripheral interface 124 and integrated within the processor 102 or a corresponding peripheral.
The memory 104 may be used to store software programs and modules, and the processor 102 executes various functional applications and data processing by running the software programs and modules stored in the memory 104. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid state memory. In some examples, the memory 104 may further include memory remotely located from the processor 102, which may be connected to the electronic body portion 10 or the primary display 120 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The RF module 106 is used for receiving and transmitting electromagnetic waves and converting between electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices. The RF module 106 may include various existing circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, memory, and so forth. The RF module 106 may communicate with various networks such as the Internet, an intranet, or a wireless network, or with other devices via a wireless network. The wireless network may comprise a cellular telephone network, a wireless local area network, or a metropolitan area network. The wireless network may use various communication standards, protocols and technologies, including, but not limited to, Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (Wi-MAX), any other suitable communication protocols, and even protocols that have not yet been developed.
The audio circuitry 110, earpiece 101, sound jack 103, microphone 105 collectively provide an audio interface between a user and the electronic body portion 10 or the main display 120. Specifically, the audio circuit 110 receives sound data from the processor 102, converts the sound data into an electrical signal, and transmits the electrical signal to the earpiece 101. The earpiece 101 converts the electrical signal into sound waves that can be heard by the human ear. The audio circuitry 110 also receives electrical signals from the microphone 105, converts the electrical signals to sound data, and transmits the sound data to the processor 102 for further processing. Audio data may be retrieved from the memory 104 or through the RF module 106. In addition, audio data may also be stored in the memory 104 or transmitted through the RF module 106.
The sensor 114 is disposed in the electronic body portion 10 or the main display 120, examples of the sensor 114 include, but are not limited to: light sensors, operational sensors, pressure sensors, gravitational acceleration sensors, and other sensors.
Specifically, the sensors may include a light sensor 114F and a pressure sensor 114G. The pressure sensor 114G may detect pressure generated by pressing on the mobile terminal 100; that is, it detects pressure generated by contact or pressing between the user and the mobile terminal, for example between the user's ear and the mobile terminal. Accordingly, the pressure sensor 114G may be used to determine whether contact or pressing has occurred between the user and the mobile terminal 100, and the magnitude of the pressure.
Referring to fig. 8 again, in the embodiment shown in fig. 8, the light sensor 114F and the pressure sensor 114G are disposed adjacent to the display panel 111. The light sensor 114F may turn off the display output when an object is near the main display 120, for example, when the electronic body portion 10 moves to the ear.
As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in various directions (generally three axes) and the magnitude and direction of gravity when stationary. It can be used for applications that recognize the attitude of the mobile terminal 100 (such as switching between landscape and portrait, related games, and magnetometer attitude calibration), for vibration-recognition functions (such as a pedometer or tap detection), and the like. In addition, the electronic body 10 may also be configured with other sensors such as a gyroscope, a barometer, a hygrometer and a thermometer, which are not described here.
in this embodiment, the input module 118 may include the touch screen 109 disposed on the main display 120, and the touch screen 109 may collect touch operations of the user on or near the touch screen 109 (for example, operations of the user on or near the touch screen 109 using any suitable object or accessory such as a finger, a stylus, etc.) and drive the corresponding connection device according to a preset program. Optionally, the touch screen 109 may include a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 102, and can receive and execute commands sent by the processor 102. In addition, various types such as resistive, capacitive, infrared, and surface acoustic wave can be used to implement the touch detection function of the touch screen 109. In addition to the touch screen 109, in other variations, the input module 118 may include other input devices, such as keys 107. The keys 107 may include, for example, character keys for inputting characters, and control keys for triggering control functions. Examples of such control keys include a "back to home" key, a power on/off key, and the like.
The main display 120 is used to display information input by the user, information provided to the user, and the various graphical user interfaces of the electronic body 10, which may be composed of graphics, text, icons, numbers, video, and any combination thereof. In one example, the touch screen 109 may be provided on the display panel 111 so as to be integrated with it.
The power module 122 is used to provide power supply to the processor 102 and other components. Specifically, the power module 122 may include a power management system, one or more power sources (e.g., batteries or ac power), a charging circuit, a power failure detection circuit, an inverter, a power status indicator light, and any other components associated with the generation, management, and distribution of power within the electronic body portion 10 or the primary display 120.
The mobile terminal 100 further comprises a locator 119, the locator 119 being configured to determine an actual location of the mobile terminal 100. In this embodiment, the locator 119 implements the positioning of the mobile terminal 100 by using a positioning service, which is understood to be a technology or a service for obtaining the position information (e.g., longitude and latitude coordinates) of the mobile terminal 100 by using a specific positioning technology and marking the position of the positioned object on an electronic map.
It should be understood that the mobile terminal 100 described above is not limited to a smartphone terminal, but it should refer to a computer device that can be used in mobility. Specifically, the mobile terminal 100 refers to a mobile computer device equipped with an intelligent operating system, and the mobile terminal 100 includes, but is not limited to, a smart phone, a smart watch, a tablet computer, and the like.
In the description herein, reference to the terms "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" and the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of these terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine the different embodiments or examples, and the features of different embodiments or examples, described in this specification, provided they do not contradict each other.
Furthermore, the terms "first", "second" and "first" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes additional implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (mobile terminal) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps of the above method embodiments may be implemented by hardware under the instructions of a program, which may be stored in a computer-readable storage medium; when the program is executed, it performs one of, or a combination of, the steps of the method embodiments. In addition, the functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as a separate product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present application.

Claims (6)

1. An operational gesture setting method, the method comprising:
displaying a setting area and a response area on an operation gesture setting interface, wherein the operation gesture setting interface is an interactive interface for setting operation gesture parameters;
acquiring a setting instruction input by a user in the setting area;
responding to an operation gesture acted on a pressing area adjusting control in a set area by a user, and acquiring a pressing area parameter set value according to the operation gesture;
displaying first feedback information corresponding to the set value of the pressing area parameter in the response area, wherein different setting instructions correspond to different first feedback information;
responding to a pressing gesture acted on the response area by a user, and acquiring a touch area of the pressing gesture according to the pressing gesture;
displaying second feedback information corresponding to the touch area in the response area;
when the touch area is smaller than the pressing area corresponding to the pressing area parameter set value, prompt information is displayed in the response area to prompt that the pressing area is too small, and the pressing area parameter set value is a threshold value for triggering large-area pressing operation.
2. The method of claim 1, wherein the first feedback information is a first circular pattern corresponding to the pressing area parameter set value.
3. The method of claim 1, wherein the second feedback information is a second circular pattern equal to the touch area.
4. An apparatus for setting an operation gesture, comprising:
the display module is used for displaying an operation gesture setting interface, the operation gesture setting interface displaying a setting area and a response area, wherein the operation gesture setting interface is an interactive interface for setting operation gesture parameters;
the first acquisition module is used for acquiring a setting instruction input by a user in the setting area;
the processing module responds to an operation gesture acted on the pressing area adjusting control in the setting area by a user and obtains a pressing area parameter set value according to the operation gesture;
displaying first feedback information corresponding to the set value of the pressing area parameter in the response area;
the second acquisition module is used for responding to a pressing gesture acted on the response area by a user and acquiring the touch area of the pressing gesture according to the pressing gesture; displaying second feedback information corresponding to the touch area in the response area; when the touch area is smaller than the pressing area corresponding to the pressing area parameter set value, prompt information is displayed in the response area to prompt that the pressing area is too small, and the pressing area parameter set value is a threshold value for triggering large-area pressing operation.
5. A mobile terminal, comprising: a touch screen, a memory, and a processor, the touch screen and the memory coupled to the processor, the memory storing instructions that, when executed by the processor, the processor performs the method of any of claims 1-3.
6. A computer-readable storage medium storing program code executable by a processor, the program code causing the processor to perform the method according to any one of claims 1-3.
CN201810350865.9A 2018-04-18 2018-04-18 Operation gesture setting method and device and mobile terminal Active CN108717325B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201810350865.9A CN108717325B (en) 2018-04-18 2018-04-18 Operation gesture setting method and device and mobile terminal
PCT/CN2019/081526 WO2019201102A1 (en) 2018-04-18 2019-04-04 Operation gesture setting method and apparatus, and mobile terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810350865.9A CN108717325B (en) 2018-04-18 2018-04-18 Operation gesture setting method and device and mobile terminal

Publications (2)

Publication Number Publication Date
CN108717325A CN108717325A (en) 2018-10-30
CN108717325B true CN108717325B (en) 2020-08-25

Family

ID=63899175

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810350865.9A Active CN108717325B (en) 2018-04-18 2018-04-18 Operation gesture setting method and device and mobile terminal

Country Status (2)

Country Link
CN (1) CN108717325B (en)
WO (1) WO2019201102A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108717325B (en) * 2018-04-18 2020-08-25 Oppo广东移动通信有限公司 Operation gesture setting method and device and mobile terminal
CN110321050B (en) * 2019-07-10 2021-06-22 成都终身成长科技有限公司 Interactive operation method and device, electronic equipment and storage medium
CN113626301A (en) * 2020-05-06 2021-11-09 北京字节跳动网络技术有限公司 Method and device for generating test script
CN114697448A (en) * 2022-03-29 2022-07-01 南京点明软件科技有限公司 Mobile phone input method applied to blind person operation

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102841712A (en) * 2012-07-11 2012-12-26 广州市久邦数码科技有限公司 Method and system for identifying and editing gestures
CN103677223A (en) * 2012-09-03 2014-03-26 宏景科技股份有限公司 Touch gesture rapid input device
EP2843535A2 (en) * 2013-09-03 2015-03-04 Samsung Electronics Co., Ltd Apparatus and method of setting gesture in electronic device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106775377B (en) * 2016-11-23 2020-12-18 北京小米移动软件有限公司 Gesture recognition device, equipment and control method of gesture recognition device
CN108717325B (en) * 2018-04-18 2020-08-25 Oppo广东移动通信有限公司 Operation gesture setting method and device and mobile terminal

Also Published As

Publication number Publication date
WO2019201102A1 (en) 2019-10-24
CN108717325A (en) 2018-10-30

Similar Documents

Publication Publication Date Title
CN108717325B (en) Operation gesture setting method and device and mobile terminal
US11237703B2 (en) Method for user-operation mode selection and terminals
CN108234875B (en) Shooting display method and device, mobile terminal and storage medium
CN108777731B (en) Key configuration method and device, mobile terminal and storage medium
CN108710456B (en) Application icon processing method and device and mobile terminal
CN108021642B (en) Application program recommendation method and device, mobile terminal and storage medium
US11829581B2 (en) Display control method and terminal
CN108512997B (en) Display method, display device, mobile terminal and storage medium
CN115525383B (en) Wallpaper display method and device, mobile terminal and storage medium
CN108769299B (en) Screen control method and device and mobile terminal
CN108932102B (en) Data processing method and device and mobile terminal
CN107766548B (en) Information display method and device, mobile terminal and readable storage medium
CN108804005B (en) Terminal control method and device and mobile terminal
CN109104521B (en) Method and device for correcting approaching state, mobile terminal and storage medium
CN110221882B (en) Display method, display device, mobile terminal and storage medium
CN111371705B (en) Download task execution method and electronic device
CN111078108A (en) Screen display method and device, storage medium and mobile terminal
CN110221736B (en) Icon processing method and device, mobile terminal and storage medium
CN108803961B (en) Data processing method and device and mobile terminal
CN109101163B (en) Long screen capture method and device and mobile terminal
CN109032465B (en) Data processing method and device and mobile terminal
CN107918517B (en) Screen rotation response method and device, mobile terminal and storage medium
CN107819938B (en) Corner mark configuration method and device, mobile terminal and server
CN108683812B (en) Volume adjusting method and device and mobile terminal
CN108650413B (en) Projection method, projection device, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant