CN110377266B - Audio play control method and device, mobile terminal and storage medium - Google Patents

Audio play control method and device, mobile terminal and storage medium

Info

Publication number
CN110377266B
Authority
CN
China
Prior art keywords
mobile terminal
audio
touch
sliding
sensing area
Prior art date
Legal status
Active
Application number
CN201910657312.2A
Other languages
Chinese (zh)
Other versions
CN110377266A (en)
Inventor
蔺百杨
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910657312.2A
Publication of CN110377266A
Application granted
Publication of CN110377266B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/165Management of the audio stream, e.g. setting of volume, audio stream path
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Telephone Function (AREA)
  • Position Input By Displaying (AREA)

Abstract

The embodiments of the present application disclose an audio playing control method and device, a mobile terminal, and a storage medium, relating to the technical field of mobile terminals. The method is applied to a mobile terminal that comprises a terminal body and a touch sensor disposed on the terminal body, and comprises the following steps: determining whether the touch sensor detects a touch operation while the mobile terminal is playing audio; if a touch operation is detected, determining whether the mobile terminal is in a screen-off state; and if the mobile terminal is in the screen-off state, generating touch information according to the touch operation and updating the audio playing action of the mobile terminal according to the touch information. This meets the requirement for an integrated terminal body while allowing the user to control audio on the mobile terminal in the screen-off state, which effectively reduces the power consumption of the whole device, extends its usage time under the same battery endurance, and lets the user realize multiple audio control functions through different touch operations.

Description

Audio play control method and device, mobile terminal and storage medium
Technical Field
The present application relates to the field of mobile terminals, and in particular, to an audio playback control method, an audio playback control device, a mobile terminal, and a storage medium.
Background
At present, users often control audio playback by pressing mechanical keys provided on the mobile terminal. Because the number of mechanical keys that can be arranged on a mobile terminal is limited, the audio playback operations that can be performed through them are also limited, so that many audio control functions cannot be realized.
Disclosure of Invention
In view of the above problems, the present application provides an audio playing control method, an audio playing control device, a mobile terminal and a storage medium, which can control audio playing of the mobile terminal in a screen-off state, thereby reducing power consumption of the mobile terminal.
In a first aspect, an embodiment of the present application provides an audio playing control method, which is applied to a mobile terminal, where the mobile terminal includes a terminal body and a touch sensor disposed on the terminal body, and the method includes: determining whether a touch operation is detected by the touch sensor when the mobile terminal is playing audio; if a touch operation is detected, determining whether the mobile terminal is in a screen-off state; and if the mobile terminal is in the screen-off state, generating touch information according to the touch operation and updating the audio playing action of the mobile terminal according to the touch information.
In a second aspect, an embodiment of the present application provides an audio play control device, which is applied to a mobile terminal, where the mobile terminal includes a terminal body and a touch sensor disposed on the terminal body, and the device includes a touch operation detection module, a screen-off detection module, and an audio play control module. The touch operation detection module is used for determining whether the touch sensor detects a touch operation when the mobile terminal is playing audio; the screen-off detection module is used for determining, if a touch operation is detected, whether the mobile terminal is in a screen-off state; and the audio play control module is used for generating touch information according to the touch operation if the mobile terminal is in the screen-off state, and updating the audio playing action of the mobile terminal according to the touch information.
In a third aspect, an embodiment of the present application provides a mobile terminal, including one or more processors, a memory, and one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs are configured to perform the audio playing control method described above.
In a fourth aspect, an embodiment of the present application provides a computer readable storage medium having program code stored therein, the program code being capable of being invoked by a processor to perform the above-described audio playback control method.
According to the audio playing control method and device, mobile terminal, and storage medium provided by the present application, while the mobile terminal is playing audio, the touch sensor disposed on the terminal body detects the user's touch operation in real time. When the mobile terminal is in the screen-off state, touch information is generated according to the touch operation, and the audio playing action of the mobile terminal is then controlled through the touch information. The user can therefore control audio on the mobile terminal in the screen-off state, which effectively reduces the power consumption of the whole device and extends its usage time under the same battery endurance.
These and other aspects of the application will be more readily apparent from the following description of the embodiments.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 illustrates a schematic structure of a mobile terminal according to an embodiment of the present application.
Fig. 2 shows a flowchart of an audio play control method according to an embodiment of the present application.
Fig. 3 shows a flowchart of an audio play control method according to another embodiment of the present application.
Fig. 4 illustrates a flowchart of a method of updating an audio play file according to first touch information according to one embodiment of the present application.
Fig. 5 shows a flowchart of an audio play control method according to still another embodiment of the present application.
FIG. 6 illustrates a schematic diagram of a user's operation on a first sensing region according to one embodiment of the application.
Fig. 7 shows a flowchart of an audio play control method according to still another embodiment of the present application.
FIG. 8 illustrates a schematic diagram of a user's operation on a second sensing area according to one embodiment of the application.
Fig. 9 shows a flowchart of an audio play control method according to still another embodiment of the present application.
FIG. 10 illustrates a flowchart of a method of determining whether a touch operation is valid, according to one embodiment of the application.
FIG. 11 shows a flowchart of a method of determining whether a touch operation is valid according to yet another embodiment of the application.
Fig. 12 shows a flowchart of an audio play control method according to still another embodiment of the present application.
Fig. 13 shows a schematic diagram of a parasitic capacitance structure without finger pressing according to an embodiment of the present application.
Fig. 14 shows a schematic diagram of a parasitic capacitance structure with finger presses according to one embodiment of the application.
FIG. 15 illustrates a processing chip internal frame diagram according to one embodiment of the application.
FIG. 16 illustrates a change in data delta as a finger presses according to one embodiment of the application.
FIG. 17 is a schematic diagram of incremental changes in data during slide detection according to one embodiment of the application.
Fig. 18 is a schematic diagram illustrating a distribution of a sliding touch area on a mobile terminal according to an embodiment of the present application.
Fig. 19 shows a schematic structural diagram of an audio play control device according to an embodiment of the present application.
Fig. 20 is a block diagram of a mobile terminal for performing an audio play control method according to an embodiment of the present application.
Fig. 21 is a storage unit for storing or carrying program code for implementing an audio playback control method according to an embodiment of the present application.
Detailed Description
In order to enable those skilled in the art to better understand the present application, the following description will make clear and complete descriptions of the technical solutions according to the embodiments of the present application with reference to the accompanying drawings.
With the continuous development of mobile terminals, users have more and more ways to control them. Currently, when performing audio control on a mobile terminal, a user generally controls audio playback by clicking or dragging an audio control identifier on the display interface of the mobile terminal. Alternatively, the audio may be adjusted through a mechanical key provided on the mobile terminal, for example a volume adjustment key.
However, when audio playback is controlled by clicking or dragging an audio control icon on the display interface of the mobile terminal, the mobile terminal must be kept in a bright-screen state. If the user operates audio playback for a long time, the mobile terminal remains in the bright-screen state for a long time, which necessarily increases its power consumption. Likewise, if the user operates the played audio very frequently, the mobile terminal has to switch frequently between the off-screen and bright-screen states, which also increases power consumption. In addition, because the number of mechanical keys that can be set on a mobile terminal is limited, the audio playback operations that can be performed through them are also limited, so that many audio control functions cannot be realized; moreover, mechanical keys cannot meet the requirement for an integrated terminal body, resulting in poor user experience.
The inventor found that if audio playback can be controlled through touch while the mobile terminal is in the screen-off state, the power consumption caused by lighting the screen can be avoided, and the mobile terminal can still offer multiple ways of operating audio playback.
Therefore, the inventor provides an audio play control method, an audio play control device, a mobile terminal and a storage medium.
Referring to fig. 1, a mobile terminal 100 may include a terminal body 101 and a touch sensor 102 disposed on the terminal body 101. There may be one or more touch sensors 102. The touch sensor 102 may be a capacitive touch sensor, an ultrasonic touch sensor, or an infrared touch sensor. In some embodiments, the touch sensor 102 may be disposed on the rear cover or on the bezel of the terminal body 101.
Referring to fig. 2, the method may include:
in step S110, it is determined whether a touch operation is detected by the touch sensor while the mobile terminal is playing audio.
The mobile terminal may play audio in the background through audio playing software, which may be, for example, a music player, foreign-language listening software, voice assistant software, or video playing software that can play audio in the background. The audio may be music, recordings, broadcasts, and the like. While the audio is being played, the touch sensor provided on the mobile terminal detects in real time whether the user has performed a touch operation on the mobile terminal. It can be understood that a touch operation refers to the user touching the mobile terminal with different finger actions, such as a single click, a double click, a long press, or a slide made on the mobile terminal.
Step S120, if a touch operation is detected, determining whether the mobile terminal is in a screen-off state.
When the mobile terminal detects a touch operation made by the user, it further detects whether it is currently in the screen-off state. If no touch operation is detected, the process may return to step S110 to re-detect whether the user performs a touch operation on the mobile terminal. Whether the mobile terminal is in the screen-off state may be determined by detecting the screen brightness of the mobile terminal; specifically, when the screen brightness is 0, the mobile terminal may be determined to be in the screen-off state.
Step S130, if the mobile terminal is in the screen-off state, touch information is generated according to the touch operation, and the audio playing action of the mobile terminal is updated according to the touch information.
When the mobile terminal is detected to be in the screen-off state, corresponding touch information is generated according to the touch operation detected by the touch sensor, and the audio playing action of the mobile terminal can then be updated according to the touch information. Specifically, each touch operation may correspond to one audio playing action: for example, the user may pause audio playback with a double click on the mobile terminal, start playback with a single click, select the playback position with a sliding action, and so on. Because the user updates the audio playing action by touching the mobile terminal in the screen-off state, there is no need to light up the screen when controlling audio playback, which avoids the associated increase in power consumption and effectively extends the usage time of the mobile terminal under the same battery endurance. The screen-unlocking step is also omitted, making control simpler and more convenient. Moreover, performing audio control through touch operations avoids the limitation that mechanical keys can realize only a few audio control functions, and also meets the requirement for an integrated terminal body, thereby improving user experience.
It can be understood that generating touch information according to a touch operation means extracting the action features produced when the user touches the mobile terminal (most of which are analog signals) and converting those analog signals into digital signals that the processor of the mobile terminal can recognize, i.e., touch information. For example, when the user clicks the area of the mobile terminal where a touch sensor is arranged (hereinafter referred to as a sensing area), features such as the click position, click frequency, and click force can be obtained, and feature values corresponding to the click action, i.e., the touch information, can be generated from them. Similarly, when the touch operation is a long press at any point in the sensing area, features such as the position, force, and duration of the long press can be obtained and converted into corresponding values, i.e., the touch information corresponding to the long-press operation. When the touch operation is a slide over some distance in the sensing area, features such as the sliding distance and the touch force during sliding can be obtained and converted into corresponding values, i.e., the touch information corresponding to the sliding operation.
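As an illustration only (no code is given in the original disclosure), the following Kotlin sketch shows one possible way to implement this flow. The types TouchEvent, TouchInfo, and AudioPlayer, the gesture thresholds, and the handling of long presses are all assumptions; the pause/play/seek mappings follow the examples above.

```kotlin
// Hypothetical types; names are illustrative, not taken from the patent.
enum class Gesture { CLICK, DOUBLE_CLICK, LONG_PRESS, SLIDE }

data class TouchEvent(val x: Float, val y: Float, val force: Float, val timestampMs: Long)

data class TouchInfo(
    val gesture: Gesture,
    val startX: Float, val startY: Float,
    val endX: Float, val endY: Float,
    val force: Float,
    val durationMs: Long
)

interface AudioPlayer {
    fun play()
    fun pause()
    fun seekBy(deltaMs: Long)
}

class AudioPlaybackController(
    private val isPlayingAudio: () -> Boolean,   // assumed hook into the player state
    private val isScreenOff: () -> Boolean,      // e.g. screen brightness == 0
    private val player: AudioPlayer
) {
    // Steps S110-S130: react only while audio is playing and the screen is off.
    fun onTouchSequence(events: List<TouchEvent>) {
        if (!isPlayingAudio() || events.isEmpty()) return   // S110
        if (!isScreenOff()) return                          // S120
        val info = generateTouchInfo(events)                // S130: features -> touch information
        updatePlaybackAction(info)
    }

    // Convert the raw touch features (position, force, duration) into touch information.
    // Double-click detection (pairing two clicks within a time window) is omitted for brevity.
    private fun generateTouchInfo(events: List<TouchEvent>): TouchInfo {
        val first = events.first()
        val last = events.last()
        val duration = last.timestampMs - first.timestampMs
        val dx = last.x - first.x
        val dy = last.y - first.y
        val gesture = when {
            dx * dx + dy * dy > SLIDE_THRESHOLD_SQ -> Gesture.SLIDE
            duration > LONG_PRESS_MS -> Gesture.LONG_PRESS
            else -> Gesture.CLICK
        }
        return TouchInfo(gesture, first.x, first.y, last.x, last.y,
            events.maxOf { it.force }, duration)
    }

    // One touch operation corresponds to one audio playing action, as described above.
    private fun updatePlaybackAction(info: TouchInfo) = when (info.gesture) {
        Gesture.DOUBLE_CLICK -> player.pause()
        Gesture.CLICK -> player.play()
        Gesture.SLIDE -> player.seekBy(((info.endX - info.startX) * MS_PER_UNIT).toLong())
        Gesture.LONG_PRESS -> Unit   // mapping not specified in the text above
    }

    companion object {
        const val SLIDE_THRESHOLD_SQ = 25f   // assumed minimum squared movement for a slide
        const val LONG_PRESS_MS = 500L       // assumed long-press duration
        const val MS_PER_UNIT = 100f         // assumed seek milliseconds per unit of slide
    }
}
```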
In some embodiments, the mobile terminal includes a first sensing area. The first sensing area is a bounded region, which may be elongated, circular, or rectangular. Optionally, the first sensing area may be located on the back cover or on the frame of the mobile terminal; when located on the frame, it may be arranged along the length direction of the mobile terminal. Touch sensors are disposed in the first sensing area; there may be several of them, distributed uniformly over the first sensing area. Referring to fig. 3, in some embodiments, an audio playback control method may include the following steps:
in step S210, it is determined whether a touch operation is detected by the touch sensor while the mobile terminal is playing audio.
Step S220, if a touch operation is detected, determining whether the mobile terminal is in a screen-off state.
In step S230, first touch information on the first sensing area is acquired, the first touch information corresponds to a touch operation acting on the first sensing area, and the first touch information includes a first sliding start point and a first sliding end point.
As an example, the first sensing area corresponds to updating of the audio playing file, so a touch operation made by the user in the first sensing area can control which audio file is played. The mobile terminal detects first touch information on the first sensing area in real time; the first touch information may be touch information corresponding to a sliding action. It can be understood that the first touch information is the same type of information as the touch information described above and is generated by a touch operation in essentially the same way. Specifically, the first touch information may include a first sliding start point and a first sliding end point. The first sliding start point refers to the position where the user's finger first touches the first sensing area when making a sliding motion, and the first sliding end point refers to the position where the finger leaves the first sensing area, or the position within the first sensing area where the sliding motion stops.
Step S240, obtaining the corresponding endpoint coordinates of the first sliding endpoint on the first sensing region.
In one embodiment, a two-dimensional plane coordinate system may be established in advance with an arbitrary point on the first sensing area (for example, its center point) as the origin, and the endpoint coordinates may then be determined from the position of the first sliding end point in this two-dimensional coordinate system. In another embodiment, when the first sensing area is elongated, a one-dimensional linear coordinate system may be established in advance with one end of the first sensing area as the origin, and the endpoint coordinates may then be determined from the position of the first sliding end point in this one-dimensional coordinate system.
Step S250, if the end point coordinate is at the first designated position of the first sensing area, updating the audio playing file according to the first touch information.
When the mobile terminal detects that the endpoint coordinate is at the first designated position, an audio playing file update action can be triggered accordingly. The first designated position may be one or more designated coordinate points in the first sensing area, specifically designated coordinate points on the boundary of the first sensing area. For example, the first designated position may include a plurality of first designated coordinate points, each of which triggers playback of one audio file; when the endpoint coordinate coincides with one of the first designated coordinate points, the audio file currently played by the mobile terminal can be switched to the audio file corresponding to that coordinate point, with the correspondence between first designated coordinate points and audio files prestored in the mobile terminal. In this embodiment, the user can switch between different audio files with a sliding action in the first sensing area; the switching is accurate even when the mobile terminal is in the screen-off state, which simplifies the user's control of audio playback.
In some embodiments, the playing order of a plurality of audio files is preset in the mobile terminal. Referring to fig. 4, updating the audio playing file according to the first touch information in step S250 may include the following steps:
step S251, determining a first sliding direction according to the first sliding start point and the first sliding end point.
The mobile terminal can determine the first sliding direction according to the coordinates of the first sliding start point and the first sliding end point in the coordinate system of the first sensing area.
In step S252, an audio file arranged before or after the currently played audio file is selected from the plurality of audio files according to the first sliding direction for playing.
The mobile terminal stores in advance the correspondence between first sliding directions and audio file switching directions. For example, when the first sensing area is elongated, one end may be taken as the head end and the other as the tail end. If sliding toward the tail end is set to correspond to selecting the audio file arranged after the currently played one, then when the user slides toward the tail end of the first sensing area the mobile terminal switches to the audio file arranged after the currently played file, and when the user slides toward the head end it switches to the audio file arranged before the currently played file. In this embodiment, the user can quickly switch the playing audio file with a single slide, and the mobile terminal can switch accurately even in the screen-off state, which makes audio playback control convenient for the user.
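As an illustrative sketch only (the patent does not provide code), the following Kotlin snippet selects the previous or next audio file from a preset playlist according to the first sliding direction along an elongated first sensing area. The one-dimensional coordinate convention (0 at the head end, increasing toward the tail end) and the wrap-around at the ends of the playlist are assumptions.

```kotlin
// Sketch of steps S251-S252: pick the previous or next file from a preset playlist
// according to the sliding direction along an elongated first sensing area.
class PlaylistSwitcher(private val playlist: List<String>, private var currentIndex: Int = 0) {

    fun onSlide(startCoord: Float, endCoord: Float): String {
        val towardTail = endCoord > startCoord                        // first sliding direction
        currentIndex = if (towardTail) {
            (currentIndex + 1) % playlist.size                        // file arranged after the current one
        } else {
            (currentIndex - 1 + playlist.size) % playlist.size        // file arranged before it
        }
        return playlist[currentIndex]                                 // file to switch playback to
    }
}

fun main() {
    val switcher = PlaylistSwitcher(listOf("a.mp3", "b.mp3", "c.mp3"))
    println(switcher.onSlide(startCoord = 10f, endCoord = 40f))  // slide toward tail -> "b.mp3"
    println(switcher.onSlide(startCoord = 40f, endCoord = 10f))  // slide toward head -> "a.mp3"
}
```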
Referring to fig. 5, in some embodiments, an audio playback control method may include the steps of:
in step S310, it is determined whether a touch operation is detected by the touch sensor while the mobile terminal is playing audio.
Step S320, if a touch operation is detected, determining whether the mobile terminal is in a screen-off state.
In step S330, first touch information on the first sensing area is acquired, the first touch information corresponds to a touch operation acting on the first sensing area, and the first touch information includes a first sliding start point and a first sliding end point.
Step S340, obtaining the corresponding endpoint coordinates of the first sliding endpoint on the first sensing region.
In step S350, if the endpoint coordinate is at the first designated position of the first sensing area, the audio playing file is updated according to the first touch information.
In some embodiments, the first designated position may be a plurality of first designated coordinate points located on the boundary of the first sensing area. If the endpoint coordinate of the first sliding end point coincides with a first designated coordinate point on the boundary, this indicates that the user's finger slid to or out of the boundary of the first sensing area during the slide. In this case the audio control action of the mobile terminal is to update the audio playing file: when the finger slides to or out of the boundary, the mobile terminal switches to the corresponding audio file according to the sliding direction. Switching the audio playing file by detecting whether the finger slides out of the boundary of the first sensing area helps ensure the accuracy of the switching.
In step S360, if the end point coordinate is at the second designated position of the first sensing area, the first sliding distance and the first sliding direction are determined according to the first sliding start point and the first sliding end point.
In some embodiments, the second designated position may be a plurality of second designated coordinate points located within the boundary of the first sensing area. It can be understood that the first and second designated coordinate points are in the same coordinate system, which may be established with the center point of the first sensing area as the origin. If the endpoint coordinate of the first sliding end point coincides with a second designated coordinate point within the boundary, this indicates that the user's finger stayed within the boundary of the first sensing area throughout the slide, and in this case the audio control action of the mobile terminal is to update the audio playing progress.
Step S370, determining the increasing direction of the audio playing progress according to the first sliding direction.
It can be understood that the increasing direction of the audio playing progress includes a rewind direction and a forward direction. When the progress increases in the rewind direction, the playing position moves back before the current position; when it increases in the forward direction, the playing position moves forward past the current position. For example, if the current playing position is 2 minutes 20 seconds, increasing in the rewind direction switches the playing position to some time between the beginning of the audio and 2 minutes 20 seconds, while increasing in the forward direction switches it to some time between 2 minutes 20 seconds and the end of the audio.
In step S380, an increasing amount of the audio playing progress in the increasing direction is determined according to the first sliding distance.
Step S390, the audio playing progress is updated according to the increasing direction and increasing amount of the audio playing progress.
As an example, when the first sensing area is elongated and arranged on the mobile terminal along its width direction, referring to fig. 6, if the user slides a finger on the first sensing area toward its head end, the audio playing progress is increased in the rewind direction, and the amount of rewind is proportional to the first sliding distance. If the user slides toward the tail end of the first sensing area, the progress is increased in the forward direction, and the amount of advance is proportional to the first sliding distance. In this embodiment, the user can control two audio playing actions with touch operations on a single sensing area, making audio control more flexible and faster.
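A minimal sketch of steps S360 to S390, assuming a one-dimensional coordinate along the first sensing area that increases toward the tail end and a proportionality factor MS_PER_UNIT that is not specified in the text:

```kotlin
// Derive the seek direction from the sliding direction and the seek amount from the
// sliding distance; the scale factor and coordinate convention are assumptions.
const val MS_PER_UNIT = 200L  // milliseconds of progress per unit of sliding distance

fun updateProgress(currentPositionMs: Long, durationMs: Long,
                   slideStart: Float, slideEnd: Float): Long {
    val slideDistance = kotlin.math.abs(slideEnd - slideStart)        // first sliding distance
    val forward = slideEnd > slideStart                               // toward tail end = forward
    val increment = (slideDistance * MS_PER_UNIT).toLong()            // amount proportional to distance
    val newPosition = if (forward) currentPositionMs + increment else currentPositionMs - increment
    return newPosition.coerceIn(0L, durationMs)                       // clamp between start and end of audio
}

fun main() {
    // Current position 2:20 (140 000 ms) in a 4-minute track; slide 30 units toward the head end.
    println(updateProgress(140_000L, 240_000L, slideStart = 50f, slideEnd = 20f))  // 134000 -> rewound
}
```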
In other embodiments, the mobile terminal includes a second sensing area, which may be elongated. Optionally, the second sensing area may be located on the rear cover or on the frame of the mobile terminal; when located on the frame, it may be arranged along the length direction of the mobile terminal. Touch sensors are disposed in the second sensing area; there may be several of them, distributed uniformly over the second sensing area.
Referring to fig. 7, in some embodiments, an audio playback control method may include the steps of:
in step S410, it is determined whether a touch operation is detected by the touch sensor while the mobile terminal is playing audio.
Step S420, if a touch operation is detected, determining whether the mobile terminal is in a screen-off state.
In step S430, second touch information on the second sensing area is acquired, where the second touch information corresponds to a touch operation performed on the second sensing area, and the second touch information includes a second sliding start point and a second sliding end point.
The second sensing region may be substantially the same as the first sensing region, and the second touch information may be substantially the same as the first touch information.
Step S440, determining a second sliding distance and a second sliding direction according to the second sliding start point and the second sliding end point.
Step S450, determining the adjusting direction of the audio playing volume according to the second sliding direction. It will be appreciated that the direction of adjustment of the audio play volume includes either a volume increase or a volume decrease.
Step S460, determining the adjusting quantity of the audio playing volume in the adjusting direction according to the second sliding distance. It is understood that the adjustment amount of the audio play volume in the adjustment direction refers to an increase amount of the volume or a decrease amount of the volume.
Step S470, the audio playing volume is updated according to the adjusting direction and the adjusting amount of the audio playing volume.
As an example, referring to fig. 8, when the second sensing area is elongated and arranged on the rear cover or frame of the mobile terminal along its length direction, one end of the second sensing area is taken as the top end and the other as the bottom end. When the user's finger slides toward the top end of the second sensing area, i.e., the second sliding direction is toward the top end, the volume is increased; when the finger slides toward the bottom end, the volume is decreased. The amount of increase or decrease is proportional to the sliding distance, and the distance the finger slides on the second sensing area is the second sliding distance. In this embodiment, the user can quickly adjust the audio playing volume with a single slide, and the adjustment is accurate even when the mobile terminal is in the screen-off state, which facilitates the user's control of audio playback.
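A minimal sketch of steps S440 to S470 under similar assumptions; the scale factor VOLUME_PER_UNIT and the convention that coordinates increase toward the top end are illustrative, not from the original disclosure:

```kotlin
// Adjust the playback volume according to the slide on the second sensing area.
const val VOLUME_PER_UNIT = 0.01f

fun adjustVolume(currentVolume: Float, slideStart: Float, slideEnd: Float): Float {
    val distance = kotlin.math.abs(slideEnd - slideStart)    // second sliding distance
    val towardTop = slideEnd > slideStart                    // second sliding direction
    val delta = distance * VOLUME_PER_UNIT                   // adjustment proportional to distance
    val newVolume = if (towardTop) currentVolume + delta else currentVolume - delta
    return newVolume.coerceIn(0f, 1f)
}

fun main() {
    println(adjustVolume(0.5f, slideStart = 10f, slideEnd = 40f))  // slide toward top -> ~0.8
}
```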
In some embodiments, referring to fig. 1 again, the first sensing area 103 and the second sensing area 104 are both disposed on the rear cover of the terminal body 101, the first sensing area 103 and the second sensing area 104 are both elongated, and the first sensing area 103 and the second sensing area 104 are perpendicular to each other. Alternatively, the first sensing region 103 and the second sensing region 104 may or may not intersect.
In some embodiments, a series of capacitive sensors may be placed adjacent to one another in the elongated first sensing area 103 and second sensing area 104, so that increasing or decreasing input, and hence slide detection, can be realized. Because the first sensing area 103 and the second sensing area 104 are perpendicular to each other, the capacitive sensors are arranged in a horizontal-plus-vertical order, which enables two-dimensional slide operation. When a finger touch is detected, the slide is detected by analyzing which sensors the finger touches. This sensor arrangement is referred to as a slider. When the user touches one slider segment, the adjacent segments are also activated; the corresponding control integrated circuit (IC) of the mobile terminal 100 calculates the geometric center position of the finger touch by processing the raw counts of the touched segment and its neighbors. Using the digital count value of each sensor to calculate the center position effectively improves the position resolution, which is far higher than the number of slider segments; for example, a slider containing five segments can resolve at least 100 physical finger positions. Because the detection data at the center of the finger press is larger than at the geometric edges of the touch, the control IC records the position once the finger is detected. As the finger moves, the received data changes continuously, from which the update of the center position is determined. By recording the center position in real time, the magnitude of the finger displacement can be determined, realizing slide detection, and the control IC can calculate the sliding direction and displacement from this position information.
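The center-position calculation described above can be illustrated by a weighted average of the segment counts; the following Kotlin sketch and count values are purely illustrative:

```kotlin
// The control IC weights each segment position by its raw count, so the resolved
// position has far finer granularity than the number of segments.
fun sliderCentroid(counts: IntArray): Float {
    val total = counts.sum()
    if (total == 0) return -1f                            // no touch detected
    var weighted = 0f
    for (i in counts.indices) weighted += i * counts[i]   // weight segment index by its count
    return weighted / total                               // sub-segment resolution center position
}

fun main() {
    // Five slider segments; the finger sits between segments 2 and 3.
    println(sliderCentroid(intArrayOf(0, 10, 80, 60, 5)))  // ≈ 2.39
}
```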
In some embodiments, referring to fig. 9, an audio playing control method may include:
in step S510, it is determined whether a touch operation is detected by the touch sensor while the mobile terminal is playing audio.
In step S520, if a touch operation is detected, it is determined whether the touch operation is valid.
In some embodiments, referring to fig. 10, determining whether the touch operation is valid in step S520 may include:
in step S521, the touch force and the sliding distance corresponding to the touch operation are obtained.
In step S522, it is determined whether the touch force during the sliding is greater than or equal to the touch force threshold value, and whether the sliding distance is greater than or equal to the sliding distance threshold value.
In step S523, if the touch strength is greater than or equal to the touch strength threshold and the sliding distance is greater than or equal to the sliding distance threshold, it is determined that the touch operation is valid. In the present embodiment, it is possible to accurately determine whether a touch operation is effective by simultaneously detecting the touch force and the sliding distance.
In some embodiments, referring to fig. 11, determining whether the touch operation is valid in step S520 may further include:
in step S501, a touch force and a sliding distance corresponding to the touch operation are obtained.
Step S502, determining whether the touch force in the sliding process is greater than or equal to a touch force threshold.
In step S503, if the touch force is greater than or equal to the touch force threshold, it is determined whether the sliding distance is greater than or equal to the sliding distance threshold.
In step S504, if the sliding distance is greater than or equal to the sliding distance threshold, it is determined that the touch operation is valid. In this embodiment, the touch force is first compared with the touch force threshold, and the sliding distance is compared with the sliding distance threshold only when the touch force reaches its threshold; the touch operation is determined to be valid when both conditions are met.
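Both validity checks reduce to comparing the touch force and the sliding distance with their thresholds; the sketch below shows the two variants side by side, with threshold values that are assumptions for illustration:

```kotlin
// Sketch of the validity checks in figs. 10 and 11: a touch operation is treated as
// valid only when both the touch force and the sliding distance reach their thresholds.
const val FORCE_THRESHOLD = 0.3f
const val DISTANCE_THRESHOLD = 5f

// Fig. 10 variant: both conditions are evaluated together.
fun isValidSimultaneous(force: Float, distance: Float): Boolean =
    force >= FORCE_THRESHOLD && distance >= DISTANCE_THRESHOLD

// Fig. 11 variant: the force is checked first, and the distance only if the force passes.
fun isValidSequential(force: Float, distance: Float): Boolean {
    if (force < FORCE_THRESHOLD) return false
    return distance >= DISTANCE_THRESHOLD
}
```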
In step S530, if the touch operation is valid, it is determined whether the mobile terminal is in the screen-off state. Alternatively, if the touch operation is determined to be invalid, the process may return to step S510 to re-detect the touch operation, ensuring that the mobile terminal detects in real time whether a touch operation occurs.
Step S540, if the mobile terminal is in the screen-off state, touch information is generated according to the touch operation, and the audio playing action of the mobile terminal is updated according to the touch information.
In other embodiments, referring to fig. 12, an audio playing control method may include:
step S610, when the mobile terminal is playing audio, determining whether a touch operation is detected by a touch sensor;
step S620, if a touch operation is detected, determining whether the mobile terminal is in a screen-off state;
in step S630, if the touch screen is in the off-screen state, touch information is generated according to the touch operation.
Step S640, determining whether there is an audio play action associated with the touch information in the mobile terminal.
Specifically, the audio playing actions associated with touch information may be stored in the mobile terminal in advance; for example, the sliding distance in the touch information may be associated with the adjustment amount of the audio playing volume, and the sliding direction may be associated with the switching direction of the audio playing file. Checking whether the mobile terminal holds an audio playing action associated with the touch information further ensures that the user's audio playback control is accurate.
Step S650, if yes, updating the audio playing action of the mobile terminal according to the touch information; if not, returning to the step of determining whether the touch operation is detected by the touch sensor.
In some embodiments, if no audio playing action associated with the touch information exists in the mobile terminal, a reminder may be displayed on the mobile terminal indicating that the current touch operation is invalid. In other embodiments, if no associated audio playing action exists, the mobile terminal may prompt the user to associate the current touch operation with an audio playing action. If the user confirms, a list of audio playing actions is presented for selection; after the selection, when the user makes the same touch operation again, the mobile terminal performs the corresponding audio playing action, thereby realizing user-defined touch operations.
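As an illustration (the dispatch mechanism is not specified in the original), a simple lookup table keyed by gesture can implement steps S640 and S650 together with the user-defined binding described above; all names and the default bindings are assumptions:

```kotlin
enum class PlayAction { PAUSE, PLAY, SEEK, NEXT_FILE }

class ActionRegistry {
    private val bindings = mutableMapOf(
        "double_click" to PlayAction.PAUSE,
        "single_click" to PlayAction.PLAY,
        "slide" to PlayAction.SEEK
    )

    fun dispatch(gesture: String, perform: (PlayAction) -> Unit, onUnbound: (String) -> Unit) {
        val action = bindings[gesture]                                // S640: associated action present?
        if (action != null) perform(action) else onUnbound(gesture)  // S650
    }

    fun bind(gesture: String, action: PlayAction) { bindings[gesture] = action }  // user-defined binding
}

fun main() {
    val registry = ActionRegistry()
    registry.dispatch("slide",
        perform = { println("perform $it") },
        onUnbound = { println("no action bound to $it, ask the user to bind one") })
    registry.bind("long_press", PlayAction.NEXT_FILE)   // custom association chosen by the user
}
```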
In some embodiments, when pressure detection is performed using a capacitive touch sensor, finger-press detection uses self-capacitance sensing to detect a change in the substrate capacitance and thereby determine whether a finger press is present. As shown in figs. 13 and 14, a general-purpose input/output (GPIO) pin is connected through traces and vias to the sensor substrate, i.e., the printed circuit board, and the sensing-layer grid (which may be a sensor pad) is separated from the other sensors and their traces around it. Fig. 13 shows the state without a finger press: a parasitic capacitance C_P exists between the substrate and the grid, a small value that remains fixed as long as the surrounding environment does not change. Fig. 14 shows the state with a finger press: the conductivity and large mass of the human body form a grounded layer parallel to the sensor pad, so the finger and the pad constitute a parallel-plate capacitor. The capacitance between the sensor pad and the finger can be calculated by the following formula:
C_F = (ε0 · εr · A) / d

where ε0 is the dielectric constant of air, εr is the relative permittivity of the cover layer, A is the contact area between the finger and the cover layer of the sensor pad, d is the cover layer thickness, and C_F is the finger capacitance. Since the parasitic capacitance C_P and the finger capacitance C_F both represent a capacitance between the sensor pin and ground, they are identical in nature. Thus, when a finger contacts the sensor, the total sensor capacitance C_S equals the sum of C_P and C_F:

C_S = C_P + C_F
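As a worked example (not part of the original disclosure), the following snippet evaluates this formula for a hypothetical cover layer; the parameter values, relative permittivity 3, thickness 0.8 mm, and contact area 1 cm², are assumptions chosen only to show the order of magnitude of C_F:

```kotlin
fun main() {
    val epsilon0 = 8.854e-12        // F/m, permittivity of air (vacuum permittivity)
    val epsilonR = 3.0              // assumed relative permittivity of the cover layer
    val areaM2 = 1.0e-4             // assumed 1 cm^2 contact area between finger and cover
    val thicknessM = 0.8e-3         // assumed 0.8 mm cover-layer thickness
    val cF = epsilon0 * epsilonR * areaM2 / thicknessM
    println("C_F ≈ ${cF * 1e12} pF")   // ≈ 3.3 pF
}
```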
Referring to fig. 15, each GPIO of the capacitance control IC has a switched-capacitor circuit for converting the sensor capacitance value into an equivalent current. An analog multiplexer then selects one of the current signals and sends it to a current-to-digital converter, whose operating principle is similar to that of a delta-sigma ADC. The output count of the current-to-digital converter is a digital value proportional to the self-capacitance of each electrode. Thus, the capacitance sensor control IC can convert Cs into a corresponding digital count value in real time and identify whether a finger touch is present by analyzing the increment of the digital count value, as shown in fig. 16.
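The press-detection logic based on the increment of the digital count value can be sketched as follows; the baseline tracking and the threshold value are assumptions, since the text only states that a press is identified from the increment:

```kotlin
// Compare the current count with a no-touch baseline and treat the increment (delta)
// as a finger press when it exceeds a threshold.
class PressDetector(private val pressThreshold: Int = 200) {
    private var baseline: Int = 0
    private var initialized = false

    fun update(rawCount: Int): Boolean {
        if (!initialized) { baseline = rawCount; initialized = true; return false }
        val delta = rawCount - baseline          // increment caused by the finger capacitance C_F
        if (delta < pressThreshold) {
            // Slowly track environmental drift while no finger is present.
            baseline += (rawCount - baseline) / 8
            return false
        }
        return true                              // delta above threshold -> finger press detected
    }
}
```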
Specifically, in this embodiment, the mobile phone battery cover may take the form of a key: the key pad is implemented by a flexible printed circuit (Flexible Printed Circuit, FPC) coil attached to the back of the battery cover with double-sided tape and connected to a GPIO of the control IC through an FPC flat cable. Inside the IC, Cs is first charged and discharged by driving the GPIO so that its capacitance value is converted into an analog voltage waveform; the voltage waveform is then converted into a digital signal by the sigma-delta ADC and recorded as the digital count value corresponding to Cs.
In some embodiments, slide detection is based on the pressing principle: a series of capacitive sensors are placed adjacent to one another, so that increasing or decreasing input, and hence slide detection, can be realized; if the capacitive sensors are arranged in a horizontal-plus-vertical order, two-dimensional slide operation can be realized. As shown in fig. 17, the double V-shaped pattern areas represent the pads of six sensors. When a finger touch is detected, the slide is detected by analyzing which sensors the finger touches; this sensor arrangement is technically called a slider. Touching one slider segment also activates the adjacent segments, and the control IC calculates the geometric center position of the finger touch by processing the raw counts of the touched segment and its neighbors. Using the digital count value of each sensor to calculate the center position effectively improves the position resolution, which is far higher than the number of slider segments; for example, a slider containing five segments can resolve at least 100 physical finger positions.
Because the detection data at the center of the finger press is larger than at the edge positions, the IC records the position once the finger is detected. As the finger moves, the received data changes continuously, from which the update of the center position is determined. By recording the center position in real time, the magnitude of the finger displacement can be determined, realizing slide detection, and the control IC can calculate the sliding direction and displacement from this position information.
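A minimal sketch of this displacement tracking, with illustrative position units; the control IC behavior is summarized, not reproduced:

```kotlin
// Record the center position in real time and derive the sliding direction and
// displacement from the difference between successive positions.
class SlideTracker {
    private var lastCenter: Float? = null

    /** Returns the signed displacement since the previous sample (positive = toward higher positions). */
    fun onCenterPosition(center: Float): Float {
        val previous = lastCenter
        lastCenter = center
        return if (previous == null) 0f else center - previous
    }

    fun reset() { lastCenter = null }   // call when the finger lifts off
}

fun main() {
    val tracker = SlideTracker()
    listOf(2.4f, 2.9f, 3.6f).forEach { println(tracker.onCenterPosition(it)) }  // ≈ 0.0, 0.5, 0.7
}
```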
Slide detection can be applied to a mobile phone as shown in fig. 18: two sensor strips, one horizontal and one vertical, are attached to the battery cover on the back of the phone. Each slide bar uses 6 capacitive sensors, 12 in total, realizing two-dimensional slide detection. The sensors are connected to the control IC through the FPC, and the control IC is soldered on the main board and communicates with the AP over the IIC bus. In practice, 3 or more sensors may be used per slide bar to ensure the size and accuracy of the sliding touch area.
Referring to fig. 19, a block diagram of an audio play control device 500 according to an embodiment of the application may include: a touch operation detection module 510, a screen off detection module 530, and an audio playback control module 540; the touch operation detection module 510 is configured to determine, when the mobile terminal is playing audio, whether the touch sensor detects a touch operation; the screen-off detection module 530 is configured to determine whether the mobile terminal is in a screen-off state if a touch operation is detected; the audio playing control module 540 is configured to generate touch information according to the touch operation if the mobile terminal is in the off-screen state, and update an audio playing action of the mobile terminal according to the touch information.
Further, the off-screen detection module 530 is further configured to determine whether the touch operation is valid; if so, determining whether the mobile terminal is in a screen-off state.
Further, the screen-off detection module 530 is further configured to obtain a touch force and a sliding distance corresponding to the touch operation; determining whether the touch force in the sliding process is greater than or equal to a touch force threshold value and whether the sliding distance is greater than or equal to a sliding distance threshold value; and if the touch force is greater than or equal to the touch force threshold value and the sliding distance is greater than or equal to the sliding distance threshold value, determining that the touch operation is effective.
Further, the mobile terminal includes a first sensing area, the touch sensor is disposed in the first sensing area, and the audio playing control module 540 includes:
the first touch information detection unit is used for acquiring first touch information on the first sensing area, the first touch information corresponds to touch operation acted on the first sensing area, and the first touch information comprises a first sliding starting point and a first sliding ending point.
And the terminal coordinate acquisition unit is used for acquiring the terminal coordinate corresponding to the first sliding terminal on the first sensing area.
And the audio playing file updating unit is used for updating the audio playing file according to the first touch information if the end point coordinate is positioned at the first appointed position of the first sensing area.
Further, the mobile terminal sets a playing sequence of a plurality of audio files in advance, and the audio playing file updating unit is further configured to: determining a first sliding direction according to the first sliding starting point and the first sliding ending point; and selecting an audio file arranged before or after the currently played audio file from the plurality of audio files according to the first sliding direction for playing.
Further, the audio play file updating unit is further configured to: if the terminal point coordinate is at the second appointed position of the first induction area, determining a first sliding distance and a first sliding direction according to the first sliding starting point and the first sliding terminal point; determining the increasing direction of the audio playing progress according to the first sliding direction; determining the increment of the audio playing progress in the increasing direction according to the first sliding distance; and updating the audio playing progress according to the increasing direction and the increasing amount of the audio playing progress.
Further, the mobile terminal further comprises a second sensing area, and the touch sensor is further arranged in the second sensing area; the audio playing control module 540 further includes:
and the second touch information acquisition unit is used for acquiring second touch information on a second sensing area, the second touch information corresponds to touch operation acted on the second sensing area, and the second touch information comprises a second sliding starting point and a second sliding ending point.
And a second sliding distance and second sliding direction judging unit for determining a second sliding distance and a second sliding direction according to the second sliding start point and the second sliding end point.
And the volume adjusting direction judging unit is used for determining the adjusting direction of the audio playing volume according to the second sliding direction.
And the volume adjustment amount judging unit is used for determining the adjustment amount of the audio playing volume in the adjustment direction according to the second sliding distance.
And the volume updating unit is used for updating the audio playing volume according to the adjusting direction and the adjusting amount of the audio playing volume.
Further, the audio playing control module 540 is further used for determining whether an audio playing action associated with the touch information exists in the mobile terminal; if yes, updating the audio playing action of the mobile terminal according to the touch information; if not, returning to determining whether the touch sensor detects a touch operation.
Further, the first sensing area and the second sensing area are both arranged on the rear cover of the terminal body, the first sensing area and the second sensing area are both long-strip-shaped, and the first sensing area and the second sensing area are mutually perpendicular.
Further, the audio play action includes at least one of audio pause, audio fast forward, audio fast rewind, audio play progress update, audio play volume update, and audio play file selection.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, for the specific working process of the apparatus and modules described above, reference may be made to the corresponding process in the foregoing method embodiments, and details are not repeated herein.
In the several embodiments provided in the present application, the coupling, direct coupling, or communication connection between the modules shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or modules may be in electrical, mechanical, or other forms.
In addition, the functional modules in the embodiments of the present application may be integrated into one processing module, or each module may exist physically alone, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module.
Referring to fig. 20, a block diagram of a mobile terminal 600 according to an embodiment of the present application is shown. The mobile terminal 600 may be an electronic device capable of running application programs, such as a smart phone, a tablet computer, or an electronic book reader. The mobile terminal 600 of the present application may include one or more of the following components: a processor 610, a memory 620, and one or more application programs, wherein the one or more application programs may be stored in the memory 620 and configured to be executed by the one or more processors 610, and the one or more application programs are configured to perform the methods described in the foregoing method embodiments.
The processor 610 may include one or more processing cores. The processor 610 connects various parts of the entire mobile terminal 600 using various interfaces and lines, and performs the various functions of the mobile terminal 600 and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory 620 and by invoking the data stored in the memory 620. Optionally, the processor 610 may be implemented in at least one hardware form of digital signal processing (DSP), field-programmable gate array (FPGA), and programmable logic array (PLA). The processor 610 may integrate one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is responsible for rendering and drawing display content; and the modem is used to handle wireless communication. It will be appreciated that the modem may also not be integrated into the processor 610 and may instead be implemented by a separate communication chip.
The memory 620 may include a random access memory (RAM) or a read-only memory (ROM). The memory 620 may be used to store instructions, programs, code sets, or instruction sets. The memory 620 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, and an image playing function), instructions for implementing the foregoing method embodiments, and the like. The data storage area may also store data created by the terminal in use (such as a phonebook, audio and video data, and chat record data), and the like.
In addition, the mobile terminal 600 may further include a touch sensor 630 electrically connected to the processor 610, wherein the touch sensor 630 may be one or a combination of a capacitive touch sensor, an ultrasonic touch sensor, and an infrared touch sensor.
Referring to fig. 21, a block diagram of a computer readable storage medium according to an embodiment of the present application is shown. The computer readable storage medium 700 stores program code that can be invoked by a processor to perform the methods described in the foregoing method embodiments.
The computer readable storage medium 700 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read-only memory), an EPROM, a hard disk, or a ROM. Optionally, the computer readable storage medium 700 comprises a non-transitory computer readable storage medium. The computer readable storage medium 700 has storage space for program code 710 that performs any of the method steps described above. The program code can be read from or written to one or more computer program products. The program code 710 may, for example, be compressed in a suitable form.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will appreciate that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (12)

1. An audio playing control method, characterized by being applied to a mobile terminal, wherein the mobile terminal comprises a terminal body and a touch sensor arranged on the terminal body, the mobile terminal comprises a first sensing area, and the touch sensor is arranged in the first sensing area, and the method comprises the following steps:
determining whether the touch sensor detects a touch operation when the mobile terminal is playing audio;
if touch operation is detected, determining whether the mobile terminal is in a screen-off state;
if the mobile terminal is in the screen-off state, generating touch information according to the touch operation;
acquiring first touch information on the first sensing area, wherein the first touch information corresponds to touch operation acted on the first sensing area, and the first touch information comprises a first sliding starting point and a first sliding ending point;
acquiring an end point coordinate corresponding to the first sliding end point on the first sensing area;
if the end point coordinate is at the first designated position of the first sensing area, updating an audio playing file according to the first touch information;
the updating the audio playing file according to the first touch information includes:
and if the end point coordinate coincides with a first designated coordinate point within the first designated position of the first sensing area, playing the audio file corresponding to the first designated coordinate point, wherein the first designated position comprises a plurality of first designated coordinate points.
2. The method of claim 1, wherein the determining whether the mobile terminal is in an off-screen state comprises:
determining whether the touch operation is valid;
if so, determining whether the mobile terminal is in a screen-off state.
3. The method of claim 2, wherein the determining whether the touch operation is valid comprises:
acquiring touch force and sliding distance corresponding to the touch operation;
determining whether the touch force in the sliding process is greater than or equal to a touch force threshold value and whether the sliding distance is greater than or equal to a sliding distance threshold value;
and if the touch force is greater than or equal to a touch force threshold and the sliding distance is greater than or equal to a sliding distance threshold, determining that the touch operation is effective.
4. The method of claim 1, wherein a playing order of a plurality of audio files is preset in the mobile terminal, and updating the audio playing file according to the first touch information comprises:
Determining a first sliding direction according to the first sliding starting point and the first sliding ending point;
and selecting an audio file arranged before or after the currently played audio file from the plurality of audio files according to the first sliding direction for playing.
5. The method according to claim 1, wherein the method further comprises:
if the end point coordinate is at the second designated position of the first sensing area, determining a first sliding distance and a first sliding direction according to the first sliding starting point and the first sliding end point;
determining the increasing direction of the audio playing progress according to the first sliding direction;
determining an increase of the audio playing progress in the increasing direction according to the first sliding distance;
and updating the audio playing progress according to the increasing direction of the audio playing progress and the increasing amount.
6. The method of claim 1, wherein the mobile terminal further comprises a second sensing area, the touch sensor further disposed in the second sensing area; the updating the audio playing action of the mobile terminal according to the touch information further comprises:
acquiring second touch information on the second sensing area, wherein the second touch information corresponds to touch operation acted on the second sensing area, and the second touch information comprises a second sliding starting point and a second sliding ending point;
Determining a second sliding distance and a second sliding direction according to the second sliding starting point and the second sliding ending point;
determining an adjusting direction of the audio playing volume according to the second sliding direction;
determining the adjustment amount of the audio playing volume in the adjustment direction according to the second sliding distance;
and updating the audio playing volume according to the adjusting direction of the audio playing volume and the adjustment amount.
7. The method of claim 6, wherein the first sensing area and the second sensing area are both disposed on the rear cover of the terminal body, the first sensing area and the second sensing area are both elongated, and the first sensing area and the second sensing area are perpendicular to each other.
8. The method of claim 1, wherein the audio play action comprises at least one of an audio pause, an audio fast forward, an audio fast rewind, an audio play progress update, an audio play volume update, and an audio play file selection.
9. The method according to any one of claims 1 to 7, wherein before the updating of the audio playing action of the mobile terminal according to the touch information, the method comprises:
Determining whether an audio playing action associated with the touch information exists in the mobile terminal;
if yes, updating the audio playing action of the mobile terminal according to the touch information;
and if not, returning to the step of determining whether the touch operation is detected by the touch sensor.
10. An audio playing control device, characterized by being applied to a mobile terminal, wherein the mobile terminal comprises a terminal body and a touch sensor arranged on the terminal body, the mobile terminal comprises a first sensing area, the touch sensor is arranged in the first sensing area, and the device comprises:
the touch operation detection module is used for determining whether the touch sensor detects touch operation or not when the mobile terminal is playing audio;
the screen-off detection module is used for determining whether the mobile terminal is in a screen-off state or not if touch operation is detected;
the audio playing control module is used for generating touch information according to the touch operation if the mobile terminal is in the screen-off state;
the audio playing control module comprises a first touch information detection unit, an end point coordinate acquisition unit, and an audio playing file updating unit, wherein the first touch information detection unit is used for acquiring first touch information on the first sensing area, the first touch information corresponds to a touch operation acting on the first sensing area, and the first touch information comprises a first sliding starting point and a first sliding end point;
the end point coordinate acquisition unit is used for acquiring the end point coordinate corresponding to the first sliding end point on the first sensing area;
the audio playing file updating unit is used for updating the audio playing file according to the first touch information if the end point coordinate is at the first designated position of the first sensing area; the updating the audio playing file according to the first touch information includes: if the end point coordinate coincides with a first designated coordinate point within the first designated position of the first sensing area, playing the audio file corresponding to the first designated coordinate point, wherein the first designated position comprises a plurality of first designated coordinate points.
11. A mobile terminal, comprising:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications configured to perform the method of any of claims 1 to 9.
12. A computer readable storage medium having stored therein program code which is callable by a processor to perform the method of any one of claims 1 to 9.
CN201910657312.2A 2019-07-19 2019-07-19 Audio play control method and device, mobile terminal and storage medium Active CN110377266B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910657312.2A CN110377266B (en) 2019-07-19 2019-07-19 Audio play control method and device, mobile terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910657312.2A CN110377266B (en) 2019-07-19 2019-07-19 Audio play control method and device, mobile terminal and storage medium

Publications (2)

Publication Number Publication Date
CN110377266A CN110377266A (en) 2019-10-25
CN110377266B true CN110377266B (en) 2023-11-03

Family

ID=68254304

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910657312.2A Active CN110377266B (en) 2019-07-19 2019-07-19 Audio play control method and device, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN110377266B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110955376A (en) * 2019-11-26 2020-04-03 Oppo广东移动通信有限公司 Audio control method and device and electronic equipment
CN113608639A (en) * 2021-08-19 2021-11-05 东莞明轩技术有限公司 Three-dimensional touch signal induction module, three-dimensional touch panel and three-dimensional touch method
CN114070931B (en) * 2021-11-25 2023-08-15 咪咕音乐有限公司 Sound effect adjusting method, device, equipment and computer readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105407222A (en) * 2015-11-23 2016-03-16 东莞市金铭电子有限公司 Volume adjustment method and terminal
CN106383645A (en) * 2016-10-31 2017-02-08 维沃移动通信有限公司 Music playing control method and mobile terminal
WO2017032012A1 (en) * 2015-08-27 2017-03-02 广东欧珀移动通信有限公司 Volume adjusting method and user terminal
CN108549518A (en) * 2018-04-17 2018-09-18 Oppo广东移动通信有限公司 A kind of method, apparatus that music information is shown and terminal device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017032012A1 (en) * 2015-08-27 2017-03-02 广东欧珀移动通信有限公司 Volume adjusting method and user terminal
CN105407222A (en) * 2015-11-23 2016-03-16 东莞市金铭电子有限公司 Volume adjustment method and terminal
CN106383645A (en) * 2016-10-31 2017-02-08 维沃移动通信有限公司 Music playing control method and mobile terminal
CN108549518A (en) * 2018-04-17 2018-09-18 Oppo广东移动通信有限公司 A kind of method, apparatus that music information is shown and terminal device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design of a touch-controlled multimedia player based on AU7850; Guo Hanya; Electronic Technology; 2012-09-25 (No. 09); full text *

Also Published As

Publication number Publication date
CN110377266A (en) 2019-10-25

Similar Documents

Publication Publication Date Title
CN110377266B (en) Audio play control method and device, mobile terminal and storage medium
KR101932210B1 (en) Method, system for implementing operation of mobile terminal according to touching signal and mobile terminal
CN101859214B (en) Input device and input processing method using the same
KR101021440B1 (en) Touch-input device, mobile device and control method thereof
CN107066158B (en) Touch-sensitive button with two gears
US9323379B2 (en) Electronic device with a user interface that has more than two degrees of freedom, the user interface comprising a touch-sensitive surface and contact-free detection means
US20110310049A1 (en) Information processing device, information processing method, and information processing program
CN107294522A (en) Response method, device, storage medium and electronic equipment based on touch key-press
US8508500B2 (en) Touch panel electrical device and method for operating thereof
CN109766054B (en) Touch screen device and control method and medium thereof
CN104461344A (en) Floating touch method and float touch device
JP2009258946A (en) Capacitive touch sensor
KR102344353B1 (en) Device operated through opaque cover and system
CN107402712A (en) A kind of response method of touch operation, device, storage medium and terminal
CN108704307A (en) Processing method, device, storage medium and the electronic device of touch information
US20140348334A1 (en) Portable terminal and method for detecting earphone connection
CN109074211B (en) Method for adjusting interface scrolling speed and related equipment
CN106325471B (en) Device and control method
JP2013008263A (en) Input method and input device using input pad
JP2012137800A (en) Portable terminal
CN114356153A (en) Control method, control device, electronic equipment and storage medium
CN109388316B (en) Drawing method, drawing device, storage medium and electronic equipment
US20240143086A1 (en) Button with mechanical switch, electromagnetic sensor and haptic feedback, and method
US20190369732A1 (en) Systems and methods for providing localized pressure sensing and haptic effects for a touch surface
CN115550778A (en) Wireless earphone and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant