CN107203322B - Touch control type sound equipment key and control method thereof - Google Patents



Publication number
CN107203322B
Authority
CN
China
Prior art keywords: touch, ITO, event, determining, sliding
Prior art date
Legal status: Active (an assumption, not a legal conclusion)
Application number
CN201710335208.2A
Other languages
Chinese (zh)
Other versions
CN107203322A (en)
Inventor
Yang Kun (杨坤)
Guo Qi (郭奇)
Current Assignee
Goertek Techology Co Ltd
Original Assignee
Goertek Techology Co Ltd
Priority date
Filing date
Publication date
Application filed by Goertek Techology Co Ltd filed Critical Goertek Techology Co Ltd
Priority to CN201710335208.2A
Publication of CN107203322A
Application granted
Publication of CN107203322B

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 — Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 — Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/0484 — Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Abstract

The invention provides a touch-control speaker key and a control method thereof. The method comprises the following steps: determining a corresponding touch event according to a touch operation on the speaker's ITO touchpad; determining the correspondence between touch event types and control modes according to the speaker's placement posture; determining the control intent indicated by the touch event based on that correspondence; and controlling the speaker's working state according to the control intent indicated by the touch event. The technical scheme of the invention recognizes complex touch gestures under different placement postures and automatically controls the speaker's working state based on the recognition result, making the speaker key further intelligent.

Description

Touch control type sound equipment key and control method thereof
Technical Field
The invention relates to the field of interaction control strategies for smart speaker keys, and in particular to a touch-control speaker key and a control method thereof.
Background
Traditional speaker keys are built from mechanical structures. Mechanical keys are costly, require a complex housing, and each serves a single function: even simple functions such as volume adjustment, pause, and start need dedicated keys, which in turn consume a large number of controller pins. With the advent of smart speakers and the continuous evolution of UI interaction, speaker keys are being steadily simplified and redesigned.
Speaker keys with touch functionality have recently appeared. However, these keys support only a single touch function and cannot recognize complex touch gestures.
Disclosure of Invention
The invention provides a touch-control speaker key and a control method thereof, which recognize complex touch gestures and automatically control the speaker's working state according to the recognition result, making the speaker key further intelligent.
The touch-control speaker key provided by the invention comprises:
an ITO touchpad for detecting touch operations;
an inertial detector for detecting the speaker's placement posture;
a micro control unit for controlling the speaker's working state according to the touch operation detected by the ITO touchpad and the placement posture detected by the inertial detector; and a flexible circuit board connector connecting the ITO touchpad and the micro control unit.
Further optionally, the key further comprises: a temperature and humidity sensor, electrically connected to the micro control unit, for detecting the temperature and humidity of the speaker's environment.
Further optionally, the ITO touchpad comprises: an ITO conductive film, a capacitor array arranged below the ITO conductive film, and at least one touch detection channel connected to the capacitor array; the at least one touch detection channel is connected to the flexible circuit board connector.
Further optionally, the key further comprises a key housing provided with a groove; the ITO touchpad is embedded in the groove, the key housing is further provided with at least two mounting holes, and the at least two mounting holes are located on either side of the groove.
The invention also provides a control method for the touch-control speaker key, comprising:
determining a corresponding touch event according to a touch operation on the speaker's ITO touchpad;
determining the correspondence between touch event types and control modes according to the speaker's placement posture;
determining the control intent indicated by the touch event based on that correspondence;
and controlling the speaker's working state according to the control intent indicated by the touch event.
Further optionally, before capturing the touch operation on the ITO touchpad, the method further comprises: acquiring temperature and humidity information of the speaker's environment; and setting touch parameters of the ITO touchpad according to the temperature and humidity information.
Further optionally, setting the touch parameters of the ITO touchpad comprises at least one of: setting the sensitivity of the ITO touchpad; setting its anti-interference level; setting its oversampling level; and setting its gain coefficient.
Further optionally, determining a corresponding touch event according to a touch operation on the speaker's ITO touchpad comprises: if the touch duration of the touch operation falls within the duration range corresponding to a click event, determining that the touch operation corresponds to a click event.
Further optionally, if the touch duration exceeds the duration range corresponding to a click event, the method further comprises: calculating in real time the offset of the current touch point relative to the starting point on the ITO touchpad; and if the offset is greater than or equal to a preset deviation range, determining that the touch operation corresponds to a sliding event.
Further optionally, after determining that the touch operation corresponds to a sliding event, the method further comprises: determining the sliding direction of the sliding event according to the direction of the touch operation's end point relative to its starting point on the ITO touchpad; and/or determining the sliding distance of the sliding event according to the offset of the end point relative to the starting point.
Further optionally, after determining that the touch operation corresponds to a sliding event, the method further comprises: acquiring the inflection points of the touch operation's sliding path on the ITO touchpad; determining the sliding direction of the sliding event according to the touch operation's end point, starting point, and inflection points on the ITO touchpad; and/or determining the sliding distance of the sliding event according to the offsets between each pair of adjacent points among the starting point, inflection points, and end point.
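The distance computation just described — summing the offsets between adjacent points along the path — can be sketched as follows. This is an illustrative reconstruction, not the patent's firmware; the (x, y) point format and units are assumptions.

```python
from math import hypot

def slide_distance(start, end, inflections=()):
    """Sum of the offsets between each pair of adjacent points among the
    starting point, the inflection points (in order), and the end point."""
    points = [start, *inflections, end]
    return sum(hypot(b[0] - a[0], b[1] - a[1])
               for a, b in zip(points, points[1:]))
```

A straight slide reduces to the single start-to-end offset, while a path that doubles back through an inflection point accumulates both legs.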
Further optionally, if the offset is smaller than the deviation range, it is determined that the touch operation corresponds to a long-press event.
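Taken together, the click / sliding / long-press rules above amount to a small decision procedure. The sketch below is illustrative only: the click duration ceiling and the offset threshold stand in for the "duration range" and "deviation range" the patent leaves as calibration values.

```python
def classify_touch(duration_ms, offset_px, click_max_ms=300, slide_min_px=10):
    """Classify a touch operation:
    - duration within the click range          -> click event
    - longer, with offset >= deviation range   -> sliding event
    - longer, with offset below the range      -> long-press event
    """
    if duration_ms <= click_max_ms:
        return "click"
    if offset_px >= slide_min_px:
        return "slide"
    return "long_press"
```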
With the touch-control speaker key and control method provided herein, the speaker's working state is controlled according to the placement posture detected by the inertial detector and the touch event detected by the ITO touchpad; complex touch gestures are recognized under different placement postures, and the working state is controlled automatically based on the recognition result, making the speaker key further intelligent.
Drawings
To illustrate the embodiments of the present invention or the prior-art technical solutions more clearly, the drawings needed in their description are briefly introduced below. The drawings described here show only some embodiments of the invention; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1a is a schematic structural diagram of a touch-control speaker key according to an embodiment of the present invention;
Fig. 1b is a schematic structural diagram of an ITO touchpad 10 according to an embodiment of the present invention;
Fig. 1c is a schematic structural diagram of another touch-control speaker key according to an embodiment of the present invention;
Fig. 2 is a flowchart of a control method for a touch-control speaker key according to an embodiment of the present invention;
Fig. 3 is a flowchart of another control method for a touch-control speaker key according to an embodiment of the present invention;
Fig. 4 is a flowchart of a method for determining a touch event according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions are described below completely with reference to the drawings. The described embodiments are some, not all, of the invention's embodiments; all other embodiments obtained by those skilled in the art without creative effort fall within the protection scope of the invention.
Fig. 1a is a schematic structural diagram of a touch-control speaker key according to an embodiment of the present invention. Referring to Fig. 1a, the key includes: an ITO touchpad 10 for detecting touch operations; an inertial measurement unit (IMU) 11 for detecting the speaker's placement posture; a micro controller unit (MCU) 12 for controlling the speaker's working state according to the touch operation detected by the ITO touchpad 10 and the placement posture detected by the inertial detector 11; and a flexible circuit board connector 13 connecting the ITO touchpad 10 and the micro control unit 12.
Fig. 1b is a schematic structural diagram of an ITO touchpad 10 according to an embodiment of the present invention. As shown in Fig. 1b, the ITO touchpad 10 includes an ITO conductive film 101, a capacitor array 102 arranged below the film 101, and at least one touch detection channel 103 connected to the capacitor array 102; the at least one touch detection channel 103 is connected to the flexible circuit board connector 13. When a touch operation acts on the ITO conductive film 101, the capacitance of the array 102 changes, and the micro control unit 12 scans the horizontal and vertical lines of the capacitor array 102 in turn to determine the point of capacitance change and its coordinates.
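The row/column scan just described can be sketched as follows. This is a hypothetical illustration, not the patent's firmware: representing each sense line by a single capacitance-delta value and fixing a change threshold are both assumptions.

```python
def locate_touch(row_deltas, col_deltas, threshold=5):
    """Scan the horizontal then the vertical sense lines of the capacitor
    array and return the (row, col) of the capacitance-change point, or
    None if no line's change reaches the threshold."""
    row = max(range(len(row_deltas)), key=lambda i: row_deltas[i])
    col = max(range(len(col_deltas)), key=lambda i: col_deltas[i])
    if row_deltas[row] < threshold or col_deltas[col] < threshold:
        return None
    return (row, col)
```

The crossing of the strongest horizontal line and the strongest vertical line gives the touch coordinates, which is why the MCU can scan the two axes sequentially rather than every cell.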
The number of touch detection channels 103 is related to detection accuracy and may be set according to the key product type and the size of the ITO touchpad 10; the embodiments of the invention do not limit it. In an optional embodiment, for an ITO touchpad 10 of length 55 mm and width 7.3 mm, three touch detection channels may be provided, which avoids mutual interference between channels while keeping the touchpad's detection accuracy high.
The inertial detector 11 detects the speaker's placement posture, which may be a forward (upright) posture or an inverted posture. It should be understood that the detected posture is not limited to these two, since the speaker's outer shape may allow other postures. For example, for a speaker whose enclosure has twelve faces, any face may serve as the resting face, and the inertial detector 11 may determine the current posture by measuring the angle between the resting face and a predefined forward face.
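An angle-based posture classification of the kind just described might look like the sketch below. The tolerance band and the posture labels are assumptions for illustration, not values from the patent.

```python
def classify_posture(angle_deg, tolerance=10.0):
    """Map the angle between the current resting face and the predefined
    forward face to a posture label."""
    a = angle_deg % 360
    if a <= tolerance or a >= 360 - tolerance:
        return "forward"
    if abs(a - 180) <= tolerance:
        return "inverted"
    return "other"
```

For a many-faced enclosure, the "other" branch would be refined to identify which face is down; two postures suffice for the forward/inverted example in the text.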
The micro control unit 12 is electrically connected to the inertial detector 11 and receives the placement posture data it detects. After receiving the data, the micro control unit 12 automatically adjusts the preconfigured correspondence between touch event types and control modes according to the placement posture.
In an optional embodiment, the initial configuration assumes the speaker is placed forward: a sliding event whose direction points from the speaker's first side to its second side corresponds to the volume-up control mode, and a sliding event from the second side to the first corresponds to the volume-down control mode. When the micro control unit 12 learns that the speaker has been inverted, it swaps the mapping: a first-to-second slide now corresponds to volume-down, and a second-to-first slide to volume-up.
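The posture-dependent remapping can be sketched as a table swap. The side names, posture labels, and mode names below are illustrative, not identifiers from the patent.

```python
def control_map(posture):
    """Return the touch-event -> control-mode table for the given posture;
    inverting the speaker swaps the meaning of the two slide directions."""
    if posture == "inverted":
        return {
            ("slide", "first_to_second"): "volume_down",
            ("slide", "second_to_first"): "volume_up",
        }
    return {
        ("slide", "first_to_second"): "volume_up",
        ("slide", "second_to_first"): "volume_down",
    }
```

Because the table, not the gesture recognizer, changes with posture, the same physical gesture always produces the same audible result regardless of how the speaker is resting.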
Optionally, in the embodiment of the present invention, a flexible printed circuit (FPC) connector 13 connects the ITO touchpad 10 and the micro control unit 12. The FPC connector 13 is small, light, and flexible, which facilitates assembly in confined spaces.
In this embodiment, the main structure of the touch-control speaker key consists of the ITO touchpad 10, the inertial detector 11, and the micro control unit 12. The micro control unit 12 receives the complex touch operations detected by the ITO touchpad 10 and resolves them into touch events of different types. Based on these events and the placement posture detected by the inertial detector 11, it infers the user's control intent and controls the speaker's working state accordingly. Complex touch gestures are thus recognized under different placement postures, and the speaker's working state is controlled automatically from the recognition result, making the speaker key further intelligent.
Fig. 1c is a schematic structural diagram of another touch-control speaker key according to an embodiment of the present invention. As shown in Fig. 1c, this key further includes a temperature and humidity sensor 16, electrically connected to the micro control unit 12, for detecting the temperature and humidity of the speaker's environment.
After the temperature and humidity sensor 16 measures the environment, it sends a temperature/humidity signal to the micro control unit 12. The micro control unit 12 sets the touch parameters of the ITO touchpad 10 according to the temperature and humidity information carried by the signal, so that the touchpad's performance adapts to different temperature and humidity environments, improving detection accuracy and user experience.
Optionally, the parameters set by the micro control unit 12 include at least one of the ITO touchpad's sensitivity, anti-interference level, oversampling level, and gain coefficient. For example, when the temperature and humidity information exceeds a certain threshold, the micro control unit 12 may raise the touchpad's anti-interference and oversampling levels and increase its gain coefficient.
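One way the threshold rule above could be realized is sketched below. The humidity threshold, the baseline values, and the raised values are all assumptions for illustration; the patent only says the levels are raised above some threshold.

```python
def tune_touch_params(humidity_pct, humidity_threshold=70):
    """Return a touch-parameter set for the current humidity: above the
    threshold, raise the anti-interference and oversampling levels and
    increase the gain coefficient."""
    params = {"sensitivity": 1, "anti_interference": 1,
              "oversampling": 1, "gain": 1.0}
    if humidity_pct > humidity_threshold:
        params.update(anti_interference=2, oversampling=2, gain=1.5)
    return params
```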
In this embodiment, adding the temperature and humidity sensor 16 to the touch-control speaker key lets the micro control unit 12 automatically configure the touchpad's touch parameters according to the temperature and humidity of the speaker's environment, further improving the key's environmental adaptability and the user experience.
Further optionally, as shown in Figs. 1a and 1c, the touch-control speaker key includes a key housing 14 provided with a groove, in which the ITO touchpad 10 is embedded. The embedding may be by clamping, adhesion, or another means; the embodiments of the invention are not limited in this respect. In Figs. 1a and 1c, the key housing 14 is further provided with at least two mounting holes 15, located on either side of the groove, which are used to mount the touch-control key on the speaker.
Fig. 2 is a flowchart of a control method for a touch-control speaker key according to an embodiment of the present invention. With reference to Fig. 2, the method includes:
Step 201: determining a corresponding touch event according to a touch operation on the speaker's ITO touchpad.
Step 202: determining the correspondence between touch event types and control modes according to the speaker's placement posture.
Step 203: determining the control intent indicated by the touch event based on that correspondence.
Step 204: controlling the speaker's working state according to the control intent indicated by the touch event.
In step 201, the ITO touchpad sits on the speaker's touch key, and the user drives the speaker to a target state by performing touch operations on it.
A touch operation is any operation that causes a capacitance change in the touchpad's capacitor array: a press at a single coordinate point, simultaneous presses at several points, or a press whose coordinates change continuously.
A touch event is an event that the MCU can recognize and that corresponds to some touch operation. To map to an event, a touch operation generally has to satisfy a preconfigured recognition rule — its duration reaching a threshold, its direction meeting a direction requirement, or its path matching a path rule. Depending on which rule is satisfied, touch events divide into types such as click events, long-press events, and sliding events; for a sliding event, attributes such as its sliding direction, sliding distance, and sliding path can further be identified.
For step 202, the speaker's placement posture may be forward, inverted, or another possible posture, as described in the preceding embodiments.
Control modes are the MCU's different ways of controlling the speaker's working state: power on/off, play/pause, previous/next track, fast forward/rewind, volume up/down, and so on. The correspondence between touch event types and control modes is configured in the MCU in advance.
In an optional embodiment, when preconfiguring this correspondence, the forward posture may be taken as the reference placement, and the control modes for the different touch event types defined at initialization.
For example, the correspondence may be: a click event corresponds to the pause/play mode; a sliding event directed from the ITO touchpad's first side to its second side corresponds to the volume-up mode, with the volume increase linearly related to the sliding distance; a sliding event from the second side to the first corresponds to the volume-down mode, with the decrease likewise linear in the sliding distance; a long-press event corresponds to the power on/off mode; a sliding event from the third side to the fourth corresponds to switching to the next track; and a sliding event from the fourth side to the third corresponds to switching to the previous track. The four sides are not limited here: for example, the first side may be the touchpad's upper edge when placed normally and the second side the opposing lower edge, while the third side may be the left edge and the fourth side the opposing right edge.
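The example correspondence above can be written out as a lookup table, with the linear distance-to-volume relation as a separate helper. The event and mode names and the millimetres-per-step scale factor are illustrative assumptions.

```python
# Forward-posture mapping from the example above; first/second are the
# top/bottom edges and third/fourth the left/right edges of the touchpad.
EVENT_TO_MODE = {
    "click": "play_pause",
    "long_press": "power_on_off",
    "slide_first_to_second": "volume_up",
    "slide_second_to_first": "volume_down",
    "slide_third_to_fourth": "next_track",
    "slide_fourth_to_third": "previous_track",
}

def volume_steps(slide_distance_mm, mm_per_step=5.0):
    """Linear relation between slide distance and volume-change magnitude."""
    return int(slide_distance_mm / mm_per_step)
```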
It should be understood that the above definitions are for illustrative purposes only and are not limiting upon the embodiments of the present invention. In the embodiment of the present invention, the corresponding relationship between the touch event type and the control mode may also be set by a user in a user-defined manner, which is not described herein.
Generally, if the preset correspondence between touch event types and control modes is kept fixed, the gestures a user performs on the touch key must change with the speaker's placement posture. For example, with the speaker placed forward, a user raising the volume might slide from bottom to top on the ITO touchpad; with the speaker inverted, the same user would have to slide from top to bottom.
In one plausible situation, a speaker designed to be fully symmetric top-to-bottom for the sake of appearance gives the user no way to tell the forward from the inverted orientation, which makes gesturing confusing. Even a user who can tell the orientations apart must stop and work out which gesture to use for the current posture, adding to the memory burden of the gestures.
To remove this defect, in this embodiment the MCU, after determining the touch event for a touch operation, also obtains the speaker's placement posture and updates the preconfigured correspondence between touch event types and control modes accordingly.
Optionally, the posture is obtained via an inertial detector. The IMU monitors changes in the placement posture in real time and uploads the detected change signal to the MCU, which analyzes the signal, determines the current posture, and fixes the correspondence between touch event types and control modes.
Optionally, if the current posture is forward, the preset correspondence can be used directly; if it is inverted, the correspondence must be updated. After the update, a slide from the touchpad's second side toward its first side corresponds to the volume-up mode, and a slide from the first side toward the second to the volume-down mode. The user can then control the speaker's working state directly through the touch key, following the predefined gesture-to-mode correspondence, without regard to the speaker's placement posture.
For step 203, having determined the touch event in step 201 and the event-type-to-control-mode correspondence in step 202, the MCU queries that correspondence to find the control mode for the event determined in step 201, and from that mode determines the user's control intent: turning the power on or off, playing or pausing a track, switching tracks, fast-forwarding or rewinding the current track, raising or lowering the volume, and so on.
It should be understood that, within the control mode corresponding to a given touch event type, the control intents indicated by that type's events correspond one-to-one, so the MCU can read off the user's intent from the event-type-to-mode correspondence. For example, if the click event type corresponds to the play/pause mode, a click event indicates the intent to play or pause the current track; if the long-press type corresponds to the power on/off mode, a long-press event indicates the intent to switch the speaker's power.
In step 204, once the control intent is obtained, the speaker's working state is controlled: if the intent is to skip to the next track, the MCU switches the track.
In this embodiment of the invention, the speaker's working state is controlled from the placement posture together with the touch event; complex touch gestures are recognized under different placement postures, and the working state is controlled automatically from the recognition result, making the speaker key further intelligent.
Fig. 3 is a flowchart of another control method for a touch-control speaker key according to an embodiment of the present invention. With reference to Fig. 3, the method includes:
Step 301: acquiring temperature and humidity information of the speaker's environment.
Step 302: setting the touch parameters of the ITO touchpad according to the temperature and humidity information.
Step 303: determining a corresponding touch event according to a touch operation on the speaker's ITO touchpad.
Step 304: determining the correspondence between touch event types and control modes according to the speaker's placement posture.
Step 305: determining the control intent indicated by the touch event based on that correspondence.
Step 306: controlling the speaker's working state according to the control intent indicated by the touch event.
Step 301 may be implemented with a temperature and humidity sensor: the sensor converts the measured temperature and humidity into an electrical signal the MCU can process, and the MCU analyzes the signal to obtain the temperature and humidity information of the speaker's environment.
The ITO touchpad is directly exposed to the air, and the air's temperature and humidity affect its capacitance — water droplets or grit on the touchpad surface, for instance, can directly cause false touches. In this embodiment, after the sensor measures the temperature and humidity of the touchpad's environment, step 302 is executed to overcome these defects.
In step 302, the touch parameters of the ITO touchpad include its sensitivity, anti-interference level, oversampling level, gain coefficient, and the like. At least one of these parameters is set according to the temperature and humidity information, keeping the touch-control key performing well under different temperature and humidity conditions.
For example, when the ambient humidity rises, the touchpad's sensitivity, anti-interference level, oversampling level, or gain coefficient may be raised to keep touch detection reliable with a small amount of water on the touchpad.
With the touch parameters set in steps 301 and 302, steps 303 to 306 are executed to control the speaker's working state. For their specific implementation, refer to the description of the corresponding embodiment of Fig. 2, not repeated here.
In this embodiment, through the humiture information that detects the stereo set environment, and then according to the humiture automatic configuration touch pad's of stereo set environment touch parameter, further promoted the adaptability of touch-control formula stereo set button to the environment, promoted user experience.
Fig. 4 is a schematic flowchart of a method for determining a touch event according to an embodiment of the present invention, and with reference to fig. 4, in an embodiment of the present invention, a flow of determining a touch event by an ITO touchpad is as follows:
step 401, responding to the touch operation of the ITO touch panel, and recording the touch duration of the touch operation.
Step 402, judging whether the touch duration is in a duration range corresponding to the clicking event; if so, go to step 403; if not, go to step 404.
And step 403, determining that the touch operation corresponds to a click event.
And step 404, calculating the offset of the current touch point corresponding to the touch operation on the ITO touch pad relative to the starting point in real time.
Step 405, judging whether the calculated offset is larger than or equal to a preset deviation range; if so, go to step 406; if not, go to step 407.
And step 406, determining that the touch operation corresponds to the sliding event.
Step 407, determining that the touch operation corresponds to a long press event.
In step 401, the MCU monitors the capacitance change of the ITO touch pad in real time, and determines that a touch operation is performed on a coordinate point where the capacitance change occurs if the capacitance change is detected at the coordinate point. And continuously recording the time length of the capacitance change as the touch time length of the touch operation before the capacitance change stops.
For step 402, the duration range corresponding to the click event is predefined, and in this step, it needs to be compared whether the touch duration of the touch operation detected in the previous step is within the predefined duration range corresponding to the click event.
In step 403, if the touch duration is within the duration range corresponding to the click event, it may be determined that the touch operation corresponds to the click event. After the click event is determined, the MCU can inquire the control intention corresponding to the click event and control the working state of the sound according to the preset corresponding relation between the touch event type and the control mode.
For example, in the MCU, a preconfigured click event corresponds to play/pause of a currently playing track, and the MCU may pause the playing track or play the paused track according to the control intention of the user.
In step 404, the corresponding starting point of the touch operation on the ITO touch pad is the point where the capacitance change is first detected for the current touch operation. And (4) touching a corresponding current touch point on the ITO touch pad, namely a coordinate point of capacitance change generated on the ITO touch pad at the current moment.
If the touch duration exceeds the duration range corresponding to the clicking event, the touch duration can be continuously recorded, and the offset of the current touch point relative to the starting point is calculated according to the coordinate change condition of the point generating the capacitance change.
For step 405, the deviation range is predefined. In this step, after the offset of the current touch point with respect to the starting point is obtained, it is necessary to determine whether the current offset reaches a predefined offset range.
Referring to step 406, if the current offset amount has reached the predefined deviation range, it may be determined that the touch operation corresponds to a sliding event. And the MCU continuously monitors the coordinate change of the capacitance change point generated on the ITO touch pad and calculates the offset of the current touch point relative to the starting point in real time. If there is a decreasing change in the offset amount while continuing to increase, the point at which the decreasing change occurs is designated as an inflection point. Similarly, if there is an increasing change in the offset amount while continuing to decrease, the point at which the increasing change occurs is designated as an inflection point. And when the MCU no longer detects the point with the capacitance change on the ITO touch pad, marking the last point with the capacitance change as a termination point of the touch operation on the ITO touch pad.
Optionally, in the process of determining that the touch operation corresponds to the sliding event, the MCU may determine the sliding direction and/or the sliding distance corresponding to the sliding event according to the detected start point and end point of the sliding operation.
In an application scenario, when the touch operation has no direction change on the ITO touch pad, that is, no inflection point is detected, the MCU may determine the corresponding sliding direction of the sliding event according to the direction of the corresponding end point on the ITO touch pad relative to the start point. In addition, the MCU can also determine the sliding distance corresponding to the sliding event according to the offset of the detected end point relative to the starting point. For example, if the starting point is located near the first side of the touch-sensitive audio key and the ending point is located near the second side of the touch-sensitive audio key, it may be determined that the sliding direction corresponding to the sliding event is: determining a sliding distance corresponding to the sliding event from the direction from the first side edge to the second side edge as follows: the coordinate offset between the end point and the start point.
In another application scenario, the touch operation has a direction change on the ITO touch pad, which means that there is an inflection point in the corresponding sliding process of the touch operation on the ITO touch pad. Based on the method, the MCU determines the sliding direction and/or the sliding distance corresponding to the sliding event according to the detected starting point, the detected ending point and the detected inflection point of the sliding operation.
For example, the MCU may acquire an inflection point of a touch operation in a corresponding sliding process on the ITO touch pad. After the inflection point is obtained, the sliding direction of the sliding event is determined according to the corresponding termination point, starting point and inflection point of the touch operation on the ITO touch pad. In addition, the MCU can also determine the sliding distance of the sliding event according to the starting point, the ending point and the offset of two adjacent points between inflection points.
Optionally, in a sliding event, there may be a plurality of inflection points, and correspondingly, there may also be a plurality of sliding directions. Theoretically, if there are N inflection points in a slip event, where N is a positive integer, the slip event may include N +1 slip directions.
The following section will explain how to determine the sliding direction of the sliding event according to the corresponding end point, start point and inflection point of the touch operation on the ITO touch pad in combination with a specific example. For example, in a sliding event, the sliding path is from the start point to the inflection point and from the inflection point to the end point by the second offset amount. Then, the swipe event comprises a first swipe direction with the start point pointing to the inflection point and a second swipe direction with the inflection point pointing to the end point. For another example, in a sliding event, the sliding path is to slide a first offset amount from the starting point to a first inflection point, a second offset amount from the first inflection point to a second inflection point, and a third offset amount from the second inflection point to an ending point. Then, the sliding event includes a first sliding direction in which the start point points to the first inflection point, a second sliding direction in which the first inflection point points to the second inflection point, and a third sliding direction in which the second inflection point points to the end point.
Optionally, when an inflection point is detected, the sliding distance of the sliding event is: in the process from the beginning to the end of sliding, overlapping the offsets of two adjacent points among the starting point, at least one inflection point and the ending point; if the sliding direction indicated by two adjacent points is consistent with the defined positive direction, the offset between the two adjacent points is a positive value when the two adjacent points are superposed. If two adjacent points indicate a sliding direction opposite to the defined positive direction, the offset between the two adjacent points is negative when superimposed.
After the sliding event is determined, the MCU can inquire the control intention corresponding to the sliding event and control the working state of the sound according to the preset corresponding relation between the type of the touch event and the control mode.
Suppose, in the correspondence between the touch event type and the control mode pre-configured in the MCU: a sliding event with the direction from the first side edge to the second side edge of the ITO touch pad as the sliding direction corresponds to the volume increasing mode, and the volume increasing amplitude and the sliding distance of the sliding event have a linear relation; a sliding event with the direction from the second side edge of the ITO touch pad to the first side edge as the sliding direction corresponds to the volume reduction mode, and the volume reduction amplitude and the sliding distance of the sliding event have a linear relation; correspondingly switching to the next song mode by using the sliding event with the direction from the third side to the fourth side of the ITO touch pad as the sliding direction; and correspondingly switching to the previous song mode by using the sliding event taking the direction of the fourth side edge of the ITO touch pad pointing to the third side edge as the sliding direction.
In an alternative embodiment, after determining that the touch operation corresponds to the sliding event, the overall direction corresponding to the sliding event may be determined preliminarily according to the direction of the current touch point relative to the starting point, that is, the sliding direction is parallel to the direction in which the first side points to the second side, or the sliding direction is parallel to the direction in which the third side points to the fourth side, so as to determine that the control intention of the user is to control the increase/decrease of the sound playing volume, or the switching of the up/down tracks.
If the overall direction corresponding to the sliding event is parallel to the direction in which the third side points to the fourth side, it can be determined that the control intention of the user is switching between the up/down tracks. The MCU can further judge that the concrete direction is the direction that the third side points to the fourth side, or the direction that the fourth side points to the third side and according to the control intention of the user, switch the current playing song to the previous song or the next song.
If the overall direction corresponding to the sliding event is parallel to the direction in which the first side points to the second side, it can be determined that the control intention of the user is to control the volume increase/decrease of the sound playing volume. At this time, the MCU needs to control the playing volume of the audio in real time according to the direction and offset of the current touch point relative to the starting point.
For example, if the direction corresponding to the sliding event is a direction in which the first side points to the second side, it may be determined that the control intention of the user is to increase the volume. The MCU can increase the playing volume of the sound according to the volume increase amplitude indicated by the offset of the current touch point relative to the starting point, and control the playing volume of the sound to adaptively change along with the change of the offset generated by the movement of the current touch point so as to generate the effect that the volume changes gradually along with the change of the sliding distance.
Similarly, if the direction corresponding to the sliding event is the direction in which the second side points to the first side, it may be determined that the control intention of the user is to decrease the volume. The MCU can reduce the playing volume of the sound according to the volume reduction amplitude indicated by the offset of the current touch point relative to the starting point, and control the playing volume of the sound to adaptively change along with the change of the offset generated by the movement of the current touch point so as to generate the effect that the volume changes gradually along with the change of the sliding distance.
Optionally, the effect that the volume changes gradually with the change of the sliding distance may be that, in the process of sliding along the direction from the first side edge to the second side edge, the volume is continuously increased along with the increase of the offset of the current touch point relative to the starting point; when the inflection point is detected, the touch screen slides along the direction in which the second side points to the first side, and the volume is gradually reduced along with the increase of the offset of the current touch point relative to the inflection point. The volume may be continuously decreased as the offset of the current touch point with respect to the starting point increases in the process of continuously sliding in the direction in which the second side edge points to the first side edge; when the inflection point is detected, the touch screen slides along the direction in which the first side points to the second side, and the volume is gradually increased along with the increase of the offset of the current touch point relative to the inflection point. In the above process, if the inflection point is detected again, the size of the sound is adjusted according to the direction adaptability of the inflection point relative to the previous inflection point, which is not described again.
It should be understood that after the above-mentioned process of gradually changing the volume with the change of the sliding distance, as for the overall change of the volume from the sliding start time to the sliding end time, the overall volume difference matches the volume increase/decrease amplitude indicated by the sliding distance corresponding to the sliding event, and details are not repeated.
In step 407, when the touch duration exceeds the duration range corresponding to the click event and the offset between the start point and the end point is smaller than the preset deviation range, it is determined that the touch operation is a long-time press operation. After the long press event is determined, the MCU can inquire the control intention corresponding to the long press event and control the working state of the sound according to the preset corresponding relation between the type of the touch event and the control mode.
For example, in the MCU, a preset long press event corresponds to a power on/off mode of the audio, and the MCU may turn on the audio power being turned off or turn off the audio power being turned on according to a control intention of a user.
In this embodiment, the MCU realizes the recognition of the complex gesture by monitoring the time of the capacitance change on the ITO touch panel and the coordinate change of the point where the capacitance change occurs. Further, recognized complex gestures may be resolved into different types of touch events to enable multi-modal control of the sound. Therefore, only one ITO touch pad is needed to replace a plurality of traditional complicated mechanical keys, occupation of the MCU port is greatly reduced, and user interaction experience is improved.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (11)

1. A touch-sensitive audio button, comprising:
an ITO touch panel for detecting a touch operation;
an inertia detector for detecting the sound placement posture;
the micro control unit is used for controlling the working state of the sound according to the touch operation detected by the ITO touch pad and the sound placing posture detected by the inertia detector; and the number of the first and second groups,
the flexible circuit board connector is used for connecting the ITO touch pad and the micro control unit;
wherein the micro control unit is specifically configured to: determining a corresponding touch event according to touch operation on an ITO touch pad of the sound equipment; determining the corresponding relation between the touch event type and the control mode according to the placing posture of the sound equipment; determining a control intention indicated by the touch event based on the corresponding relation between the touch event type and a control mode; and controlling the working state of the sound equipment according to the control intention indicated by the touch event.
2. The touch-sensitive audio button of claim 1, further comprising:
and the temperature and humidity sensor is used for detecting the temperature and humidity of the environment where the sound equipment is located and is electrically connected with the micro control unit.
3. The touch-sensitive audio button of claim 1, wherein the ITO touchpad comprises: the touch detection device comprises an ITO conductive film, a capacitor array arranged below the ITO conductive film and at least one touch detection channel connected to the capacitor array; the at least one touch detection channel is connected with the flexible circuit board connector.
4. The touch-sensitive audio button of any one of claims 1-3, further comprising: a key shell provided with a groove; the ITO touch pad is embedded in the groove; the key shell is further provided with at least two mounting holes, and the at least two mounting holes are located on two sides of the groove.
5. A control method of a touch-control sound key, wherein the touch-control sound key is installed on a sound, the method comprising:
determining a corresponding touch event according to touch operation on an ITO touch pad of the sound equipment;
determining the corresponding relation between the touch event type and the control mode according to the placing posture of the sound equipment;
determining a control intention indicated by the touch event based on the corresponding relation between the touch event type and a control mode;
and controlling the working state of the sound equipment according to the control intention indicated by the touch event.
6. The method of claim 5, wherein before determining the corresponding touch event according to the touch operation on the ITO touch pad of the stereo, further comprising:
acquiring temperature and humidity information of the environment where the sound equipment is located;
and setting the touch parameters of the ITO touch pad according to the temperature and humidity information.
7. The method according to claim 6, wherein the setting of the touch parameters of the ITO touch panel comprises at least one of the following:
setting the sensitivity of the ITO touch panel;
setting the anti-interference level of the ITO touch pad;
setting the oversampling level of the ITO touch pad;
and setting the gain coefficient of the ITO touch pad.
8. The method of claim 5, wherein determining the corresponding touch event based on a touch operation on an acoustic ITO touch pad comprises:
and if the touch duration of the touch operation is within the duration range corresponding to the clicking event, determining that the clicking event corresponds to the touch operation.
9. The method of claim 8, wherein if the touch duration of the touch operation exceeds the duration range corresponding to the single-click event, the method further comprises:
calculating the offset of the current touch point corresponding to the touch operation on the ITO touch pad relative to the starting point in real time;
and if the offset is larger than or equal to a preset deviation range, determining a sliding event corresponding to the touch operation.
10. The method of claim 9, wherein determining that the touch operation corresponds to a slide event further comprises:
determining a corresponding sliding direction of the sliding event according to the direction of the corresponding end point of the touch operation on the ITO touch pad relative to the starting point, and/or,
and determining the sliding distance corresponding to the sliding event according to the offset of the end point relative to the starting point.
11. The method of claim 9, wherein determining that the touch operation corresponds to a slide event further comprises:
acquiring inflection points of the touch operation in the corresponding sliding process on the ITO touch pad;
determining the sliding direction of the sliding event according to the corresponding termination point, the starting point and the inflection point of the touch operation on the ITO touch pad; and/or the presence of a gas in the gas,
and determining the sliding distance of the sliding event according to the starting point, the ending point and the offset of two adjacent points between the inflection points.
CN201710335208.2A 2017-05-12 2017-05-12 Touch control type sound equipment key and control method thereof Active CN107203322B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710335208.2A CN107203322B (en) 2017-05-12 2017-05-12 Touch control type sound equipment key and control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710335208.2A CN107203322B (en) 2017-05-12 2017-05-12 Touch control type sound equipment key and control method thereof

Publications (2)

Publication Number Publication Date
CN107203322A CN107203322A (en) 2017-09-26
CN107203322B true CN107203322B (en) 2021-05-18

Family

ID=59905912

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710335208.2A Active CN107203322B (en) 2017-05-12 2017-05-12 Touch control type sound equipment key and control method thereof

Country Status (1)

Country Link
CN (1) CN107203322B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108111893B (en) * 2017-12-29 2021-06-15 深圳Tcl新技术有限公司 Sliding direction estimation method of touch remote controller, remote controller and computer storage medium
CN108200494B (en) * 2017-12-29 2020-04-07 上海传英信息技术有限公司 Earphone touch control method and earphone
CN112703745A (en) * 2018-11-27 2021-04-23 深圳市柔宇科技股份有限公司 Sound box
CN112817513A (en) * 2021-01-20 2021-05-18 歌尔科技有限公司 Audio playing device touch control method, audio playing device and storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102375581B (en) * 2010-08-11 2016-06-08 深圳富泰宏精密工业有限公司 Touch type electronic device and the method improving its touch accuracy
CN201805414U (en) * 2010-09-10 2011-04-20 江苏惠通集团有限责任公司 ITO (Indium Tin Oxide) film approach inductive key
CN102768572A (en) * 2011-05-04 2012-11-07 宏碁股份有限公司 Hand-hold device and key function setting method of hand-hold device
CN103576578B (en) * 2013-11-05 2017-04-12 小米科技有限责任公司 Method, device and equipment for adopting earphone wire to control terminal
US20150363026A1 (en) * 2014-06-16 2015-12-17 Touchplus Information Corp. Control device, operation mode altering method thereof, control method thereof and battery power warning method thereof
CN105373299A (en) * 2014-08-25 2016-03-02 深圳富泰宏精密工业有限公司 Electronic apparatus and display interface adjustment method therefor
KR20160065503A (en) * 2014-12-01 2016-06-09 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN204362249U (en) * 2014-12-26 2015-05-27 歌尔声学股份有限公司 A kind of wireless sound system
EP3311627B1 (en) * 2015-06-22 2021-05-05 Loose Cannon Systems, Inc. Portable group communication device

Also Published As

Publication number Publication date
CN107203322A (en) 2017-09-26

Similar Documents

Publication Publication Date Title
CN107203322B (en) Touch control type sound equipment key and control method thereof
US10101874B2 (en) Apparatus and method for controlling user interface to select object within image and image input device
KR102209910B1 (en) Coordinate measuring apparaturs which measures input position of coordinate indicating apparatus and method for controlling thereof
WO2017080110A1 (en) Touch signal-based mobile terminal operation method, system and mobile terminal
CN103888703B (en) Strengthen image pickup method and the camera head of recording
US20140270414A1 (en) Auxiliary functionality control and fingerprint authentication based on a same user input
KR101240406B1 (en) program operation control method of portable information or communication terminal using force sensor
US9715823B2 (en) Remote control device
US20140270413A1 (en) Auxiliary device functionality augmented with fingerprint sensor
CN105573538B (en) Sliding broken line compensation method and electronic equipment
US20150109221A1 (en) Method, device, and electronic terminal for unlocking
CN104364745A (en) Apparatus and method for proximity touch sensing
CN109074221B (en) Selective attenuation of sound for display devices
KR20150107755A (en) Using distance between objects in touchless gestural interfaces
KR20170107987A (en) Information processing apparatus, information processing method, program, and system
US10082878B2 (en) Method for controlling and calibrating a device with a gesture
KR102186815B1 (en) Method, apparatus and recovering medium for clipping of contents
CN108376030B (en) Electronic equipment control method and device and electronic equipment
KR101210538B1 (en) Apparatus and method of interface for mobile device, and recording medium for the same
TWI536033B (en) Object detection method and device
CN103324410A (en) Method and apparatus for detecting touch
EP3282680B1 (en) Blowing action-based method for operating mobile terminal and mobile terminal
KR101134245B1 (en) Electronic device including 3-dimension virtualized remote controller and driving methed thereof
KR20140137629A (en) Mobile terminal for detecting earphone connection and method therefor
CN105677124A (en) Method for display control based on finger hovering detection

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant