CN113364914A - Control device, mobile terminal, electronic apparatus, and computer storage medium - Google Patents


Info

Publication number
CN113364914A
CN113364914A (application CN202010148666.7A)
Authority
CN
China
Prior art keywords
control, mobile terminal, gesture, near field communication
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010148666.7A
Other languages
Chinese (zh)
Inventor
黄沛雄
Current Assignee
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN202010148666.7A
Publication of CN113364914A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485: Scrolling or panning
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on GUIs using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/16: Sound input; Sound output
    • G06F 3/165: Management of the audio stream, e.g. setting of volume, audio stream path
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04R: LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R 1/00: Details of transducers, loudspeakers or microphones
    • H04R 1/10: Earpieces; Attachments therefor; Earphones; Monophonic headphones
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/80: Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Abstract

Embodiments of the invention provide a control device, a mobile terminal, an electronic device, and a computer storage medium. The control device includes: a sensing module that senses a first gesture input by a user in a touch area; and a control module that, according to the first gesture, instructs the mobile terminal over first near field communication to perform audio control and/or call control. According to the solution of the embodiments, a gesture can be captured on the touch area of the control device and control performed based on that gesture, so the user need not directly touch the earphone itself or a physical key of the mobile terminal; the associated discomfort is avoided and more convenient audio control is realized.

Description

Control device, mobile terminal, electronic apparatus, and computer storage medium
Technical Field
Embodiments of the present invention relate to the field of communications technologies, and in particular, to a control device, a mobile terminal, an electronic device, and a computer storage medium.
Background
Earphone products, combined with a mobile phone, give the user a relatively private audio-visual experience. At the same time, the earphone's call answering and dialing functions let the user establish communication conveniently, which is a valuable service in scenarios such as driving.
Current earphone products have entered a new generation of wireless connectivity, eliminating long-standing drawbacks such as earphone cables that are inconvenient to carry or easily damaged. Typically, a wireless headset achieves interactive control with the handset by means such as tapping the headset or pressing buttons on it. However, there is still room for improvement in audio control.
Disclosure of Invention
In view of this, embodiments of the present invention provide a control device, a mobile terminal, an electronic device, and a computer storage medium, which can implement more convenient audio control.
According to a first aspect of embodiments of the present invention, there is provided a control apparatus, including: a sensing module configured to sense a first gesture input by a user in a touch area; and a control module configured to perform audio control on a mobile terminal through first near field communication according to the first gesture.
According to a second aspect of embodiments of the present invention, there is provided a mobile terminal, including a receiving module, a display module, and a control module, wherein the receiving module is configured to receive a control message sent by a control device through first near field communication according to a first gesture, the first gesture being input by a user in a touch area of the control device, and the control module is configured to control the mobile terminal according to the control message.
According to a third aspect of embodiments of the present invention, there is provided an electronic apparatus, the apparatus including:
one or more processors; a computer readable medium configured to store one or more programs that, when executed by the one or more processors, cause the one or more processors to implement: sensing a first gesture input by a user in a touch area; according to the first gesture, carrying out audio control on the mobile terminal through first near field communication, or receiving a control message sent by a control device through first near field communication according to the first gesture, wherein the first gesture is input by a user in a touch area of the control device; and controlling the mobile terminal according to the control message.
According to a fourth aspect of embodiments of the present invention, there is provided a computer readable medium having stored thereon a computer program which when executed by a processor implements: sensing a first gesture input by a user in a touch area; according to the first gesture, carrying out audio control on the mobile terminal through first near field communication, or receiving a control message sent by a control device through first near field communication according to the first gesture, wherein the first gesture is input by a user in a touch area of the control device; and controlling the mobile terminal according to the control message.
According to the solution of the embodiments of the invention, a gesture can be captured on the touch area of the control device and control performed based on that gesture, so the user need not directly touch the earphone itself or a physical key of the mobile terminal; the associated discomfort is avoided and more convenient audio control is realized.
Drawings
To illustrate the embodiments of the present invention or the prior-art technical solutions more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some of the embodiments of the present invention; a person skilled in the art can derive other drawings from them.
FIG. 1A is a schematic block diagram of a control device according to an embodiment of the present invention;
FIG. 1B is a schematic diagram of an implementation of a touch gesture according to another embodiment of the present invention;
FIG. 1C is a schematic block diagram of an example of connection relationships involving a control device according to another embodiment of the present invention;
FIG. 1D is a schematic diagram of an example arrangement of touch areas on a wireless headset according to another embodiment of the present invention;
FIG. 1E is a schematic block diagram of another example arrangement of touch areas on a wireless headset according to another embodiment of the present invention;
FIG. 2 is a schematic block diagram of a mobile terminal according to another embodiment of the present invention;
FIG. 3 is a schematic block diagram of an electronic device according to another embodiment of the present invention;
FIG. 4 shows a hardware configuration of an electronic device according to another embodiment of the present invention.
Detailed Description
To help those skilled in the art better understand the technical solutions in the embodiments of the present invention, those solutions are described below clearly and completely with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention fall within the scope of protection of the embodiments of the present invention.
The following further describes specific implementation of the embodiments of the present invention with reference to the drawings.
Fig. 1A is a schematic block diagram of a control apparatus according to an embodiment of the present invention. The control device of Fig. 1A includes:
the sensing module 110 senses a first gesture input by a user in the touch area;
and the control module 120 instructs the mobile terminal to perform audio control and/or call control through the first near field communication according to the first gesture.
It should be understood that the control device here may be, for example, a portable control device. The form, material, color, size, and weight of the control device are not limited; control devices in any of these variations fall within the scope of the embodiments of the present invention. Preferably, the control device includes a compactly designed touch area to facilitate touch operation; for example, the touch area is disposed on a surface of the control device, e.g. as the sensing area of a touch module of the control device. The audio control above includes, but is not limited to, control of audio playback and of the audio of video playback. The call control includes, but is not limited to, answering or hang-up control of a voice call. The audio playback control includes, but is not limited to, volume control, playback-content switching control, and the like.
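As a non-normative illustration of the control taxonomy just described (audio playback control, call control, volume control, content switching), the categories might be modeled as follows; all names here are our own, not from the patent.

```python
from enum import Enum, auto

class ControlType(Enum):
    """Illustrative control categories; names are ours, not the patent's."""
    PLAY_PAUSE = auto()    # audio playback control
    NEXT_TRACK = auto()    # playback-content switching
    PREV_TRACK = auto()
    VOLUME_UP = auto()     # volume control
    VOLUME_DOWN = auto()
    ANSWER_CALL = auto()   # call control: answer a voice call
    HANG_UP = auto()       # call control: hang up / reject

AUDIO_CONTROLS = {ControlType.PLAY_PAUSE, ControlType.NEXT_TRACK,
                  ControlType.PREV_TRACK, ControlType.VOLUME_UP,
                  ControlType.VOLUME_DOWN}
CALL_CONTROLS = {ControlType.ANSWER_CALL, ControlType.HANG_UP}
```

A control message sent over the first near field communication would then carry one of these categories plus any parameter (e.g. a volume level).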
According to the solution of this embodiment of the invention, a gesture can be captured on the touch area of the control device and control performed based on that gesture, so the user need not directly touch the earphone itself or a physical key of the mobile terminal; the associated discomfort is avoided and more convenient audio control is realized.
Moreover, because the solution of this embodiment uses a dedicated touch area, it can reduce the possibility of erroneous operation compared with controlling audio on the mobile terminal itself: a mobile terminal usually adopts multi-touch technology, carries many applications, and presents varied interfaces and complex interactions.
In one implementation of the invention, audio control is realized through a wired or wireless earphone, and the touch area is arranged on one surface of the control device. This benefits the industrial design, integration, and simplification of the earphone itself, as well as the convenience of the interactive experience.
In one implementation of the present invention, the first gesture includes, but is not limited to, at least one of a sliding touch, a tap, a long press, and a short press performed in the touch area; embodiments of the present invention cover any form of touch operation, including combinations of different forms.
In one implementation of the present invention, the control module is specifically configured to: send, according to the first gesture, a playback control message to the mobile terminal through the first near field communication, so as to control playback on the mobile terminal.
In one implementation of the present invention, the control module is specifically configured to: send, according to the first gesture, a voice call control message to the mobile terminal through the first near field communication, so as to control a voice call of the mobile terminal.
In one implementation of the present invention, the control module is specifically configured to: send, according to the first gesture, a volume control message to the mobile terminal through the first near field communication, so as to control the volume of a call and/or of playback on the mobile terminal.
It should be appreciated that, as a specific example of the first gesture, reference may be made to the schematic diagram of touch gesture implementations shown in FIG. 1B. FIG. 1B illustrates (a) gestures based on a curved trajectory, e.g. a clockwise or counterclockwise swipe; (b) gestures based on a one-way trajectory, e.g. slide left, slide right, slide up, slide down; and (c) gestures based on touch time (e.g. judged against a preset time threshold) and touch strength (e.g. judged against a preset strength threshold), such as a light touch, a heavy touch, a long press (e.g. 3 seconds or more), and a short press (e.g. less than 3 seconds, for example 1 second). It should be understood that any combination of these three families yields further touch operations, and that the functions of the different touch operations can be implemented in software or hardware so as to correspond to respective functions of the mobile terminal, including but not limited to functions of applications installed on the mobile terminal and functions of its operating system, including interface calls.
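The three gesture families of FIG. 1B could be distinguished roughly as in the sketch below. The 3-second long-press threshold follows the example in the text; the normalized pressure threshold and the function names are assumptions for illustration.

```python
LONG_PRESS_SECONDS = 3.0   # per the example above: 3 s or more is a long press
PRESSURE_THRESHOLD = 0.5   # assumed normalized split between light and heavy

def classify_press(duration_s: float, pressure: float) -> str:
    """Classify a stationary touch by time and strength (family (c))."""
    kind = "long_press" if duration_s >= LONG_PRESS_SECONDS else "short_press"
    strength = "heavy" if pressure >= PRESSURE_THRESHOLD else "light"
    return f"{strength}_{kind}"

def classify_swipe(dx: float, dy: float) -> str:
    """Classify a one-way trajectory (family (b)) by its dominant axis."""
    if abs(dx) >= abs(dy):
        return "slide_right" if dx > 0 else "slide_left"
    return "slide_up" if dy > 0 else "slide_down"
```

Curved-trajectory gestures (family (a)) would additionally examine the sign of the accumulated angular change along the trajectory to tell clockwise from counterclockwise.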
Different touch operations or gestures may be bound to the above operations and corresponding functions in any combination, and a left-hand or right-hand mode may be provided to facilitate personalized operation. As one example: a clockwise swipe is defined as headset volume-up control; a counterclockwise swipe as volume-down control; sliding from right to left as switching to the previous song; sliding from left to right as switching to the next song; sliding up as answering a call; sliding down as rejecting or hanging up a call; tapping the touch area as pause/play control; long-pressing the touch area as smart wake-up of the wireless headset; and short-pressing the touch area as entering a pairing mode with the wireless headset or the mobile terminal. It should be appreciated that this multi-gesture design greatly improves the convenience of the headset in different application scenarios.
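The example bindings above amount to a simple lookup table. A sketch follows (gesture and action names are our own), with the left-hand mode mentioned above modeled, as one possible personalization, by mirroring the horizontal swipes:

```python
# Default (right-hand) bindings, following the examples in the text.
GESTURE_BINDINGS = {
    "swipe_clockwise": "volume_up",
    "swipe_counterclockwise": "volume_down",
    "slide_left": "previous_track",   # right-to-left swipe
    "slide_right": "next_track",      # left-to-right swipe
    "slide_up": "answer_call",
    "slide_down": "reject_or_hang_up",
    "tap": "pause_or_play",
    "long_press": "wake_headset",
    "short_press": "enter_pairing_mode",
}

def left_hand_bindings(bindings: dict) -> dict:
    """Assumed personalization: mirror the horizontal swipes for left-hand use."""
    mirrored = dict(bindings)
    mirrored["slide_left"], mirrored["slide_right"] = (
        bindings["slide_right"], bindings["slide_left"])
    return mirrored
```

A recognized gesture is simply looked up in the active table and the resulting action name sent to the mobile terminal as the control message.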
In one implementation of the present invention, the mobile terminal is connected to an earphone and includes a first volume control module and a second volume control module. In response to a connection-established message between the mobile terminal and the earphone, the first volume control module switches from an electrical connection with the speaker of the mobile terminal to an electrical connection with the speaker of the earphone, while the second volume control module is electrically connected to the speaker of the earphone. The earphone volume control message instructs the second volume control module to perform volume control on the speaker of the earphone.
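A minimal sketch of the speaker routing just described, with class and method names assumed: on the connection-established message the first volume control module switches from the terminal speaker to the earphone speaker, and the earphone volume control message drives the second module.

```python
class Speaker:
    def __init__(self, name: str):
        self.name = name
        self.volume = 50  # arbitrary initial level

class VolumeModule:
    def __init__(self, speaker=None):
        self.speaker = speaker
    def attach(self, speaker):
        self.speaker = speaker
    def set_volume(self, level: int):
        if self.speaker is None:
            raise RuntimeError("no speaker attached")
        self.speaker.volume = max(0, min(100, level))  # clamp to 0..100

class TerminalAudio:
    def __init__(self):
        self.terminal_speaker = Speaker("terminal")
        self.first_volume = VolumeModule(self.terminal_speaker)
        self.second_volume = VolumeModule()  # not yet connected
    def on_headset_connected(self, earphone_speaker: Speaker):
        # Connection-established message: the first module switches from the
        # terminal speaker to the earphone speaker; the second module is
        # connected to the earphone speaker.
        self.first_volume.attach(earphone_speaker)
        self.second_volume.attach(earphone_speaker)
    def on_earphone_volume_message(self, level: int):
        # The earphone volume control message drives the second module.
        self.second_volume.set_volume(level)
```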
In an implementation manner of the present invention, a wireless headset is connected to a mobile terminal through a second near field communication, and the control message instructs the mobile terminal to perform on-off control on a voice interaction microphone disposed in the wireless headset through the second near field communication.
Communication in this embodiment of the invention is realized using the first near field communication and the second near field communication, preserving the user's freedom of movement.
It should be understood that a wireless headset here includes any headset controlled or used over a wireless link, including but not limited to smart headsets such as true wireless stereo (TWS) headsets. The wireless headset may or may not include a microphone. By establishing a communication connection with the mobile terminal, the wireless headset may provide functions such as audio or video-audio playback and may assist the mobile terminal in making phone calls or voice calls.
It should also be understood that the first near field communication and the second near field communication may use the same or different communication schemes; embodiments of the present invention do not limit this. For example, either may be BLE, WIFI, ZigBee, Z-Wave, a long-range wide-area network, A2DP, NFC, or the like. Preferably, the second near field communication has a higher bandwidth than the first; for example, the second may be realized over A2DP and the first over BLE.
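The stated preference, that the second (audio) link have higher bandwidth than the first (control) link, e.g. A2DP versus BLE, can be expressed as a selection rule. The relative ranking below is an assumption for illustration only; the patent gives no bandwidth figures.

```python
# Illustrative relative bandwidth ranking (assumed; higher = more bandwidth).
TRANSPORT_RANK = {"NFC": 0, "BLE": 1, "A2DP": 2, "WIFI": 3}

def pick_links(candidates: list) -> tuple:
    """Return (control_link, audio_link) so that the audio (second) link is
    the higher-bandwidth one, per the stated preference."""
    ordered = sorted(candidates, key=TRANSPORT_RANK.__getitem__)
    return ordered[0], ordered[-1]
```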
In other words, embodiments of the present invention can separate control interaction from the headset itself, in particular the control functions of taps and keys. Because control is realized through a dedicated touch area, the comfort of the earphone worn on the ear is not affected.
In other words, the scheme of the embodiments can improve the operability of the earphone without being constrained by the earphone's size.
In one implementation of the invention, the headset control message instructs the mobile terminal to perform volume control on a speaker disposed in the wireless headset via the second near field communication.
In one implementation of the invention, the headset control message instructs the mobile terminal to switch a voice interaction microphone disposed in the wireless headset via the second near field communication; for example, a switching signal toggles the microphone between a voice standby state and a sleep state. It should be understood that in the sleep (off) state the microphone consumes extremely little power, far less than in real-time standby. Controlling the smart headset's microphone via gestures allows voice interaction to be turned on and off quickly without keeping the headset in real-time standby, coordinating better with the smart-interaction function itself and saving power. In one implementation, the control device is a charging box, which has greater power storage than the mobile terminal and the wireless headset, further saving power for both. Since a smart headset with a voice interaction function is particularly power-hungry, this embodiment balances energy consumption against responsiveness.
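The microphone switching described above can be sketched as a two-state machine relayed through the mobile terminal; state and message names are assumptions.

```python
class VoiceMicrophone:
    """Voice interaction microphone with a standby state and a low-power sleep state."""
    def __init__(self):
        self.state = "sleep"  # default to the low-power state
    def toggle(self):
        self.state = "standby" if self.state == "sleep" else "sleep"

class TerminalRelay:
    """Relays the gesture-derived control message to the headset microphone."""
    def __init__(self, mic: VoiceMicrophone):
        self.mic = mic
    def on_control_message(self, msg: str):
        if msg == "toggle_voice_mic":  # assumed message name
            self.mic.toggle()
```

Defaulting to the sleep state is the design choice the text motivates: the microphone draws meaningful power only while the user has explicitly woken it.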
FIG. 1C is a schematic diagram of the wireless connections involving a control device according to one embodiment of the present invention. In this embodiment the wireless headset is a smart headset that communicates with the mobile terminal over A2DP, while the charging box of the smart headset has a touch area as shown and communicates with the mobile terminal over BLE. The smart headset has a speaker for listening and a voice interaction microphone, e.g. for voice interaction between the user and the headset. Gestural control of the smart headset is thus realized, with different gestures mapped to different audio controls, improving user experience while saving power and preserving mobility.
In one implementation of the present invention, the sensing module is further configured to sense a second gesture input by the user in the touch area, and the control module is further configured to establish the first near field communication according to the second gesture.
In one implementation of the present invention, the mobile terminal is connected to a wireless headset through the second near field communication, and the control module is specifically configured to: send a request to establish the first near field communication to the mobile terminal according to the second gesture; and establish the first near field communication in response to a notification, sent by the mobile terminal, indicating that the second near field communication between the mobile terminal and the wireless headset has been established.
In one implementation of the present invention, the mobile terminal is connected to a wireless headset through the second near field communication, and the control module is specifically configured to: establish the first near field communication in response to the second gesture, in the case where a notification indicating that the second near field communication between the mobile terminal and the wireless headset has been established has already been received from the mobile terminal.
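The two establishment flows above differ only in ordering (request first versus notification already received). A sketch of the first flow follows; class, method, and message names are assumptions, and real radio transports are replaced by direct calls.

```python
class ControlDevice:
    """Charging-box side: establishes the first NFC link per the second gesture."""
    def __init__(self, terminal):
        self.terminal = terminal
        self.first_link_up = False
    def on_second_gesture(self):
        # Step 1: send the terminal a request to establish the first link.
        self.terminal.request_first_nfc(self)
    def on_notification(self, second_link_established: bool):
        # Step 2: establish the first link once the terminal confirms that its
        # second link to the wireless headset is up.
        if second_link_established:
            self.first_link_up = True

class MobileSide:
    """Mobile-terminal side: answers with the state of its second NFC link."""
    def __init__(self):
        self.second_link_up = False
    def connect_headset(self):
        self.second_link_up = True
    def request_first_nfc(self, device: ControlDevice):
        device.on_notification(self.second_link_up)
```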
In one implementation manner of the embodiment of the present invention, the control device further includes: the first charging module is used for wirelessly controlling charging of the wireless earphone.
In one implementation of the embodiment of the present invention, the apparatus is an earphone charging box and includes a second charging module having a charging port for charging the wireless earphone. For example, the control means may be implemented as the charging box of a wireless headset. It should be understood that a charging box used as the control device already offers considerable mobility, and placing the touch control on the charging box combines operability with that mobility.
In one implementation of the embodiment of the present invention, the control device further includes a third charging module having a charging port for charging the mobile terminal. For example, the control means may be implemented as a charging box (e.g. a charger) of the mobile terminal.
According to the solution of this embodiment of the invention, a gesture can be captured on the touch area of the control device and control performed based on that gesture, avoiding the uncomfortable experience of directly touching the earphone itself and enabling more convenient interaction. In addition, because communication uses the first and second near field communication, freedom of movement is preserved. The control device may be implemented as the charging box of a wireless headset, a charging box (e.g. a charger) of the mobile terminal, or the like.
In another implementation of the present invention, the touch area is disposed on a side surface of the earphone charging box. For example, FIG. 1D is a schematic diagram of an example arrangement of touch areas according to another embodiment of the present invention. As shown in FIG. 1D, the earphone charging box is flat, with a charging port for the wireless earphone disposed on its top surface. At least one of a sliding touch, a tap, a long press, and a short press may be performed on the side surface; for example, a sliding touch on the side, and a tap, long press, or short press on the top or bottom surface. As shown in FIG. 1D, clockwise and counterclockwise slides may trigger different operations. The above examples are only exemplary; any similar arrangement of the touch areas, and combinations of the arrangements described herein, fall within the scope of the present invention.
In another implementation of the invention, the earphone charging box is cylindrical, with a charging port for the wireless earphone provided on the top surface or the bottom surface of the cylinder. For example, FIG. 1E is a schematic block diagram of another example arrangement of touch areas according to another embodiment of the present invention. As shown in FIG. 1E, the wireless headset is a pair of two earphones, and two charging ports, one for each earphone, are disposed on the top and bottom surfaces of the cylinder respectively. At least one of a sliding touch, a tap, a long press, and a short press may be performed on the side surface; for example, a sliding touch on the side, and a tap, long press, or short press on the top or bottom surface. As shown in FIG. 1E, sliding along the side of the cylinder in one direction or the other may trigger different operations. The above examples are only exemplary; any similar arrangement of the touch areas, and combinations of the arrangements described herein, fall within the scope of the present invention.
FIG. 2 is a schematic block diagram of a mobile terminal according to another embodiment of the present invention. The mobile terminal of FIG. 2 includes:
the receiving module 210 receives a control message sent by the control device through a first near field communication according to a first gesture, wherein the first gesture is input by a user in a touch area of the control device;
and the control module 220 performs audio control and/or call control according to the control message.
In one implementation of the embodiment of the present invention, the receiving module is further configured to receive a request to establish the first near field communication, sent by the control device according to a second gesture input by the user in the touch area of the control device, and, in response to the request, to send the control device a notification indicating that the second near field communication between the mobile terminal and the wireless headset has been established, so that the first near field communication can be established.
According to the solution of this embodiment of the invention, a gesture can be captured on the touch area of the control device and control performed based on that gesture, so the user need not directly touch the earphone itself or a physical key of the mobile terminal; the associated discomfort is avoided and more convenient audio control is realized.
In this solution, when audio interaction is realized through the earphone of the mobile terminal, the gesture can be captured on the touch area of the control device and further control performed based on it, avoiding the uncomfortable experience of directly touching the earphone itself and enabling more convenient interaction. In addition, because communication uses the first and second near field communication, freedom of movement is preserved.
In an implementation manner of the present invention, the mobile terminal is connected to a wireless headset through a second near field communication, and the control message instructs the mobile terminal to perform on-off control on a voice interaction microphone disposed in the wireless headset through the second near field communication.
Fig. 3 is a schematic structural diagram of an electronic device according to a sixth embodiment of the present application; the electronic device may include:
one or more processors 301;
a computer-readable medium 302, which may be configured to store one or more programs that,
when executed by the one or more processors, cause the one or more processors to perform: sensing a first gesture input by a user in a touch area; performing audio control on the mobile terminal through the first near field communication according to the first gesture, or,
receiving a control message sent by a control device through first near field communication according to a first gesture, wherein the first gesture is input by a user in a touch area of the control device; and controlling the mobile terminal according to the control message.
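The terminal-side behavior just described — receiving a control message and routing it to audio control or call control — can be sketched as follows. The message strings and the dispatch sets are assumptions for illustration:

```python
# Minimal sketch of the mobile terminal's control module: a control message
# received over the first near field communication is dispatched to audio
# control or call control. The concrete message names are hypothetical.

AUDIO_MESSAGES = {"play_pause", "volume_up", "volume_down"}
CALL_MESSAGES = {"answer_or_hang_up"}


def handle_control_message(message, audio_log, call_log):
    """Dispatch a received control message to the matching control path.

    `audio_log` and `call_log` stand in for the terminal's audio-control and
    call-control handlers; unknown messages raise an error.
    """
    if message in AUDIO_MESSAGES:
        audio_log.append(message)  # perform audio control
    elif message in CALL_MESSAGES:
        call_log.append(message)   # perform call control
    else:
        raise ValueError(f"unknown control message: {message}")
```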
Fig. 4 is a hardware structure of an electronic device according to a seventh embodiment of the present application; as shown in fig. 4, the hardware structure of the electronic device may include: a processor 401, a communication interface 402, a computer-readable medium 403, and a communication bus 404;
wherein the processor 401, the communication interface 402, and the computer-readable medium 403 are in communication with each other via a communication bus 404;
alternatively, the communication interface 402 may be an interface of a communication module;
the processor 401 may be specifically configured to: sensing a first gesture input by a user in a touch area; performing audio control on the mobile terminal through the first near field communication according to the first gesture, or,
receiving a control message sent by a control device through first near field communication according to a first gesture, wherein the first gesture is input by a user in a touch area of the control device; and controlling the mobile terminal according to the control message.
Processor 401 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. It may implement or perform the various methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The computer-readable medium 403 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code configured to perform the method illustrated by the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section, and/or installed from a removable medium. The computer program, when executed by a Central Processing Unit (CPU), performs the above-described functions defined in the method of the present application. It should be noted that the computer readable medium described herein may be a computer readable signal medium, a computer readable storage medium, or any combination of the two. The computer readable medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code configured to carry out operations for the present application may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the remote computer case, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions configured to implement the specified logical function(s). In the above embodiments, specific precedence relationships are provided, but these precedence relationships are only exemplary, and in particular implementations, the steps may be fewer, more, or the execution order may be modified. That is, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present application may be implemented by software or hardware. The described modules may also be provided in a processor, which may be described as: a processor comprising a receiving module and a control module. The names of these modules do not, in some cases, constitute a limitation on the modules themselves.
As another aspect, the present application also provides a computer readable medium having stored thereon a computer program which, when executed by a processor, implements: sensing a first gesture input by a user in a touch area; according to the first gesture, the mobile terminal is indicated to carry out audio control and/or call control through first near field communication, or a control message sent by a control device through first near field communication according to the first gesture is received, wherein the first gesture is input by a user in a touch area of the control device; and carrying out audio control and/or call control according to the control message.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments; or may be present separately and not assembled into the device. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: sensing a first gesture input by a user in a touch area; according to the first gesture, the mobile terminal is indicated to carry out audio control and/or call control through first near field communication, or a control message sent by a control device through first near field communication according to the first gesture is received, wherein the first gesture is input by a user in a touch area of the control device; and carrying out audio control and/or call control according to the control message.
The expressions "first", "second", "said first", or "said second" used in various embodiments of the present disclosure may modify various components regardless of order and/or importance, but these expressions do not limit the respective components. They are used only to distinguish one element from another. For example, a first user equipment and a second user equipment represent different user equipments, although both are user equipments. Likewise, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the present disclosure.
When an element (e.g., a first element) is referred to as being (operatively or communicatively) "coupled" or "connected" to another element (e.g., a second element), it is understood that the element is directly connected to the other element or is indirectly connected to the other element via yet another element (e.g., a third element). In contrast, when an element (e.g., a first element) is referred to as being "directly connected" or "directly coupled" to another element (e.g., a second element), it is understood that no element (e.g., a third element) is interposed therebetween.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (22)

1. A control device, comprising:
the sensing module senses a first gesture input by a user in the touch area;
and the control module indicates the mobile terminal to carry out audio control and/or call control through first near field communication according to the first gesture.
2. The apparatus of claim 1, wherein the control module is specifically configured to: and according to the first gesture, sending a play control message to the mobile terminal through the first near field communication so as to control the play of the mobile terminal.
3. The apparatus of claim 1, wherein the control module is specifically configured to: and according to the first gesture, sending a call control message to the mobile terminal through the first near field communication so as to control the call of the mobile terminal.
4. The apparatus of claim 1, wherein the control module is specifically configured to: and according to the first gesture, sending a volume control message to the mobile terminal through the first near field communication so as to control the volume of the call of the mobile terminal and/or the play of the mobile terminal.
5. The apparatus of claim 4, wherein the mobile terminal is connected to an earphone, the mobile terminal comprises a first volume control module and a second volume control module, the first volume control module switches from electrical connection with the speaker of the mobile terminal to electrical connection with the speaker of the earphone in response to a connection setup message of the mobile terminal and the earphone, and the second volume control module is electrically connected to the speaker of the earphone, wherein the earphone volume control message instructs the second volume control module to perform volume control on the speaker disposed in the earphone.
6. The apparatus of claim 1, wherein the mobile terminal has a wireless headset connected thereto via a second near field communication, and the control message instructs the mobile terminal to perform on-off control on a voice interaction microphone disposed within the wireless headset via the second near field communication.
7. The apparatus of claim 1, wherein the sensing module is further to: sensing a second gesture input by a user in the touch area,
the control module is further configured to: establishing the first near field communication according to the second gesture.
8. The apparatus of claim 7, wherein the mobile terminal is connected to a wireless headset via a second near field communication, and the control module is specifically configured to: according to the second gesture, sending a request for establishing the first near field communication to the mobile terminal; establishing the first near field communication in response to a notification sent by the mobile terminal indicating that a second near field communication of the mobile terminal with the wireless headset is established.
9. The apparatus of claim 7, wherein the mobile terminal is connected to a wireless headset via a second near field communication, and the control module is specifically configured to: establishing the first near field communication in response to the second gesture, in the event that a notification has been received sent by the mobile terminal indicating that second near field communication of the mobile terminal with the wireless headset has been established.
10. The apparatus of claim 6, wherein the second near field communication has a higher bandwidth than the first near field communication.
11. The apparatus of claim 6, further comprising: and the first charging module wirelessly charges the wireless earphone through third near field communication.
12. The apparatus of claim 6, wherein the apparatus is a headset charging box comprising a second charging module having a charging port for charging the wireless headset.
13. The device of claim 1, wherein the touch area is disposed on a side of the headset charging box.
14. The device of claim 13, wherein the headset charging box is flat, wherein a charging port for charging the wireless headset is provided on a top surface of the headset charging box.
15. The device of claim 13, wherein the headset charging box is cylindrical, wherein a charging port for charging the wireless headset is provided at a top surface of the cylindrical shape or at a bottom surface of the cylindrical shape.
16. The apparatus of claim 15, wherein the wireless headset is a pair of two earphones, and two charging ports for respectively charging the two earphones are respectively disposed at the top surface of the cylinder and the bottom surface of the cylinder.
17. The apparatus of claim 1, further comprising: a third charging module having a charging port for charging the mobile terminal.
18. The apparatus of claim 1, wherein the first gesture is at least one of a slide touch, a click touch, a long press touch, and a short press touch in the touch area.
19. A mobile terminal, comprising:
a receiving module, configured to receive a control message sent by a control device through first near field communication according to a first gesture, where the first gesture is input by a user in a touch area of the control device;
and the control module is used for carrying out audio control and/or call control according to the control message.
20. The mobile terminal of claim 19, wherein the mobile terminal has a wireless headset connected thereto via a second near field communication, and wherein the control message instructs the mobile terminal to perform on-off control on a voice interaction microphone disposed within the wireless headset via the second near field communication.
21. An electronic device, the device comprising:
one or more processors;
a computer readable medium configured to store one or more programs that,
when executed by the one or more processors, cause the one or more processors to perform: sensing a first gesture input by a user in a touch area; according to the first gesture, instructing the mobile terminal to carry out audio control and/or call control through first near field communication, or,
receiving a control message sent by a control device through first near field communication according to a first gesture, wherein the first gesture is input by a user in a touch area of the control device;
and carrying out audio control and/or call control according to the control message.
22. A computer-readable medium, on which a computer program is stored which, when executed by a processor, implements:
sensing a first gesture input by a user in a touch area; according to the first gesture, the mobile terminal is instructed to carry out audio control and/or call control through first near field communication, or,
receiving a control message sent by a control device through first near field communication according to a first gesture, wherein the first gesture is input by a user in a touch area of the control device;
and carrying out audio control and/or call control according to the control message.
CN202010148666.7A 2020-03-05 2020-03-05 Control device, mobile terminal, electronic apparatus, and computer storage medium Pending CN113364914A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010148666.7A CN113364914A (en) 2020-03-05 2020-03-05 Control device, mobile terminal, electronic apparatus, and computer storage medium

Publications (1)

Publication Number Publication Date
CN113364914A true CN113364914A (en) 2021-09-07

Family

ID=77523783

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010148666.7A Pending CN113364914A (en) 2020-03-05 2020-03-05 Control device, mobile terminal, electronic apparatus, and computer storage medium

Country Status (1)

Country Link
CN (1) CN113364914A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114222216A (en) * 2021-11-15 2022-03-22 新线科技有限公司 Voice software working state indication method, earphone kit and earphone component

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015036458A1 (en) * 2013-09-10 2015-03-19 Audiowings Ltd Wireless headset
CN109744682A (en) * 2019-02-01 2019-05-14 深圳科甲技术有限公司 Multimedia with storage TWS earphone is adapted to machine
TW201924363A (en) * 2017-10-30 2019-06-16 日商福爾德股份有限公司 Sound reproduction device
CN209517460U (en) * 2019-03-29 2019-10-18 深圳市烁讯电子有限公司 A kind of earphone charging box

Similar Documents

Publication Publication Date Title
WO2019090726A1 (en) Method for selecting bluetooth device, terminal, and system
US8442435B2 (en) Method of remotely controlling an Ear-level device functional element
US10630826B2 (en) Information processing device
US20200245051A1 (en) Bluetooth headset control method, bluetooth headset, and computer readable storage medium
EP3011683A2 (en) Determining proximity for devices interacting with media devices
US20230138804A1 (en) Enhanced video call method and system, and electronic device
WO2022156662A1 (en) Method and apparatus for switching audio playing mode, and electronic device and storage medium
CN104038589A (en) Mobile phone with symmetrical user experience and IO (Input Output) device switching method thereof
CN104812020A (en) Method, device and system for controlling intelligent equipment to access network
CN115190197B (en) Bluetooth headset-based communication method and device and storage medium
CN105208089A (en) Information display method, apparatus and system
US20160192055A1 (en) Drive-by-wire earphone capable of being connected with input source for wireless communication
CN112806023B (en) Control method of wireless earphone and related product
CN105554811A (en) Method and device for conversation processing
CN113329389B (en) Service providing method, device, equipment and storage medium based on Bluetooth connection
CN113364914A (en) Control device, mobile terminal, electronic apparatus, and computer storage medium
CN103376975A (en) Control method and device and mobile communication terminal
CN109712380A (en) Find method and device, the storage medium, terminal device, remote control equipment of remote control equipment
CN113271385B (en) Call forwarding method
CN201860397U (en) Television with Bluetooth control module
CN107635272A (en) Control method of electronic device and device
CN110381418B (en) Loudspeaking device, relay device and mobile terminal
JP3197048U (en) Smart TV phone
US9084106B1 (en) Remote controller for mobile device
CN103856625B (en) Control device and the wireless communication terminal of doubleway output audio frequency

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210907