CN117331458A - Interactive control system of sound equipment power amplifier - Google Patents
Interactive control system of sound equipment power amplifier
- Publication number
- CN117331458A (Application CN202311187245.5A)
- Authority
- CN
- China
- Prior art keywords
- setting
- user
- function
- target
- interaction module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT]
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Control Of Amplification And Gain Control (AREA)
Abstract
The invention discloses an interactive control system for an audio power amplifier, comprising a processor, a first memory, a second memory, and a human-computer interaction module. The first memory stores a computer-readable program which, when executed by the processor, causes the processor to perform steps comprising: determining, from feedback of the human-computer interaction module, that a user has triggered a function button; when the function button is determined to be a mode setting function, generating a setting interface group in the human-computer interaction module; when the function button is a mode selection function, displaying a recommended mode identifier and a custom mode identifier; when the recommended mode identifier is determined to be selected, setting the parameters of the audio power amplifier according to a recommendation record table; and when the custom mode identifier is determined to be selected, setting the parameters of the audio power amplifier according to a custom record table. The invention is mainly used in the technical field of sound equipment.
Description
Technical Field
The invention relates to the technical field of sound equipment, in particular to an interactive control system of a sound equipment power amplifier.
Background
Some home karaoke (KTV) systems are essentially sound systems serving the home. To achieve high-quality sound output, existing sound systems require complex configuration, which demands considerable expertise; the home, however, is a non-professional setting. It is difficult for users to set each parameter of the audio power amplifier intuitively, and they cannot configure it simply and quickly. How to enable users to set the parameters of an audio power amplifier conveniently is therefore a technical problem to be solved in the industry.
Disclosure of Invention
The invention provides an interactive control system for an audio power amplifier, which solves one or more technical problems in the prior art and at least provides a beneficial alternative or creates favorable conditions.
The invention provides an interactive control system for an audio power amplifier, comprising: a processor, a first memory, and a second memory; the first memory is used for storing a computer-readable program; the computer-readable program, when executed by the processor, causes the processor to implement steps comprising:
step 1, determining, from feedback of the man-machine interaction module, that a user has triggered a function button;
step 2, when the function button is determined to be a mode setting function, a setting interface group is generated in a man-machine interaction module;
the setting interface group includes: setting a main interface and a plurality of setting sub-interfaces, wherein the main interface is provided with a first-stage interaction area and a second-stage interaction area;
the first-stage interaction area is used for placing an exit button and a plurality of function sub-buttons, each function sub-button is associated with a corresponding setting sub-interface, when the function sub-button is triggered, the setting sub-interface corresponding to the function sub-button is presented in the second interaction area, wherein the function sub-button is marked as a target function sub-button, the setting sub-interface is marked as a target setting sub-interface, a plurality of setting parameters are arranged on the target sub-interface, and the setting parameters are used for setting acoustic parameters for a user;
when the target function sub-button is switched, setting parameters on the target setting sub-interface are stored to form a first record list; when the exit button is triggered, integrating all the first record tables to form a custom record table, and storing the custom record table in a second memory;
step 3, when the function button is determined to be a mode selection function, displaying a recommended mode identifier and a custom mode identifier in a man-machine interaction module;
step 4, when the user selects the recommended mode identification, a corresponding recommended record table stored in advance is called from a second memory, and parameter setting is carried out on the sound power amplifier according to the recommended record table;
and 5, when the user selects the custom mode identification, a corresponding custom record table is called from the second memory, and parameter setting is carried out on the audio power amplifier according to the custom record table.
Further, in step 1, determining, according to feedback from the man-machine interaction module, that the user has triggered a function button specifically includes: monitoring the human-computer interaction module in real time to acquire the coordinate touched by the user as fed back by the module, this coordinate being denoted as the first target coordinate; the first target coordinate is checked for membership against a preset coordinate set representing the function button, and when the first target coordinate belongs to the coordinate set, the user is considered to have triggered the function button.
Further, in step 2, when the function button is determined to be a mode setting function, generating a setting interface group in the man-machine interaction module specifically includes: the coordinate sets are grouped according to the functional attributes of the function buttons into unit groups, each representing one functional attribute; the unit group whose functional attribute is the mode setting function is denoted as the first target unit group. The first target coordinate is checked against each unit group one by one; when the first target coordinate belongs to the first target unit group, the function button is determined to be the mode setting function, and a setting interface group is generated in the man-machine interaction module.
Further, in step 3, when the function button is determined to be a mode selection function, displaying the recommended mode identifier and the custom mode identifier in the man-machine interaction module specifically includes:
the coordinate set is divided into unit groups representing the functional attributes of the functional buttons according to the functional attributes of the functional buttons, wherein the functional attributes of one unit group are mode selection functions, the unit groups are marked as second target unit groups, the first target coordinates and all the unit groups are judged one by one, when the first target coordinates belong to the second target unit groups, the functional buttons are determined to be mode selection functions, and recommended mode identifications and custom mode identifications are displayed in the man-machine interaction module.
Further, the processor, the first memory and the second memory are all soldered on the same circuit board.
Further, the man-machine interaction module is a touch display screen.
Further, the first memory and the second memory are integrated in the same memory chip.
Further, the number of the recommendation record tables is two or more.
Further, in step 4, determining that the user selects the recommended mode identifier specifically includes: monitoring the recommended mode identifier displayed in the man-machine interaction module in real time and acquiring the coordinate touched by the user as fed back by the module, this coordinate being denoted as the second target coordinate; the second target coordinate is checked for membership against a preset coordinate set representing the triggering of the recommended mode identifier, and when the second target coordinate belongs to the coordinate set, the user is considered to have selected the recommended mode identifier.
Further, in step 5, determining that the user selects the custom mode identifier specifically includes: monitoring the custom mode identifier displayed in the man-machine interaction module in real time to acquire the coordinate touched by the user as fed back by the module, this coordinate being denoted as the third target coordinate; the third target coordinate is checked for membership against a preset coordinate set representing the triggering of the custom mode identifier, and when the third target coordinate belongs to the coordinate set, the user is considered to have selected the custom mode identifier.
The invention has at least the following beneficial effects: the invention realizes the rapid setting of the parameters of the sound power amplifier by the user through the man-machine interaction module, and simultaneously saves the parameters set by the user, so that the user can rapidly set the parameters of the sound power amplifier in a man-machine interaction mode when setting the parameters next time. Meanwhile, the invention also provides recommended parameters so as to facilitate a non-professional user to quickly set the parameters of the sound power amplifier. The invention is mainly used in the technical field of sound equipment.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification; they illustrate the invention and do not limit it.
FIG. 1 is a flowchart of steps implemented by a processor;
fig. 2 is a schematic diagram of a system configuration of an interactive control system for an audio amplifier.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
It should be noted that although functional block diagrams are depicted as block diagrams, and logical sequences are shown in the flowchart, in some cases, the steps shown or described may be performed in a different order than the block diagrams in the system. The terms first, second and the like in the description and in the claims and in the above-described figures, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
Referring to fig. 1, fig. 1 is a flowchart of the steps implemented by the processor, and fig. 2 is a schematic diagram of the system configuration of the interactive control system for an audio power amplifier. The main purpose of the system is to enable a user to set up the audio power amplifier simply and quickly.
To achieve this object, the present application provides an interactive control system for an audio power amplifier, including: the system comprises a processor, a first memory, a second memory and a human-computer interaction module; the first memory is used for storing a computer readable program; the computer readable program, when executed by the processor, causes the processor to implement steps comprising:
and step 1, confirming that a user triggers a function button according to feedback of the man-machine interaction module.
To facilitate interactive operation, the processor generates several function buttons on the man-machine interaction module, which the user can trigger by touching its sensing area. In some further embodiments, a function button is triggered as follows. The processor monitors the man-machine interaction module in real time; when the sensing area is touched, the output of the module generates a response signal, which the processor detects through its monitoring. On receiving the response signal, the processor considers the man-machine interaction module to have been triggered. The processor then communicates with the module to obtain the triggered coordinate of the sensing area; for convenience of description, this coordinate is denoted as the first target coordinate. To determine whether the user has triggered a function button, the first target coordinate must be checked for membership against the set of coordinates representing the function button. When the function buttons are laid out, the coordinates of the sensing area belonging to each function button can be obtained by planning the sensing area of the module, and these coordinates form a coordinate set, called the coordinate set representing the function button. Therefore, when the first target coordinate belongs to the coordinate set representing a function button, the user can be considered to have touched that button's sensing area.
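As an illustrative sketch only (not part of the patent), the membership check described above might look like the following; the button region and coordinate values are hypothetical:

```python
# Hypothetical sketch of the coordinate-membership check described above.
# The button layout is illustrative, not taken from the patent.

def build_button_region(x0, y0, x1, y1):
    """Return the set of integer sensing-area coordinates covering a button."""
    return {(x, y) for x in range(x0, x1) for y in range(y0, y1)}

# Coordinate set representing one function button (a 40x20 region).
FUNCTION_BUTTON_COORDS = build_button_region(10, 10, 50, 30)

def user_triggered_button(target_coord, button_coords):
    """Step 1: the user triggered the button iff the first target
    coordinate belongs to the button's coordinate set."""
    return target_coord in button_coords
```

For example, `user_triggered_button((12, 15), FUNCTION_BUTTON_COORDS)` yields `True`, while a touch outside the region yields `False`.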
After determining that the user has touched the function button, the processor may proceed to step 2.
And 2, when the function button is determined to be a mode setting function, generating a setting interface group in the man-machine interaction module.
The man-machine interaction module generates several function buttons, one of which represents the mode setting function. The function button representing the mode setting function is identified from the viewpoint of the coordinate sets. In some further embodiments, the processor groups the coordinate sets so that each function button corresponds to one group of coordinates, denoted for convenience as a unit group. The unit group containing the function button that represents the mode setting function is denoted as the first target unit group. To determine whether the first target coordinate belongs to the first target unit group, the processor checks the first target coordinate against every unit group one by one. When the first target coordinate is determined to belong to the first target unit group, the function button triggered by the user is considered to represent the mode setting function; viewed another way, the user is considered to want to set the mode of the audio power amplifier.
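A minimal sketch of this unit-group dispatch, under the assumption that each functional attribute owns one rectangular coordinate region (the attribute names and regions are hypothetical, not from the patent):

```python
# Illustrative sketch: coordinate sets grouped into "unit groups" keyed by
# functional attribute, then a one-by-one check of the touched coordinate.

UNIT_GROUPS = {
    "mode_setting": {(x, y) for x in range(0, 50) for y in range(0, 30)},
    "mode_selection": {(x, y) for x in range(60, 110) for y in range(0, 30)},
}

def classify_touch(target_coord):
    """Check the first target coordinate against every unit group one by
    one and return the matching functional attribute, if any."""
    for attribute, coords in UNIT_GROUPS.items():
        if target_coord in coords:
            return attribute
    return None  # the touch landed outside all function buttons
```

Here a touch inside the first region is classified as the mode setting function, matching the dispatch described in the text.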
After determining that the user needs to set the mode of the audio power amplifier, the processor generates a setting interface group, through which the user can conveniently set the parameters of the audio power amplifier.
The setting interface group includes: a setting main interface and several setting sub-interfaces, where the main interface is provided with a first-stage interaction area and a second-stage interaction area.
The setting main interface is the main venue for man-machine interaction and has two areas: the first is the first-stage interaction area and the second is the second-stage interaction area, the latter mainly serving the former. Several touch buttons are placed in the first-stage interaction area, including at least an exit button and several function sub-buttons. Each function sub-button is associated with a setting sub-interface, which is carried in, and presented through, the second-stage interaction area. The processor maps each function sub-button to its setting sub-interface through a setting association program. If and only if a function sub-button is triggered, the processor responds by finding the corresponding setting sub-interface according to the mapping and loading it in the second-stage interaction area, so that the area can present it. When a function sub-button is triggered, its setting sub-interface is presented in the second-stage interaction area; that function sub-button is denoted as the target function sub-button and that setting sub-interface as the target setting sub-interface, on which several setting parameters are arranged to allow the user to set sound parameters.
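The association mapping between sub-buttons and sub-interfaces could be sketched as a plain lookup table; the button and interface names below are hypothetical placeholders, not terms from the patent:

```python
# Illustrative sketch of the association mapping set up by the
# "setting association program"; all names are hypothetical.

SUB_INTERFACE_MAP = {
    "music_input": "first_setting_sub_interface",
    "microphone": "second_setting_sub_interface",
    "output_effect": "third_setting_sub_interface",
    "main_output": "fourth_setting_sub_interface",
    "subwoofer": "fifth_setting_sub_interface",
}

def load_sub_interface(sub_button, render):
    """On a sub-button trigger, find the mapped sub-interface and load it
    into the second-stage interaction area via the render callback."""
    interface = SUB_INTERFACE_MAP[sub_button]
    render(interface)  # stand-in for presenting the sub-interface
    return interface
```

A trigger of the hypothetical "microphone" sub-button would thus load the second setting sub-interface into the second-stage interaction area.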
the function sub-button mainly plays a role in setting parameters of the audio power amplifier. Thus, among the several functional sub-buttons, it includes at least: music input selection function, microphone setting function, output effect selection, main output function, and ultra-bass setting function.
When the user selects the function sub-button representing the music input selection function, the processor finds the corresponding setting sub-interface through the mapping relationship; for convenience of description it is denoted as the first setting sub-interface. It presents setting parameters related to the music channel, for example: selection of the music input channel, including USB input, Bluetooth input, wired network input, etc.; a music exciter switch and its degree, for tonal coloration; and equalization options including: equalization type, equalization gain, equalization frequency, equalization Q value, etc.
When the user selects the function button representing the microphone setting function, the processor finds the corresponding setting sub-interface through the mapping relationship; for convenience of description it is denoted as the second setting sub-interface. It presents setting parameters related to the microphone, for example: howling suppression degree, treble boost degree, high-pass frequency, and other parameters, with equalization options including: equalization type, equalization gain, equalization frequency, equalization Q value, etc.
When the user selects the function button representing the output effect selection, the processor finds the corresponding setting sub-interface through the mapping relationship; for convenience of description it is denoted as the third setting sub-interface. It presents setting parameters concerning the music output effect, for example an effect mode selection comprising echo A + echo B: the volume degree of echo A, the delay time of echo A, the repetition count of echo A, the volume degree of echo B, the delay time of echo B, and the repetition count of echo B. The equalization options for these parameters again include: equalization type, equalization gain, equalization frequency, equalization Q value, etc.
When the user selects the function button representing the main output function, the processor finds the corresponding setting sub-interface through the mapping relationship; for convenience of description it is denoted as the fourth setting sub-interface. It presents the setting parameters of the main output, for example: output mode selection, music volume, microphone volume, echo A volume, echo B volume, high-pass frequency, low-pass frequency, left-channel delay, right-channel delay, left-channel mute, right-channel mute, and power limit. The equalization options for these parameters again include: equalization type, equalization gain, equalization frequency, equalization Q value, etc.
When the user selects the function button representing the subwoofer setting function, the processor finds the corresponding setting sub-interface through the mapping relationship; for convenience of description it is denoted as the fifth setting sub-interface. It presents the setting parameters of the subwoofer, for example: mode setting, music volume, microphone volume, echo A volume, echo B volume, high-pass frequency, low-pass frequency, channel delay, and channel mute. The equalization options for these parameters again include: equalization type, equalization gain, equalization frequency, equalization Q value, etc.
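The five sub-interfaces above amount to one parameter schema per sub-interface plus a shared equalization block. A hypothetical sketch of such a schema (every key, default, and unit is illustrative, not specified by the patent):

```python
# Hypothetical parameter schema for the five setting sub-interfaces;
# names and default values are illustrative only.

EQ_PARAMS = {"eq_type": "peaking", "eq_gain_db": 0.0,
             "eq_freq_hz": 1000, "eq_q": 1.0}

SUB_INTERFACE_PARAMS = {
    "music_input": {"input_channel": "bluetooth", **EQ_PARAMS},
    "microphone": {"howling_suppression": 3, "high_pass_hz": 80, **EQ_PARAMS},
    "output_effect": {"echo_a_volume": 5, "echo_b_volume": 4, **EQ_PARAMS},
    "main_output": {"music_volume": 20, "mic_volume": 15, **EQ_PARAMS},
    "subwoofer": {"low_pass_hz": 120, "channel_mute": False, **EQ_PARAMS},
}
```

Structuring the equalization options as one shared block mirrors the text, where every sub-interface repeats the same equalization type/gain/frequency/Q selections.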
The user sets parameters through the setting sub-interface presented in the second-stage interaction area. After finishing the settings on the sub-interface of one function sub-button, the user switches away by touching another function sub-button. For convenience of description, the function sub-button the user is currently setting is denoted as the target function sub-button, and its corresponding setting sub-interface as the target setting sub-interface. When the target function sub-button is switched, the setting parameters on the target setting sub-interface are saved to form a first record table.
As the user sets the parameters corresponding to each function sub-button, several first record tables are generated and temporarily stored. When the user has finished, he touches the exit button in the first-stage interaction area. When the processor determines that the exit button has been triggered, it considers the user to have completed parameter setting. The processor then integrates all the temporarily stored first record tables in a preset manner to form a custom record table, which completely records the user's custom settings for the parameters of the audio power amplifier. The processor stores the resulting custom record table in the second memory.
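The save-on-switch and integrate-on-exit behavior can be sketched as follows; the table layout and key names are hypothetical assumptions, since the patent does not specify the integration format:

```python
# Illustrative sketch: first record tables saved per sub-interface, then
# integrated into one custom record table when the exit button fires.

first_record_tables = []  # temporary store, one table per target sub-interface

def save_first_record_table(sub_interface, params):
    """Called when the target function sub-button is switched."""
    first_record_tables.append({"sub_interface": sub_interface,
                                "params": dict(params)})

def integrate_on_exit():
    """Called when the exit button is triggered: merge all first record
    tables into one custom record table destined for the second memory."""
    custom_record_table = {t["sub_interface"]: t["params"]
                           for t in first_record_tables}
    first_record_tables.clear()  # the temporary store is emptied
    return custom_record_table
```

On exit, the resulting dictionary plays the role of the custom record table that the processor would write to the second memory.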
Step 2 mainly presents the user's parameter-setting process for the audio power amplifier. When the user does not need to set the audio power amplifier afresh, only the previous settings or the recommended settings need to be invoked. In that case, the user can touch a function button; when the processor determines that the function of the touched button is the mode selection function, it displays the recommended mode identifier and the custom mode identifier in the human-computer interaction module.
And step 3, when the function button is determined to be a mode selection function, displaying a recommended mode identifier and a custom mode identifier in the man-machine interaction module.
And 4, when the user selects the recommended mode identification, a corresponding recommended record table stored in advance is called from the second memory, and parameter setting is carried out on the sound power amplifier according to the recommended record table.
When the user selects the recommended mode identifier, the user is deemed to wish to select a factory-preset mode. The processor then retrieves from the second memory the pre-stored recommendation record table corresponding to the recommended mode identifier and sets the parameters of the audio power amplifier according to it, enabling the user to set up the sound system quickly.
And 5, when the user selects the custom mode identification, a corresponding custom record table is called from the second memory, and parameter setting is carried out on the audio power amplifier according to the custom record table.
When the user selects the custom mode identifier, the user is deemed to wish to select a mode he has previously set. The processor then retrieves from the second memory the custom record table corresponding to the custom mode identifier and sets the parameters of the audio power amplifier according to it, again enabling quick setup.
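Steps 4 and 5 share one retrieve-and-apply pattern. A hedged sketch, with a hypothetical second-memory layout and a plain dictionary standing in for the amplifier's parameter registers:

```python
# Illustrative sketch of steps 4 and 5: look up a record table in the
# second memory and apply it to the amplifier. Layout is hypothetical.

SECOND_MEMORY = {
    "recommended": {"ktv": {"mic_volume": 18}, "movie": {"mic_volume": 8}},
    "custom": {"my_setup": {"mic_volume": 12}},
}

def apply_mode(kind, identifier, amplifier):
    """Retrieve the record table for the selected mode identifier and push
    each parameter to the amplifier (here, a dict stand-in)."""
    record_table = SECOND_MEMORY[kind][identifier]
    amplifier.update(record_table)
    return amplifier
```

Selecting a recommended identifier and a custom identifier then differ only in which branch of the second memory is consulted.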
Through the human-computer interaction module, the invention enables the user to set the parameters of the audio power amplifier quickly, and it also saves the parameters the user has set, so that the user can restore them just as quickly the next time parameters are configured. The invention additionally provides recommended parameters so that a non-professional user can also set the parameters of the audio power amplifier with ease.
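The saving of user-set parameters described above (each setting sub-interface contributing a first record table, which the exit button integrates into the stored custom record table) can be sketched as follows; the handler names and parameter keys are assumptions for illustration, not taken from the patent.

```python
# Illustrative sketch of forming the custom record table: each setting
# sub-interface contributes a "first record table" when the user switches
# away from it, and triggering the exit button integrates them all into
# one custom record table. Names and parameters are illustrative only.

first_record_tables = []

def on_target_sub_button_switched(sub_interface_params):
    """Store the current target setting sub-interface's parameters."""
    first_record_tables.append(dict(sub_interface_params))

def on_exit_button_triggered():
    """Integrate all first record tables into the custom record table."""
    custom_table = {}
    for table in first_record_tables:
        custom_table.update(table)
    return custom_table

on_target_sub_button_switched({"volume": 55})
on_target_sub_button_switched({"bass": 8, "treble": 2})
custom_record_table = on_exit_button_triggered()
```

In this sketch the custom record table is simply the union of the per-sub-interface tables, which is then what would be written to the second memory.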
In some further embodiments, the processor, the first memory, and the second memory are all soldered to the same circuit board. Mounting the processor and both memories on a single circuit board improves the integration of the control system and facilitates both initial production and later maintenance.
In some further embodiments, the human-machine interaction module is a touch display screen.
In some further embodiments, the first memory and the second memory are integrated in the same memory chip.
As for the recommendation record table, in order to give the user multiple choices, in some further specific embodiments there are two or more recommendation record tables. Correspondingly, two or more recommended mode identifiers are provided, and the user selects one of them to invoke the corresponding recommendation record table.
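The mapping from two or more recommended mode identifiers to their recommendation record tables can be sketched as follows; the identifier names and parameter values are invented for illustration.

```python
# Illustrative sketch: two or more recommendation record tables, each keyed
# by its own recommended mode identifier. Identifiers and values are invented.

recommendation_tables = {
    "cinema": {"volume": 60, "bass": 9},
    "music": {"volume": 45, "bass": 5},
    "speech": {"volume": 50, "bass": 2},
}

def select_recommendation(identifier):
    """Selecting an identifier retrieves the corresponding record table."""
    return recommendation_tables[identifier]
```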
As to how the user's selection of the recommended mode identifier is determined, in some further specific embodiments the determination specifically comprises: the processor monitors the recommended mode identifier displayed in the man-machine interaction module in real time and acquires the coordinate touched by the user as fed back by the man-machine interaction module, the coordinate being recorded as the second target coordinate; a membership judgment is made between the second target coordinate and a preset coordinate set representing triggering of the recommended mode identifier; and when the second target coordinate belongs to that coordinate set, the user is considered to have selected the recommended mode identifier.
As to how the user's selection of the custom mode identifier is determined, in some further embodiments the determination specifically comprises: the processor monitors the custom mode identifier displayed in the man-machine interaction module in real time and acquires the coordinate touched by the user as fed back by the man-machine interaction module, the coordinate being recorded as the third target coordinate; a membership judgment is made between the third target coordinate and a preset coordinate set representing triggering of the custom mode identifier; and when the third target coordinate belongs to that coordinate set, the user is considered to have selected the custom mode identifier.
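The membership judgment used for the target coordinates throughout this description can be sketched as follows. The patent does not fix how the preset coordinate set is represented; in this sketch it is modeled as the set of integer points inside an axis-aligned rectangle, and all names are illustrative.

```python
# Illustrative sketch of the membership judgment: the touched ("target")
# coordinate is tested against a preset coordinate set representing an
# on-screen control. The set's representation is an assumption; here it
# is the set of integer points inside an axis-aligned rectangle.

def make_coordinate_set(x0, y0, x1, y1):
    """Preset coordinate set: every integer point in the rectangle."""
    return {(x, y) for x in range(x0, x1) for y in range(y0, y1)}

def user_selected(target_coordinate, coordinate_set):
    """The user is considered to have selected the control when the
    target coordinate belongs to the coordinate set."""
    return target_coordinate in coordinate_set

custom_mode_area = make_coordinate_set(100, 200, 180, 240)
```

In practice a rectangle test could of course be done with two comparisons per axis; the explicit set is used here only to mirror the "belongs to the coordinate set" wording of the patent.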
The terms "first," "second," "third," "fourth," and the like in the description of the present application and in the above-described figures, if any, are used to distinguish between similar objects and are not necessarily intended to describe a particular sequence or chronological order. It is to be understood that data so termed may be interchanged where appropriate, so that the embodiments of the present application described herein can, for example, be implemented in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in this application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes the association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: only A exists, only B exists, or both A and B exist, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship. "At least one of" the following items or the like means any combination of those items, including any combination of a single item or plural items. For example, at least one of a, b, or c may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c may each be single or plural.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices, or units, which may be in electrical, mechanical, or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of that solution, may be embodied in the form of a software product stored in a storage medium, which includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
Although the present application has been described in considerable detail with reference to several illustrated embodiments, it is not intended to be limited to any such detail or embodiment or to any particular embodiment; rather, the appended claims should be given the broadest interpretation, in view of the prior art, that effectively encompasses the intended scope of this application. Furthermore, the foregoing describes the embodiments presently contemplated by the inventors, and insubstantial changes to the invention not presently contemplated may nonetheless represent equivalents thereof.
Claims (10)
1. An interactive control system for an audio power amplifier, comprising: the system comprises a processor, a first memory, a second memory and a human-computer interaction module; the first memory is used for storing a computer readable program; the computer readable program, when executed by the processor, causes the processor to implement steps comprising:
step 1, confirming, according to feedback from the man-machine interaction module, that a user has triggered a function button;
step 2, when the function button is determined to be a mode setting function, a setting interface group is generated in a man-machine interaction module;
the setting interface group includes: setting a main interface and a plurality of setting sub-interfaces, wherein the main interface is provided with a first-stage interaction area and a second-stage interaction area;
the first-stage interaction area is used for placing an exit button and a plurality of function sub-buttons, each function sub-button being associated with a corresponding setting sub-interface; when a function sub-button is triggered, the setting sub-interface corresponding to that function sub-button is presented in the second-stage interaction area, wherein the triggered function sub-button is recorded as a target function sub-button and its setting sub-interface as a target setting sub-interface; a plurality of setting parameters are arranged on the target setting sub-interface for the user to set sound parameters;
when the target function sub-button is switched, the setting parameters on the target setting sub-interface are stored to form a first record table; when the exit button is triggered, all the first record tables are integrated to form a custom record table, and the custom record table is stored in the second memory;
step 3, when the function button is determined to be a mode selection function, displaying a recommended mode identifier and a custom mode identifier in a man-machine interaction module;
step 4, when the user selects the recommended mode identifier, the corresponding recommendation record table stored in advance is retrieved from the second memory, and the parameters of the audio power amplifier are set according to the recommendation record table;
and step 5, when the user selects the custom mode identifier, the corresponding custom record table is retrieved from the second memory, and the parameters of the audio power amplifier are set according to the custom record table.
2. The interactive control system of an audio power amplifier according to claim 1, wherein in step 1, confirming, according to feedback from the man-machine interaction module, that the user has triggered the function button specifically comprises: monitoring the man-machine interaction module in real time to acquire the coordinate touched by the user as fed back by the man-machine interaction module, the coordinate being recorded as a first target coordinate; making a membership judgment between the first target coordinate and a preset coordinate set representing the function button; and, when the first target coordinate belongs to the coordinate set, considering that the user has triggered the function button.
3. The interactive control system of an audio power amplifier according to claim 2, wherein in step 2, when the function button is determined to be the mode setting function, generating a setting interface group in the man-machine interaction module specifically comprises: grouping the coordinate sets according to the functional attributes of the function buttons into unit groups, each unit group representing one functional attribute, wherein the unit group whose functional attribute is the mode setting function is recorded as a first target unit group; making a membership judgment between the first target coordinate and each unit group one by one; and, when the first target coordinate belongs to the first target unit group, determining that the function button is the mode setting function and generating the setting interface group in the man-machine interaction module.
4. The interactive control system of an audio power amplifier according to claim 2, wherein in step 3, when the function button is determined to be a mode selection function, displaying the recommended mode identifier and the custom mode identifier in the man-machine interaction module specifically includes:
the coordinate sets are grouped according to the functional attributes of the function buttons into unit groups, each unit group representing one functional attribute, wherein the unit group whose functional attribute is the mode selection function is recorded as a second target unit group; a membership judgment is made between the first target coordinate and each unit group one by one; and, when the first target coordinate belongs to the second target unit group, the function button is determined to be the mode selection function, and the recommended mode identifier and the custom mode identifier are displayed in the man-machine interaction module.
5. The interactive control system of claim 1, wherein the processor, the first memory and the second memory are soldered on the same circuit board.
6. The interactive control system of an audio power amplifier according to claim 1, wherein the man-machine interaction module is a touch display screen.
7. The interactive control system of claim 1, wherein the first memory and the second memory are integrated in the same memory chip.
8. The interactive control system of claim 1, wherein there are two or more recommendation record tables.
9. The interactive control system of an audio power amplifier according to claim 1, wherein in step 4, determining that the user selects the recommended mode identifier specifically comprises: monitoring the recommended mode identifier displayed in the man-machine interaction module in real time to acquire the coordinate touched by the user as fed back by the man-machine interaction module, the coordinate being recorded as a second target coordinate; making a membership judgment between the second target coordinate and a preset coordinate set representing triggering of the recommended mode identifier; and, when the second target coordinate belongs to the coordinate set, considering that the user has selected the recommended mode identifier.
10. The interactive control system of an audio power amplifier according to claim 1, wherein in step 5, determining that the user selects the custom mode identifier specifically comprises: monitoring the custom mode identifier displayed in the man-machine interaction module in real time to acquire the coordinate touched by the user as fed back by the man-machine interaction module, the coordinate being recorded as a third target coordinate; making a membership judgment between the third target coordinate and a preset coordinate set representing triggering of the custom mode identifier; and, when the third target coordinate belongs to the coordinate set, considering that the user has selected the custom mode identifier.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311187245.5A CN117331458A (en) | 2023-09-14 | 2023-09-14 | Interactive control system of sound equipment power amplifier |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311187245.5A CN117331458A (en) | 2023-09-14 | 2023-09-14 | Interactive control system of sound equipment power amplifier |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117331458A true CN117331458A (en) | 2024-01-02 |
Family
ID=89289430
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311187245.5A Pending CN117331458A (en) | 2023-09-14 | 2023-09-14 | Interactive control system of sound equipment power amplifier |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117331458A (en) |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2939151B1 (en) | Method and apparatus for generating audio information | |
US7505826B2 (en) | Configuration method of digital audio mixer | |
US7672464B2 (en) | Locating and correcting undesirable effects in signals that represent time-based media | |
US20050220309A1 (en) | Sound reproduction apparatus, sound reproduction system, sound reproduction method and control program, and information recording medium for recording the program | |
KR20040022442A (en) | Speakker equalization tool | |
US6343130B2 (en) | Stereophonic sound processing system | |
JP2012194525A (en) | Sound generation control apparatus, identification apparatus, sound generation control system, program and sound generation control method | |
US8170240B2 (en) | Audio device | |
US20190250807A1 (en) | Information Display Apparatus and Information Display Method | |
CN109716794A (en) | Information processing unit, information processing method and program | |
CN117331458A (en) | Interactive control system of sound equipment power amplifier | |
US20040184617A1 (en) | Information apparatus, system for controlling acoustic equipment and method of controlling acoustic equipment | |
CN111176644B (en) | Automatic layout method and device for operation interface and response method and device thereof | |
CN103444161B (en) | Telephonist's management devices and telephonist's management process | |
KR0135451B1 (en) | Method of setting audio in a digital music accompaniment | |
CN105828172B (en) | Control method for playing back and device in audio-video frequency playing system | |
EP3537728B1 (en) | Connection state determination system for speakers, acoustic device, and connection state determination method for speakers | |
JPH1069279A (en) | Sound field supporting system and sound field setting method | |
CN112202913B (en) | Intelligent sound box cloud management system | |
JP7408955B2 (en) | Sound signal processing method, sound signal processing device and program | |
WO2022209227A1 (en) | Information processing terminal, information processing method, and program | |
US20220139405A1 (en) | Audio signal processing apparatus, method of controlling audio signal processing apparatus, and program | |
TWI621067B (en) | Method for recording playback setting of voice and electronic device performing the same | |
CN106792360B (en) | It is a kind of to play stereosonic method, apparatus and terminal | |
GB2551605A (en) | Audio signal processor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||