CN112083843B - Control method and device of application icons - Google Patents

Info

Publication number
CN112083843B
CN112083843B (application CN202010911380.XA)
Authority
CN
China
Prior art keywords: touch, target, voice information, application icon, user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010911380.XA
Other languages
Chinese (zh)
Other versions
CN112083843A (en)
Inventor
尹鑫
马颖江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN202010911380.XA
Publication of CN112083843A
Application granted
Publication of CN112083843B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817: using icons
    • G06F3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L: SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00: Speech recognition
    • G10L15/02: Feature extraction for speech recognition; selection of recognition unit
    • G10L15/22: Procedures used during a speech recognition process, e.g. man-machine dialogue

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a control method and device for an application icon. The method is applied to a graphical user interface rendered in a touch display of a mobile terminal, and comprises the following steps: acquiring first voice information, wherein the first voice information is used for triggering a target application icon to move in the graphical user interface; acquiring a target touch position of a touch medium, wherein the target touch position is a position to be touched by the touch medium within a touch range of the graphical user interface; and controlling the target application icon to move from an initial display position to a target display position based on the first voice information and the target touch position. The invention solves the technical problem in the prior art that, as smartphone screens grow ever larger, operating the phone with one hand becomes inconvenient.

Description

Control method and device of application icons
Technical Field
The invention relates to the technical field of control, in particular to a control method and device of an application icon.
Background
With the development of intelligent terminals, mobile phone screens have grown from the small screens of earlier mobile phones to the large screens of today's smartphones. Users' operating habits have changed along with screen size, but the larger the screen, the less convenient it is to operate the phone with one hand.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiment of the invention provides a control method and device for an application icon, which at least solves the technical problem in the prior art that, as smartphone screens grow ever larger, it is inconvenient for a user to operate the smartphone with one hand.
According to an aspect of the embodiments of the present invention, there is provided a method for controlling an application icon, where the method is applied in a graphical user interface rendered in a touch display of a mobile terminal, and the method includes: acquiring first voice information, wherein the first voice information is used for triggering a target application icon to move in the graphical user interface; acquiring a target touch position of a touch medium, wherein the target touch position is a position to be touched by the touch medium within a touch range of the graphical user interface; and controlling the target application icon to move from an initial display position to a target display position based on the first voice information and the target touch position.
Optionally, the method further includes: adding a voice control interface to the function interfaces supported by the mobile terminal, and adding a voice control switch to the function setting menu of the mobile terminal, wherein the voice control interface is used for extending a voice control function among the function interfaces supported by the mobile terminal, the voice control function is used for controlling at least one application icon to move in the graphical user interface through voice information, and the voice control switch is used for determining whether to enable the voice control function.
Optionally, the obtaining the target touch position of the touch medium includes: detecting a hover position of the touch medium over the graphical user interface; acquiring the projection position of the suspension position in the graphical user interface; and determining the projection position as the target touch position.
Optionally, the obtaining the target touch position of the touch medium includes: acquiring a historical operation record of the touch medium; and determining the target touch position based on the historical operation record.
Optionally, determining the target touch position based on the historical operation record includes one of: determining the latest touch position as the target touch position based on the historical operation record; determining the touch position with the highest touch frequency within a first preset time length as the target touch position based on the historical operation record; and analyzing a plurality of different touch positions respectively contacted within a second preset time length based on the historical operation records, and determining the target touch position.
Optionally, the controlling the target application icon to move from the initial display position to the target display position based on the first voice information and the target touch position includes: determining whether the initial display position is within the touch range; when the initial display position is outside the touch control range, comparing the first voice information with second voice information to obtain a comparison result, wherein the second voice information is preset voice information; when the comparison result meets a preset condition, determining the target display position based on the target touch position; and controlling the target application icon to move from the initial display position to the target display position.
Optionally, the method further includes: respectively acquiring a first voiceprint image and a second voiceprint image in an initialization state, wherein the first voiceprint image and the second voiceprint image are voiceprint images obtained by continuously inputting voice information twice; performing feature extraction on the first voiceprint image to obtain a first feature information point set, and performing feature extraction on the second voiceprint image to obtain a second feature information point set; matching the first characteristic information point set with the second characteristic information point set to obtain a matching result; and when the matching result meets a preset condition, determining the preset voice information.
According to another aspect of the embodiments of the present invention, there is also provided a control apparatus for an application icon, the apparatus being applied in a graphical user interface rendered in a touch display of a mobile terminal, the apparatus including: an acquisition module, configured to collect first voice information, where the first voice information is used to trigger a target application icon to move in the graphical user interface; an obtaining module, configured to obtain a target touch position of a touch medium, where the target touch position is a position to be touched by the touch medium within a touch range of the graphical user interface; and a control module, configured to control the target application icon to move from an initial display position to a target display position based on the first voice information and the target touch position.
According to another aspect of the embodiments of the present invention, there is also provided a non-volatile storage medium, in which a computer program is stored, wherein the computer program is configured to execute the control method of the application icon in any one of the above when running.
According to another aspect of the embodiments of the present invention, there is also provided a processor, configured to execute a program, where the program is configured to execute the control method for an application icon described in any one of the above when the program is executed.
According to another aspect of the embodiments of the present invention, there is also provided an electronic apparatus, including a memory and a processor, where the memory stores a computer program, and the processor is configured to execute the computer program to execute the control method of the application icon described in any one of the above.
In the embodiment of the invention, a combined voice-and-finger control mode is adopted: first voice information is acquired, wherein the first voice information is used for triggering a target application icon to move in the graphical user interface; a target touch position of a touch medium is acquired, wherein the target touch position is a position to be touched by the touch medium within a touch range of the graphical user interface; and the target application icon is controlled to move from the initial display position to the target display position based on the first voice information and the target touch position. This achieves the purpose of operating the mobile phone with one hand with the aid of voice information, improves the convenience of one-handed operation for the user, and solves the technical problem in the prior art that, as smartphone screens grow ever larger, one-handed operation of the smartphone is inconvenient.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a flowchart illustrating steps of a method for controlling an application icon according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an interface for an alternative configuration of a voice control switch, according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating steps of a method for controlling selectable application icons according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a control device for application icons according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
In accordance with an embodiment of the present invention, there is provided an embodiment of a control method for an application icon, it should be noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system such as a set of computer executable instructions, and that while a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than here.
The method is applied to a graphical user interface rendered in a touch display of a mobile terminal, fig. 1 is a flowchart of steps of a control method for an application icon according to an embodiment of the present invention, and as shown in fig. 1, the method includes the following steps:
step S102, collecting first voice information, wherein the first voice information is used for triggering a target application icon to move in the graphical user interface;
step S104, acquiring a target touch position of a touch medium, wherein the target touch position is a position to be touched by the touch medium in a touch range of the graphical user interface;
and step S106, controlling the target application icon to move from an initial display position to a target display position based on the first voice message and the target touch position.
In the embodiment of the invention, a combined voice-and-finger control mode is adopted: first voice information is acquired, wherein the first voice information is used for triggering a target application icon to move in the graphical user interface; a target touch position of a touch medium is acquired, wherein the target touch position is a position to be touched by the touch medium within a touch range of the graphical user interface; and the target application icon is controlled to move from the initial display position to the target display position based on the first voice information and the target touch position. This achieves the purpose of operating the mobile phone with one hand with the aid of voice information, improves the convenience of one-handed operation for the user, and solves the technical problem in the prior art that, as smartphone screens grow ever larger, one-handed operation of the smartphone is inconvenient.
Optionally, the first voice information is information for triggering the target application icon to move in the graphical user interface; for example, the user saying "WeChat + Move" may be used to trigger the WeChat icon to move in the graphical user interface.
Optionally, the touch medium is a finger of a user, and the target touch position is a position to be touched by the finger within a touch range of the graphical user interface.
As an optional embodiment, take the case where the target application icon is the WeChat icon. When a user operates the WeChat App icon with a finger and the initial display position of the WeChat icon is not within the operable range of the user's finger, the user may issue first voice information "WeChat + Move" for moving the WeChat icon. The operating system responds to the first voice information to trigger the WeChat icon to move within the graphical user interface, and further identifies the target touch position of the finger so as to move the WeChat icon from the initial display position to the target display position.
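The flow just described can be sketched in a few lines. This is an illustrative Python sketch only; the function names, the "app name + Move" command format, and the coordinate values are assumptions for illustration, not the patent's actual implementation.

```python
def parse_move_command(voice_text):
    """Parse first voice information of the assumed form '<app name> + Move'.

    Returns the target app name, or None if the utterance is not a
    move trigger."""
    parts = [p.strip() for p in voice_text.split("+")]
    if len(parts) == 2 and parts[1].lower() == "move":
        return parts[0]
    return None


def move_icon(icon_positions, app_name, target_position):
    """Move the named icon from its initial display position to the
    target display position (here taken to be the target touch position)."""
    if app_name in icon_positions:
        icon_positions[app_name] = target_position
    return icon_positions


# Example: the user says "WeChat + Move" while the finger's target touch
# position is near the bottom-right of the screen.
icons = {"WeChat": (40, 60)}            # assumed initial display position
app = parse_move_command("WeChat + Move")
if app is not None:
    icons = move_icon(icons, app, (900, 1700))
```

With this sketch, an utterance that does not match the "+ Move" pattern leaves all icons where they are.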
In an optional embodiment, the method further includes:
step S202, adding a voice control interface in the function interfaces supported by the mobile terminal, and adding a voice control switch in the function setting menu of the mobile terminal, wherein the voice control interface is used for expanding a voice control function in the function interfaces supported by the mobile terminal, the voice control function is used for controlling at least one application icon to move in the graphic user interface through voice information, and the voice control switch is used for determining whether to start the voice control function.
In the above optional embodiment, a voice control interface is newly added to the function interfaces supported by the mobile terminal, and a voice control function is extended from those function interfaces through the interface; a voice control switch, as shown in fig. 2, is newly added to the function setting menu of the mobile terminal, and whether the voice control function is enabled is determined through the switch. With the voice control function, at least one application icon can be controlled to move in the graphical user interface through voice information.
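A minimal sketch of how such a switch might gate the feature, assuming the class and function names below (they are illustrative, not part of the patent):

```python
class VoiceControlSettings:
    """Holds the state of the voice control switch in the setting menu."""

    def __init__(self):
        self.enabled = False    # switch is off until the user enables it

    def toggle(self):
        self.enabled = not self.enabled


def handle_voice(settings, voice_text, handler):
    """Forward voice information to the move handler only when the
    voice control function is switched on; otherwise ignore it."""
    if settings.enabled:
        return handler(voice_text)
    return None
```

The point of the gate is that no voice information is processed at all while the switch is off.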
In an optional embodiment, the obtaining the target touch position of the touch medium includes:
step S302, detecting the floating position of the touch medium above the graphical user interface;
step S304, acquiring the projection position of the suspension position in the graphical user interface;
step S306, determining the projection position as the target touch position.
In the above optional embodiment, the target touch position may be determined by projection: the hovering position of the user's finger above the graphical user interface is detected, for example within 1 cm of the screen; under an external light source, the hovering position casts a projection position onto the graphical user interface, and the operating system may determine that projection position as the target touch position.
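The projection idea reduces to mapping a sensed 3D hover point straight down onto the display plane. The sketch below is an assumption about the data layout (screen-plane coordinates plus height); the 1 cm threshold follows the example in the text.

```python
HOVER_THRESHOLD_CM = 1.0    # from the "within 1 cm" example above


def project_hover_position(hover):
    """hover is (x, y, height_cm): the finger's screen-plane coordinates
    plus its height above the display. Returns the projected (x, y)
    target touch position, or None when the finger is too far from the
    screen to count as hovering."""
    x, y, height_cm = hover
    if height_cm <= HOVER_THRESHOLD_CM:
        return (x, y)       # orthogonal projection onto the screen plane
    return None
```

A hover sensed beyond the threshold simply yields no target touch position, so the icon is not moved.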
In an optional embodiment, the obtaining the target touch position of the touch medium includes:
step S402, obtaining the historical operation record of the touch medium;
step S404, determining the target touch position based on the historical operation record.
In the above optional embodiment, when the finger of the user touches the graphical user interface, the operating system may automatically record a history operation record (e.g., a coordinate position record) of the finger touching the graphical user interface, and may determine the target touch position according to the history operation record of the previous operation of the user, and directly move the target application icon from the initial display position to the target touch position.
In an optional embodiment, determining the target touch position based on the historical operation record includes one of:
step S502, determining the latest touch position as the target touch position based on the historical operation record;
step S504, determining a touch position with the highest touch frequency within a first preset duration as the target touch position based on the historical operation record;
step S506, based on the historical operation records, analyzing a plurality of different touch positions respectively contacted within a second preset time period, and determining the target touch position.
In the embodiment of the present application, the latest touch position in the history operation record may be determined as the target touch position; or determining the touch position with the highest touch frequency in the first preset time length in the historical operation record as the target touch position; and then, analyzing a plurality of different touch positions respectively contacted within a second preset time period in the historical operation record to determine the target touch position, for example, using a position located within the touch range among the plurality of different touch positions as the target touch position.
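The three history-based strategies above can be sketched as follows, assuming a history record is a list of (timestamp, position) touch events with the newest last; the function names and data layout are illustrative assumptions.

```python
from collections import Counter


def latest_position(history):
    """Strategy 1: the most recent touch position in the record."""
    return history[-1][1]


def most_frequent_position(history, now, window):
    """Strategy 2: the position touched most often within the last
    `window` seconds (the first preset duration)."""
    recent = [pos for t, pos in history if now - t <= window]
    return Counter(recent).most_common(1)[0][0]


def first_in_range(history, now, window, in_touch_range):
    """Strategy 3: analyze the distinct positions touched within the
    second preset duration and keep one that lies inside the one-hand
    touch range."""
    recent = {pos for t, pos in history if now - t <= window}
    for pos in recent:
        if in_touch_range(pos):
            return pos
    return None
```

Which strategy is used is a design choice; strategy 3 is the only one that guarantees the chosen position is actually reachable with one hand.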
In an alternative embodiment, as shown in fig. 3, the controlling the target application icon to move from the initial display position to the target display position based on the first voice message and the target touch position includes:
step S602, determining whether the initial display position is within the touch range;
step S604, when the initial display position is outside the touch control range, comparing the first voice information with the second voice information to obtain a comparison result, wherein the second voice information is preset voice information;
step S606, when the comparison result meets the preset condition, determining a target display position based on the target touch position;
in step S608, the target application icon is controlled to move from the initial display position to the target display position.
In the above optional embodiment, it is first determined whether the initial display position is within the touch range. If it is, the first voice information and the target touch position need not be processed; that is, the target application icon does not need to be moved from the initial display position to the target display position. If the initial display position is outside the touch range, the first voice information is compared with the second voice information to obtain a comparison result; when the comparison result meets a preset condition, the target display position is determined based on the target touch position, and the target application icon is controlled to move from the initial display position to the target display position.
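Steps S602 to S608 condense to a single decision function. The sketch below assumes the caller supplies a touch-range test and a voice-similarity function; the similarity threshold is an assumed stand-in for the patent's "preset condition".

```python
def control_icon(initial_pos, target_touch_pos, first_voice, preset_voice,
                 in_touch_range, similarity, threshold=0.8):
    """Return the icon's new display position after applying S602-S608."""
    # S602: the icon is already reachable with one hand, nothing to do
    if in_touch_range(initial_pos):
        return initial_pos
    # S604/S606: compare first voice information with the preset (second)
    # voice information; move only when the comparison result qualifies
    if similarity(first_voice, preset_voice) >= threshold:
        # S608: target display position derived from the target touch position
        return target_touch_pos
    return initial_pos
```

Note that a failed voice comparison leaves the icon untouched even when it is out of reach, which is what prevents unintended moves.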
In an optional embodiment, the method further includes:
step S702, respectively acquiring a first voiceprint image and a second voiceprint image in an initialization state, wherein the first voiceprint image and the second voiceprint image are voiceprint images obtained by continuously inputting voice information twice;
step S704, extracting the characteristics of the first voiceprint image to obtain a first characteristic information point set, and extracting the characteristics of the second voiceprint image to obtain a second characteristic information point set;
step S706, matching the first characteristic information point set with the second characteristic information point set to obtain a matching result;
step S708, determining the preset voice message when the matching result satisfies a preset condition.
In an optional embodiment of the present application, in the initial state, a first voiceprint image and a second voiceprint image, obtained by the user inputting voice information twice in succession, are respectively collected. Feature extraction is performed on the first voiceprint image to obtain a first feature information point set, and on the second voiceprint image to obtain a second feature information point set; the two point sets are matched to obtain a matching result. When the matching result meets a preset condition, for example when the similarity reaches 80% or 90%, the preset voice information is determined.
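The enrollment step above can be sketched as set matching. Here feature extraction is mocked as already-computed sets of feature points, and the overlap-based similarity measure and the 0.8 default threshold are assumptions chosen to match the 80% example in the text.

```python
def match_feature_sets(set_a, set_b):
    """Similarity = shared feature points / total distinct feature points."""
    if not set_a and not set_b:
        return 1.0
    return len(set_a & set_b) / len(set_a | set_b)


def accept_preset(first_points, second_points, threshold=0.8):
    """Accept the two consecutive recordings as preset voice information
    when their voiceprint feature point sets match closely enough."""
    return match_feature_sets(first_points, second_points) >= threshold
```

Requiring two consecutive recordings to agree is what filters out a noisy or inconsistent enrollment before the preset voice information is stored.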
The embodiments of the present application are described in detail below with reference to a specific implementation manner, and the control method for an application icon provided in the embodiments of the present application may include, but is not limited to, the following method steps:
before using the control function of the application icon, a voice control interface needs to be added in the function interfaces supported by the mobile terminal, a voice control function is expanded in the function interfaces supported by the mobile terminal through the voice control interface, a voice control switch is added in the function setting menu of the mobile terminal, whether the voice control function is started or not is determined through the voice control switch, and then at least one application icon can be controlled to move in the graphic user interface through voice information by using the voice control function.
In order to prevent misjudgment caused by voice control from someone other than the user, the user's preset voice information is recorded in the initial state; for example, a dedicated phrase of the form "APP name" + "Move" may be recorded in advance. For example, when the user manipulates the WeChat APP icon with a finger and the initial display position of the WeChat icon is not within the operable range of the user's finger, the user may issue first voice information "WeChat + Move" for moving the WeChat icon. The operating system responds to the first voice information to trigger the WeChat icon to move in the graphical user interface, and also recognizes the target touch position of the finger, moving the WeChat icon from the initial display position to the target display position. Since the target display position is within the operable range of the user's finger, all application icons can be operated in one-handed mode through the above embodiment of the present application.
As an optional embodiment, in the embodiment of the present application, the target touch position may be determined by projection: the hovering position of the user's finger above the graphical user interface is detected, for example within 1 cm of the screen; under an external light source, the hovering position casts a projection position onto the graphical user interface, and the operating system may determine that projection position as the target touch position.
Example 2
According to an embodiment of the present invention, an apparatus embodiment for implementing the above control method of an application icon is further provided. The apparatus is applied in a graphical user interface rendered in a touch display of a mobile terminal. Fig. 4 is a schematic structural diagram of a control apparatus of an application icon according to an embodiment of the present invention; as shown in fig. 4, the control apparatus includes: an acquisition module 40, an obtaining module 42, and a control module 44, wherein:
an acquisition module 40, configured to acquire first voice information, where the first voice information is used to trigger a target application icon to move in the graphical user interface; an obtaining module 42, configured to obtain a target touch position of a touch medium, where the target touch position is a position to be touched by the touch medium within a touch range of the graphical user interface; a control module 44, configured to control the target application icon to move from an initial display position to a target display position based on the first voice information and the target touch position.
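The three modules above can be illustrated with the following sketch. This is not the patented implementation: icon positions are modeled as plain (x, y) tuples, and all class and method names are assumptions made for illustration.

```python
# Illustrative sketch of the apparatus: acquisition module 40 takes in the
# recognized voice, obtaining module 42 takes in the target touch position,
# and control module 44 moves the icon from its initial display position.

class IconController:
    def __init__(self, icons):
        self.icons = icons                       # {app_name: (x, y)}

    def acquire_voice(self, recognized_text):    # role of acquisition module 40
        return recognized_text

    def obtain_target_touch(self, touch_point):  # role of obtaining module 42
        return touch_point

    def control_move(self, app_name, target):    # role of control module 44
        if app_name in self.icons:
            self.icons[app_name] = target        # initial -> target position
        return self.icons.get(app_name)
```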
It should be noted that the above modules may be implemented by software or hardware. For the latter, the modules may be located in the same processor, or distributed across different processors in any combination.
It should be noted here that the acquisition module 40, the obtaining module 42, and the control module 44 correspond to steps S102 to S106 in embodiment 1; the modules share the same examples and application scenarios as their corresponding steps, but are not limited to the disclosure of embodiment 1. The modules described above may be executed in a computer terminal as part of the apparatus.
For alternative or preferred implementations of this embodiment, reference may be made to the relevant description in embodiment 1, and details are not repeated here.
The control apparatus for the application icon may further include a processor and a memory, where the acquisition module 40, the obtaining module 42, the control module 44, and the like are all stored in the memory as program units, and the processor executes the program units stored in the memory to implement the corresponding functions.
The processor comprises a kernel, and the kernel calls the corresponding program unit from the memory; one or more kernels may be provided. The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
According to an embodiment of the present application, an embodiment of a nonvolatile storage medium is also provided. Optionally, in this embodiment, the nonvolatile storage medium includes a stored program, and when the program runs, the device in which the nonvolatile storage medium is located is controlled to execute any one of the above control methods of an application icon.
Optionally, in this embodiment, the nonvolatile storage medium may be located in any computer terminal of a computer-terminal group in a computer network, or in any mobile terminal of a mobile-terminal group.
Optionally, when the program runs, the device in which the nonvolatile storage medium is located is controlled to perform the following functions: acquiring first voice information, where the first voice information is used to trigger a target application icon to move in the graphical user interface; acquiring a target touch position of a touch medium, where the target touch position is a position to be touched by the touch medium within a touch range of the graphical user interface; and controlling the target application icon to move from an initial display position to a target display position based on the first voice information and the target touch position.
Optionally, when the program runs, the device in which the nonvolatile storage medium is located is further controlled to: add a voice control interface to the function interfaces supported by the mobile terminal and add a voice control switch to a function setting menu of the mobile terminal, where the voice control interface is used for extending a voice control function among the function interfaces supported by the mobile terminal, the voice control function is used for controlling at least one application icon to move in the graphical user interface through voice information, and the voice control switch is used for determining whether to enable the voice control function.
Optionally, when the program runs, the device in which the nonvolatile storage medium is located is further controlled to: detect a hover position of the touch medium over the graphical user interface; acquire the projection position of the hover position in the graphical user interface; and determine the projection position as the target touch position.
Optionally, when the program runs, the device in which the nonvolatile storage medium is located is further controlled to: acquire a historical operation record of the touch medium; and determine the target touch position based on the historical operation record.
Optionally, when the program runs, the device in which the nonvolatile storage medium is located is further controlled to perform one of the following: determining the latest touch position as the target touch position based on the historical operation record; determining the touch position with the highest touch frequency within a first preset duration as the target touch position based on the historical operation record; or analyzing a plurality of different touch positions contacted within a second preset duration based on the historical operation record and determining the target touch position.
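The three history-based strategies above can be sketched as follows, assuming each record is a (timestamp_seconds, (x, y)) pair. The "analysis" strategy is rendered here as a simple centroid of recent touches, which is only one plausible reading of the text, not the patented analysis.

```python
# Three hypothetical strategies for picking the target touch position
# from a historical operation record of (timestamp, (x, y)) entries.

def latest_touch(records):
    """Most recent touch position."""
    return max(records, key=lambda r: r[0])[1]

def most_frequent_touch(records, now, window):
    """Most frequently touched position within the first preset duration."""
    recent = [pos for t, pos in records if now - t <= window]
    return max(set(recent), key=recent.count)

def centroid_touch(records, now, window):
    """Analyze several distinct recent positions; here, their centroid."""
    recent = [pos for t, pos in records if now - t <= window]
    xs = [p[0] for p in recent]
    ys = [p[1] for p in recent]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```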
Optionally, when the program runs, the device in which the nonvolatile storage medium is located is further controlled to: determine whether the initial display position is within the touch range; when the initial display position is outside the touch range, compare the first voice information with second voice information to obtain a comparison result, where the second voice information is preset voice information; when the comparison result meets a preset condition, determine the target display position based on the target touch position; and control the target application icon to move from the initial display position to the target display position.
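This decision flow can be sketched as below, under two assumptions: the one-hand touch range is modeled as an axis-aligned rectangle, and the voice comparison is plain string equality (the patent leaves the preset condition unspecified).

```python
# Sketch of the move decision: only relocate the icon when it starts
# outside the reachable touch range AND the voice matches the preset.

def move_icon_if_needed(initial_pos, touch_range, first_voice,
                        preset_voice, target_pos):
    x0, y0, x1, y1 = touch_range      # assumed rectangular reachable area
    x, y = initial_pos
    inside = x0 <= x <= x1 and y0 <= y <= y1
    if inside:
        return initial_pos            # already reachable; nothing to do
    if first_voice != preset_voice:   # comparison fails the preset condition
        return initial_pos
    return target_pos                 # move from initial to target position
```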
Optionally, when the program runs, the device in which the nonvolatile storage medium is located is further controlled to: respectively acquire a first voiceprint image and a second voiceprint image in an initialization state, where the first voiceprint image and the second voiceprint image are obtained by inputting the voice information twice in succession; perform feature extraction on the first voiceprint image to obtain a first feature information point set, and on the second voiceprint image to obtain a second feature information point set; match the first feature information point set with the second feature information point set to obtain a matching result; and when the matching result meets a preset condition, determine the preset voice information.
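The enrollment step above can be sketched as follows, assuming each voiceprint image has already been reduced to a set of feature points and using Jaccard similarity of the two sets as the matching criterion. Both the feature representation and the threshold are illustrative assumptions, not the patented matching method.

```python
# Hypothetical enrollment check: the two consecutive recordings must
# share enough feature points (Jaccard similarity >= threshold) before
# their common features are accepted as the preset voice information.

def enroll(points_a, points_b, threshold=0.8):
    a, b = set(points_a), set(points_b)
    similarity = len(a & b) / len(a | b)
    if similarity >= threshold:
        return a & b      # accept: shared features define the preset voice
    return None           # reject: recordings disagree, user must re-record
```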
According to an embodiment of the present application, an embodiment of a processor is also provided. Optionally, in this embodiment, the processor is configured to run a program that, when running, executes any one of the above control methods of an application icon.
An embodiment of the present application provides an electronic device, which includes a memory and a processor, where the memory stores a computer program, and the processor is configured to run the computer program to execute any one of the above control methods for an application icon.
The present application further provides a computer program product which, when executed on a data processing device, is adapted to execute a program initialized with the steps of any one of the above control methods of an application icon.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technical content can be implemented in other manners. The above-described embodiments of the apparatus are merely illustrative, and for example, the above-described division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and improvements without departing from the principle of the present invention, and these modifications and improvements should also fall within the protection scope of the present invention.

Claims (10)

1. A control method of an application icon is applied to a graphical user interface rendered in a touch display of a mobile terminal, and comprises the following steps:
acquiring first voice information, wherein the first voice information is used for triggering a target application icon to move in the graphical user interface;
acquiring a target touch position of a touch medium, wherein the target touch position is a position to be touched by the touch medium within a touch range of the graphical user interface;
controlling the target application icon to move from an initial display position to a target display position based on the first voice information and the target touch position;
wherein obtaining the target touch position of the touch medium comprises: detecting a hover position of the touch medium over the graphical user interface; acquiring the projection position of the suspension position in the graphical user interface; determining the projection position as the target touch position.
2. The method of claim 1, further comprising:
the method comprises the steps of adding a voice control interface to the function interfaces supported by the mobile terminal and adding a voice control switch to a function setting menu of the mobile terminal, wherein the voice control interface is used for extending a voice control function among the function interfaces supported by the mobile terminal, the voice control function is used for controlling at least one application icon to move in the graphical user interface through voice information, and the voice control switch is used for determining whether to enable the voice control function.
3. The method of claim 1, wherein obtaining the target touch location of the touch medium comprises:
acquiring a historical operation record of the touch medium;
and determining the target touch position based on the historical operation record.
4. The method of claim 3, wherein determining the target touch location based on the historical operating record comprises one of:
determining the latest touch position as the target touch position based on the historical operation record;
determining the touch position with the highest touch frequency within a first preset time length as the target touch position based on the historical operation record;
and analyzing a plurality of different touch positions respectively contacted within a second preset time length based on the historical operation records, and determining the target touch position.
5. The method of claim 1, wherein controlling the target application icon to move from the initial display position to the target display position based on the first voice information and the target touch position comprises:
determining whether the initial display position is within the touch range;
when the initial display position is located outside the touch control range, comparing the first voice information with second voice information to obtain a comparison result, wherein the second voice information is preset voice information;
when the comparison result meets a preset condition, determining the target display position based on the target touch position;
controlling the target application icon to move from the initial display position to the target display position.
6. The method of claim 5, further comprising:
respectively acquiring a first voiceprint image and a second voiceprint image in an initialization state, wherein the first voiceprint image and the second voiceprint image are voiceprint images obtained by continuously inputting voice information twice;
performing feature extraction on the first voiceprint image to obtain a first feature information point set, and performing feature extraction on the second voiceprint image to obtain a second feature information point set;
matching the first characteristic information point set with the second characteristic information point set to obtain a matching result;
and when the matching result meets a preset condition, determining the preset voice information.
7. An apparatus for controlling an application icon, the apparatus being used in a graphical user interface rendered in a touch display of a mobile terminal, the apparatus comprising:
the system comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring first voice information, and the first voice information is used for triggering a target application icon to move in the graphical user interface;
the acquisition module is used for acquiring a target touch position of a touch medium, wherein the target touch position is a position to be touched by the touch medium in a touch range of the graphical user interface;
the control module is used for controlling the target application icon to move from an initial display position to a target display position based on the first voice information and the target touch position;
wherein the obtaining module is further configured to: detecting a hover position of the touch medium over the graphical user interface; acquiring the projection position of the suspension position in the graphical user interface; determining the projection position as the target touch position.
8. A non-volatile storage medium, wherein a computer program is stored in the storage medium, wherein the computer program is configured to execute the control method of an application icon according to any one of claims 1 to 6 when running.
9. A processor for running a program, wherein the program is configured to execute the control method of the application icon according to any one of claims 1 to 6 when running.
10. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is configured to execute the computer program to perform the method of controlling an application icon as claimed in any one of claims 1 to 6.
CN202010911380.XA 2020-09-02 2020-09-02 Control method and device of application icons Active CN112083843B (en)

Publications (2)

Publication Number Publication Date
CN112083843A 2020-12-15
CN112083843B 2022-05-27




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant