CN108829371B - Interface control method and device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN108829371B
Authority
CN
China
Prior art keywords: interface, coordinates, target element, target, terminal
Prior art date
Legal status: Active (an assumption, not a legal conclusion)
Application number
CN201810629387.5A
Other languages
Chinese (zh)
Other versions
CN108829371A (en)
Inventor
Chen Yan (陈岩)
Current Assignee (the listed assignees may be inaccurate)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (an assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810629387.5A priority Critical patent/CN108829371B/en
Publication of CN108829371A publication Critical patent/CN108829371A/en
Application granted granted Critical
Publication of CN108829371B publication Critical patent/CN108829371B/en

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 — Sound input; Sound output
    • G06F 3/167 — Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Abstract

The application discloses an interface control method and device, a storage medium, and an electronic device. The method comprises the following steps: while displaying a first interface, acquiring a voice instruction from a user; determining the operation to be completed by the voice instruction, wherein the operation is to enter a second interface of a target element contained in the first interface; acquiring a target position of the target element in the first interface; and performing a simulated click on the target position to enter the second interface of the target element. The embodiment can further enrich the voice control functions of the terminal.

Description

Interface control method and device, storage medium and electronic equipment
Technical Field
The application belongs to the technical field of terminals, and particularly relates to an interface control method and device, a storage medium, and an electronic device.
Background
With the continuous development of technology, the ways in which users interact with terminals have become increasingly rich. For example, many terminals have voice-assistant applications installed. A user can interact with the terminal by voice through such an application, thereby controlling the terminal to perform certain operations. However, in the related art, the voice control functions that a terminal can implement are still limited.
Disclosure of Invention
The embodiment of the application provides an interface control method, an interface control device, a storage medium and electronic equipment, which can further improve the voice control function of a terminal.
The embodiment of the application provides an interface control method, which comprises the following steps:
while displaying a first interface, acquiring a voice instruction from a user;
determining an operation to be completed by the voice instruction, wherein the operation is to enter a second interface of a target element included in the first interface;
acquiring a target position of the target element in the first interface;
and carrying out simulated clicking on the target position to enter a second interface of the target element.
An embodiment of the present application provides an interface control apparatus, including:
the first acquisition module is used for acquiring a voice instruction from a user while the terminal displays the first interface;
the determining module is used for determining an operation to be completed by the voice instruction, wherein the operation is to enter a second interface of a target element contained in the first interface;
the second acquisition module is used for acquiring a target position of the target element in the first interface;
and the operation module is used for carrying out simulated clicking on the target position so as to enter a second interface of the target element.
The embodiment of the present application provides a storage medium, on which a computer program is stored, and when the computer program is executed on a computer, the computer is caused to execute the steps in the interface control method provided by the embodiment of the present application.
The embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the processor is configured to execute the steps in the interface control method provided in the embodiment of the present application by calling the computer program stored in the memory.
In this embodiment, when the terminal is located in the first interface, the terminal may obtain an instruction sent by the user to instruct the terminal to enter an interface of a certain target element in the first interface. Then, the terminal may determine a target position where the target element is located in the first interface, and perform simulated clicking on the target position, thereby entering a second interface of the target element. The embodiment can control the terminal to enter the interface of a certain element contained in a certain interface through the voice command and the simulated click, so that the embodiment can further improve the voice control function of the terminal.
Drawings
The technical solution and the advantages of the present invention will be apparent from the following detailed description of the embodiments of the present invention with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of an interface control method according to an embodiment of the present application.
Fig. 2 is another schematic flow chart of the interface control method according to the embodiment of the present application.
Fig. 3 to fig. 6 are scene schematic diagrams of an interface control method according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of an interface control device according to an embodiment of the present application.
Fig. 8 is another schematic structural diagram of an interface control device according to an embodiment of the present application.
Fig. 9 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application.
Fig. 10 is another schematic structural diagram of a mobile terminal according to an embodiment of the present application.
Detailed Description
Referring now to the drawings, in which like numerals represent like elements, the principles of the present invention are illustrated as being implemented in a suitable computing environment. The following description is based on illustrated embodiments of the invention and should not be taken as limiting the invention with regard to other embodiments that are not detailed herein.
It can be understood that the execution subject of the embodiment of the present application may be a terminal device such as a smart phone or a tablet computer.
Referring to fig. 1, fig. 1 is a schematic flow chart of an interface control method according to an embodiment of the present application, where the flow chart may include:
at 101, voice instructions are obtained from a user while at a first interface.
With the continuous development of technology, the ways in which users interact with terminals have become increasingly rich. For example, many terminals have voice-assistant applications installed. A user can interact with the terminal by voice through such an application, thereby controlling the terminal to perform certain operations. However, in the related art, the voice control functions that a terminal can implement are still limited. For example, in the related art, a voice assistant application can only open an application for the user; it cannot enter the interfaces of function modules inside that application. If the user wants to enter the interface of a function module inside the application, the user still needs to click the function module manually.
In this embodiment, when the terminal displays the first interface of an application and the user needs to enter the interface of a certain function module of that application, the user may issue a voice instruction, and the terminal may then obtain the voice instruction from the user.
At 102, an operation to be performed by the voice command is determined, wherein the operation is to enter a second interface of the target element included in the first interface.
For example, after acquiring a voice command from a user, the terminal may determine an operation to be performed by the voice command, where the operation is an interface (i.e., a second interface) entering a certain element (i.e., a target element) included in the first interface.
It should be noted that the elements included in the interface refer to function modules or function areas in the interface that allow a user to click. If a user manually clicks a certain function module or function area, the terminal enters an interface of the function module or function area.
For example, when the instant messaging application A is opened, its home interface is displayed. The bottom of the home interface may include function modules that the user is allowed to click, such as "message", "address book", and "my". Each such function module is an element contained in the home interface. In addition, suppose the "message" function module is selected by default; its interface may contain chat windows between the user and friends. If the user clicks the chat window with a friend, the terminal enters the chat interface with that friend. That is, the "message" function module may include multiple chat windows, and each chat window is also an element contained in the home interface.
In 103, a target position of the target element in the first interface is obtained.
For example, after determining that the voice command of the user is to enter the second interface of the target element included in the first interface, the terminal may obtain a position of the target element in the first interface, that is, a target position.
At 104, a simulated click is made on the target location to access a second interface of the target element.
For example, after the target position of the target element is obtained, the terminal may perform simulated clicking on the target position, so as to enter a second interface of the target element.
A simulated click means that the terminal simulates a user's click action to trigger the operation corresponding to that click. For example, when the user manually clicks the "my" function module at the bottom of the home interface of the instant messaging application A, the terminal enters the interface of the "my" function module. With a simulated click, the user does not need to click the "my" function module manually; instead, the terminal determines the target position of the "my" function module on the home interface and then reproduces the trigger effect of the user manually clicking that position, so that the terminal automatically enters the interface of the "my" function module.
It can be understood that, in this embodiment, when the terminal is in the first interface, the terminal may obtain an instruction issued by the user to instruct the terminal to enter an interface of a certain target element in the first interface. Then, the terminal may determine a target position where the target element is located in the first interface, and perform simulated clicking on the target position, thereby entering a second interface of the target element. The embodiment can control the terminal to enter the interface of a certain element contained in a certain interface through the voice command and the simulated click, so that the embodiment can further improve the voice control function of the terminal.
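Steps 101–104 above can be sketched as a minimal pipeline. This is an illustrative sketch, not the patent's implementation: the function names (`recognize_intent`, `handle_voice_command`, `simulate_click`) and the naive command parsing are assumptions, while the element names and grid coordinates are taken from the example given later in the description.

```python
# Illustrative sketch of steps 101-104; all function names are hypothetical.

def recognize_intent(voice_text):
    """Naively parse a command like "please click 'my'" into an element name."""
    if "'" in voice_text:
        return voice_text.split("'")[1]
    return voice_text.rsplit(" ", 1)[-1]

def handle_voice_command(voice_text, element_positions, simulate_click):
    """element_positions maps element name -> (x, y); simulate_click taps there."""
    target = recognize_intent(voice_text)      # step 102: determine the operation
    pos = element_positions.get(target)        # step 103: look up the target position
    if pos is not None:
        simulate_click(pos)                    # step 104: simulated click
    return target, pos

clicks = []
target, pos = handle_voice_command(
    "please click 'my'",
    {"message": (1, 1), "address book": (3, 1), "my": (5, 1)},
    clicks.append,
)
# target == "my", pos == (5, 1), and exactly one simulated click was issued.
```

In a real terminal, `simulate_click` would dispatch an input event at the target position rather than append to a list; the list here only makes the control flow observable.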
Referring to fig. 2, fig. 2 is another schematic flow chart of an interface control method according to an embodiment of the present application, where the flow chart may include:
in 201, while in the first interface, the terminal obtains voice instructions from the user.
For example, as shown in fig. 3, the user issues the voice command "please open the instant messaging application A" to the terminal. After the terminal receives the voice command through the voice assistant, it may extract voiceprint features from the voice. The terminal may then compare the extracted voiceprint features with the preset voiceprint features of the owner. After determining that the current user is the owner, the terminal can start the instant messaging application A.
After the instant messaging application A is started, its interface is displayed. For example, fig. 4 shows the first interface after the terminal opens the instant messaging application A. Suppose the voice assistant cannot access the interfaces of the function modules inside the instant messaging application A; the user therefore cannot enter the interface corresponding to a function module of the instant messaging application A directly through the voice assistant.
In this embodiment, when a user needs to enter an interface corresponding to a certain function module of the instant messaging application a, the user may send a voice instruction to the terminal, and the terminal may obtain the voice instruction from the user.
For example, as shown in fig. 4, the user issues a voice instruction of "please click 'my'" to the terminal.
In 202, the terminal determines an operation to be performed by the voice command, wherein the operation is to enter a second interface of the target element included in the first interface.
For example, after acquiring a voice command from a user, the terminal may perform voice recognition on the voice command. The operation to be completed by the voice command can be analyzed through voice recognition. For example, through voice recognition, the terminal determines that the operation to be completed by the voice command is to enter an interface of a certain target element included in the first interface, that is, the second interface.
For example, through voice recognition, the terminal determines that the voice command "please click 'my'" acquired from the user is to enter the interface of the function module "my".
In 203, the terminal identifies the elements contained in the first interface and assigns coordinates to each element.
For example, after determining that the operation to be performed by the voice command acquired from the user is to enter the second interface of the target elements included in the first interface, the terminal may identify the elements included in the first interface and assign coordinates to each element.
In one embodiment, the terminal may establish a planar rectangular coordinate system with a vertex of a lower left corner of the display screen as an origin of coordinates, and assign coordinates to each element included in the first interface according to the planar rectangular coordinate system.
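Note that most mobile UI frameworks report pixel coordinates with the origin at the top-left corner of the screen, whereas the embodiment above places the origin at the bottom-left vertex. A hypothetical conversion helper (not part of the patent) makes the relationship concrete:

```python
# Hypothetical helper: convert top-left-origin pixel coordinates (as most
# UI frameworks report them) into the bottom-left-origin plane rectangular
# coordinate system described in this embodiment.

def to_bottom_left_origin(x_px, y_px, screen_height_px):
    """Flip the vertical axis so that y grows upward from the bottom edge."""
    return x_px, screen_height_px - y_px

# A point 300 px below the top of a 1920 px tall screen sits 1620 units
# above the bottom-left origin; the horizontal coordinate is unchanged.
converted = to_bottom_left_origin(100, 300, 1920)
```

The screen height of 1920 px is an assumed example value.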
It should be noted that the elements included in the interface refer to function modules or function areas in the interface that allow a user to click. If a user manually clicks a certain function module or function area, the terminal enters an interface of the function module or function area.
For example, in a first interface as shown in fig. 4, the terminal may recognize that the elements contained therein may include: messages, address book, my, first message, second message, and third message. These elements are all functional modules or functional areas that allow the user to manually click on. For example, when the user manually clicks on the "address book" element, the "address book" function module may be entered to view the contact information. Alternatively, when the user manually clicks on the element "first message", the details of the first message may be viewed. In addition, since the status bar on the screen is not clickable, the terminal may not determine information on the status bar as a corresponding element.
After identifying the elements included in the first interface, the terminal may assign a corresponding coordinate to each element. For example, the terminal assigns coordinates of (1,1) to "message", coordinates of (3,1) to "address book", coordinates of (5,1) to "my", coordinates of (3,3) to "first message", coordinates of (3,5) to "second message", and coordinates of (3,7) to "third message".
In one embodiment, the terminal may identify the element included in the first interface by means of picture identification. Namely, the terminal takes the first interface as a picture, recognizes characters or words contained in the first interface by using a picture recognition mode, and determines the recognized characters or words as an element.
For example, in one embodiment, after entering the first interface, the terminal may capture a screenshot of the first interface, thereby obtaining a picture. Then, the terminal may perform picture recognition on the picture, and determine characters or words recognized from the picture as elements.
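Turning picture-recognition output into element coordinates could look like the following sketch. The OCR engine itself is stubbed out: each entry is a (recognized text, bounding box) pair with hypothetical pixel values, so only the element-extraction step is shown, under the simplifying assumption that an element's coordinate is the center of its recognized text region.

```python
# Stubbed picture-recognition output for the first interface: each entry is
# (recognized text, (left, top, width, height)) in screen pixels.
# All pixel values are hypothetical.
ocr_regions = [
    ("message",      (40, 1800, 160, 48)),
    ("address book", (320, 1800, 160, 48)),
    ("my",           (600, 1800, 160, 48)),
]

def elements_from_ocr(regions):
    """Treat each recognized word as an element and assign it the center of
    its bounding box as its coordinate (a simplifying assumption)."""
    return {
        text: (left + width // 2, top + height // 2)
        for text, (left, top, width, height) in regions
    }

coords = elements_from_ocr(ocr_regions)
# coords now maps each element name to a clickable center point.
```

A real implementation would feed the screenshot to an OCR engine and likely also filter out non-clickable regions such as the status bar, as the description notes.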
In 204, the terminal generates a correspondence of each element and its assigned coordinates.
For example, after assigning coordinates to each element included in the first interface, the terminal may generate a corresponding relationship between each element and the coordinates assigned thereto, and store the corresponding relationship.
In 205, the terminal obtains the coordinates of the target element from the allocated coordinates, and determines the coordinates of the target element as a target position of the target element in the first interface.
For example, after assigning corresponding coordinates to each element in the first interface, the terminal may obtain coordinates of the target element from the assigned coordinates, and determine the coordinates of the target element as a target position of the target element in the first interface.
For example, if the target element is "my" and the terminal assigns the target element with the coordinate of (5,1), the terminal may determine the coordinate of (5,1) as the target position of the target element "my" in the first interface.
At 206, the terminal makes a simulated click on the target location to access a second interface of the target element.
For example, after the target position of the target element in the first interface is determined, the terminal may perform simulated click on the target position, so as to enter the interface of the target element, that is, the second interface.
For example, after determining the target position of the target element "my", the terminal may click on the position with the coordinate (5,1) in a simulation manner, so as to enter the "my" interface. For example, the "My" interface may be as shown in FIG. 5, which contains functional modules such as "My wallet", "My favorites", "My photo album", "My settings", etc.
The simulated click means that the terminal simulates a manual click action of a user so as to trigger the terminal to complete an operation corresponding to the click action. For example, the user may enter the interface shown in FIG. 5 by manually clicking the "My" function in FIG. 4. Then, after determining the coordinates (5,1) of the target element "my" in fig. 4, the terminal may also enter the interface shown in fig. 5 by simulating the position of the click (5, 1).
For example, after entering the "my" interface (i.e., the second interface) shown in fig. 5, the terminal may likewise identify the elements contained in the second interface through picture recognition and assign corresponding coordinates to each element. For example, the elements contained in the second interface may include "message", "address book", "my wallet", "my favorites", "my album", "my settings", and the like.
Thereafter, the terminal may assign coordinates (1,1) for "message", coordinates (3,1) for "address book", coordinates (5,1) for "my", coordinates (3,3) for "my settings", coordinates (3,5) for "my album", coordinates (3,7) for "my favorites", and coordinates (3,9) for "my wallet".
Thereafter, when the user issues a voice instruction of "click my wallet", the terminal may determine "my wallet" as a target element and acquire coordinates assigned by the terminal to the target element of "my wallet". After obtaining the coordinates (3,9) allocated to the my wallet by the terminal, the terminal may click on the position with the coordinates (3,9) in a simulation manner, so as to enter the interface of the my wallet. For example, the interface for "my wallet" may be as shown in FIG. 6.
At 207, when the terminal is in the first interface again and needs to enter the second interface of the target element, the terminal detects whether the current display screen is in a split screen display state.
For example, after entering the interface "my wallet" as shown in fig. 6, the user exits the instant messaging application a. Some time later, the user opens the instant messaging application a again, at which point the terminal is again in the first interface as shown in fig. 4, and the user issues a "click my" voice instruction to the terminal. At this time, the terminal may determine that the interface of the target element "my" (i.e., the second interface) needs to be entered, and then the terminal may first detect whether the current display screen is in the split-screen display state.
If the current display screen is detected to be in the split-screen display state, the terminal can select a target display screen area where the instant messaging application A is located, and picture recognition is carried out on the target display screen area, so that elements contained in the target display screen area are recognized. The terminal may then assign corresponding coordinates to each element in the target display screen area. Then, the terminal may acquire the coordinates to which the target element "my" is assigned and determine the coordinates as the target position. Finally, the terminal may enter the "my" interface by a simulated click of the target location.
If the current display screen is not detected to be in the split-screen display state, 208 is entered.
In 208, if the current display screen is not in the split-screen display state, the terminal obtains the coordinates of the target element according to the corresponding relationship, and performs simulated clicking on the position corresponding to the coordinates of the target element.
For example, when the terminal detects that the current display screen is not in the split-screen display state, the terminal may obtain coordinates of the target element "my" according to the correspondence between each element in the first interface generated in 204 and the coordinates allocated to the element, and perform simulated clicking on a position corresponding to the coordinates, thereby entering the "my" interface.
It can be understood that, when the current display screen is not in the split-screen display state, the terminal may obtain the coordinates of the target element according to the correspondence between elements and coordinates generated in 204, without re-identifying the elements contained in the first interface and re-assigning coordinates to them. This can improve the processing speed of the terminal. However, if the current display screen is in the split-screen display state, the distribution positions of the elements in the first interface may have changed relative to the first interface shown in fig. 4; therefore, the terminal needs to re-identify the elements contained in the target display screen area and re-assign coordinates to each element. The terminal then acquires the coordinates of the target element and performs a simulated click on the position of those coordinates to enter the interface of the target element.
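The caching decision of steps 207–208 can be sketched as follows. This is a hedged illustration: the function names are hypothetical, and `reidentify` stands in for the full screenshot-plus-recognition pass described above.

```python
# Sketch of steps 207-208: reuse cached coordinates unless split-screen
# display may have moved the elements. All names are hypothetical.

def locate_target(target, cached_coords, is_split_screen, reidentify):
    """Return the target element's coordinates, re-identifying the interface
    elements only when the display is in the split-screen state."""
    if is_split_screen:
        cached_coords = reidentify()  # element positions may have shifted
    return cached_coords.get(target)

cache = {"my": (5, 1)}
# Not split-screen: the cached coordinates are reused directly.
pos_normal = locate_target("my", cache, False, lambda: {})
# Split-screen: elements are re-identified before the lookup.
pos_split = locate_target("my", cache, True, lambda: {"my": (2, 1)})
```

Skipping re-identification in the common case is what yields the processing-speed benefit noted in the text.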
In an embodiment, after the step of generating the corresponding relationship between each element and its assigned coordinate by the terminal in 204, the method may further include the following steps: the first version number of the application is recorded.
Then, when the terminal is located on the first interface again and needs to enter the second interface of the target element, the terminal obtains the coordinates of the target element according to the corresponding relationship generated in 204, and performs a simulated click on the position corresponding to the coordinates of the target element, which may include:
when the terminal is in the first interface again and needs to enter a second interface of the target element, the terminal obtains a second version number of the application;
and if the second version number is consistent with the first version number, the terminal acquires the coordinates of the target element according to the corresponding relation and carries out simulated clicking on the position corresponding to the coordinates of the target element.
For example, the elements and/or the distribution positions of the elements included in each interface of the application may change after the version of the application is updated. Thus, the correspondence of each element in the first interface generated in 204 and its assigned coordinates may no longer be applicable to the first interface of the version-updated application.
Thus, in the present embodiment, after the step in which the terminal generates the correspondence of each element and its assigned coordinates in 204, the terminal may record the first version number of the application. When the application is located in the first interface again and needs to enter the second interface of the target element, if it is determined that the version update of the application does not occur, that is, the second version number of the application acquired by the terminal is consistent with the first version number at this time, the terminal may acquire the coordinates of the target element according to the correspondence generated in 204, and perform simulated click on the position corresponding to the coordinates of the target element, thereby accessing the interface of the target element.
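The version-number guard can be sketched in the same style. The version strings, cache layout, and function names below are all assumptions made for illustration; the patent only specifies that the cached correspondence is reused when the recorded and current version numbers match.

```python
# Sketch of the version check: the cached element-coordinate correspondence
# is reused only if the application has not been updated since the
# correspondence was generated. Version strings are hypothetical.

def coords_still_valid(recorded_version, current_version):
    """A cached mapping is trusted only for the exact recorded version."""
    return recorded_version == current_version

cache = {"version": "7.0.12", "coords": {"my": (5, 1)}}

def lookup(target, current_version):
    """Use the cache when valid; otherwise signal that the interface
    elements must be re-identified and the cache rebuilt."""
    if coords_still_valid(cache["version"], current_version):
        return cache["coords"].get(target)
    return None  # caller should re-identify elements and regenerate the cache

same_version = lookup("my", "7.0.12")   # cache hit
after_update = lookup("my", "7.1.0")    # cache invalidated by an update
```

Returning `None` on a version mismatch mirrors the fallback in the text: element layouts may change across versions, so stale coordinates must not be clicked.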
Referring to fig. 7, fig. 7 is a schematic structural diagram of an interface control device according to an embodiment of the present application. The interface control apparatus 300 may include: a first obtaining module 301, a determining module 302, a second obtaining module 303, and an operating module 304.
The first obtaining module 301 is configured to obtain a voice command from a user while the terminal is in the first interface.
A determining module 302, configured to determine an operation to be completed by the voice instruction, where the operation is to enter a second interface of the target element included in the first interface.
A second obtaining module 303, configured to obtain a target position of the target element in the first interface.
And the operation module 304 is configured to perform simulated clicking on the target position to enter a second interface of the target element.
In one embodiment, the second obtaining module 303 may be configured to:
identifying elements included in the first interface and assigning coordinates to each of the elements;
and acquiring the coordinates of the target element from the allocated coordinates, and determining the coordinates of the target element as a target position of the target element in the first interface.
Referring to fig. 8, fig. 8 is another schematic structural diagram of an interface control device according to an embodiment of the present disclosure. In an embodiment, the interface control device 300 may further include: a generation module 305 and a recording module 306.
A generating module 305, configured to generate a corresponding relationship between each of the elements and the coordinates allocated to the elements.
Then, the operation module 304 may be configured to: when the terminal is in the first interface again and needs to enter the second interface of the target element, acquire the coordinates of the target element according to the corresponding relationship, and perform a simulated click on the position corresponding to the coordinates of the target element.
In one embodiment, the operation module 304 is further configured to:
when the terminal is in the first interface again and needs to enter the second interface of the target element, detecting whether the current display screen is in a split-screen display state;
and if the current display screen is not in a split screen display state, acquiring the coordinates of the target element according to the corresponding relation, and performing simulated clicking on the position corresponding to the coordinates of the target element.
A recording module 306 for recording the first version number of the application.
Then, the operation module 304 may be configured to:
when the application is located on the first interface again and needs to enter a second interface of the target element, acquiring a second version number of the application; and if the second version number is consistent with the first version number, acquiring the coordinates of the target element according to the corresponding relation, and performing simulated clicking on the position corresponding to the coordinates of the target element.
The embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed on a computer, the computer is caused to execute the steps in the interface control method provided in the embodiment.
The embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the processor is configured to execute the steps in the interface control method provided in this embodiment by calling the computer program stored in the memory.
For example, the electronic device may be a mobile terminal such as a tablet computer or a smart phone. Referring to fig. 9, fig. 9 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application.
The mobile terminal 400 may include components such as a display 401, a memory 402, and a processor 403. Those skilled in the art will appreciate that the mobile terminal structure shown in fig. 9 does not limit the mobile terminal; it may include more or fewer components than shown, combine certain components, or arrange the components differently.
The display 401 may be used to display information such as images and text.
The memory 402 may be used to store applications and data. The memory 402 stores applications containing executable code. The application programs may constitute various functional modules. The processor 403 executes various functional applications and data processing by running an application program stored in the memory 402.
The processor 403 is the control center of the mobile terminal. It connects the various parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing the application programs stored in the memory 402 and calling the data stored in the memory 402, thereby monitoring the mobile terminal as a whole.
In this embodiment, the processor 403 in the mobile terminal loads the executable code corresponding to the processes of one or more application programs into the memory 402 according to the following instructions, and runs the application programs stored in the memory 402, thereby implementing the following steps:
when the terminal is in the first interface, acquiring a voice instruction from a user;
determining an operation to be completed by the voice instruction, wherein the operation is to enter a second interface of a target element contained in the first interface;
acquiring a target position of the target element in the first interface;
and performing a simulated click at the target position to enter the second interface of the target element.
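The four steps above can be sketched as a minimal, illustrative flow. The helper callables (`recognize`, `locate`, `click`) and the "open ..." instruction format are assumptions for illustration, not APIs or formats specified by the disclosure:

```python
def handle_voice_instruction(interface, recognize, locate, click):
    """Illustrative flow: voice instruction -> target element -> simulated click.

    `recognize`, `locate`, and `click` are hypothetical callables standing in
    for the terminal's speech recognition, element lookup, and click injection.
    """
    # Step 1: acquire the voice instruction while the first interface is shown.
    instruction = recognize()
    # Step 2: determine the operation, i.e. which target element to enter
    # (here, naively, by stripping an assumed "open " prefix).
    target = instruction.strip().lower().removeprefix("open ").strip()
    # Step 3: acquire the target position of the element in the interface.
    position = locate(interface, target)
    if position is None:
        return None
    # Step 4: perform a simulated click at that position.
    click(position)
    return position
```

On a real terminal, steps 1 and 4 would be backed by the platform's speech-recognition and input-injection services; the sketch only fixes the order of the four steps.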
Referring to fig. 10, the mobile terminal 500 may include a display 501, a memory 502, a processor 503, an input unit 504, an output unit 505, and the like.
The display 501 may be used to display images, text, and the like.
The memory 502 may be used to store applications and data. Memory 502 stores applications containing executable code. The application programs may constitute various functional modules. The processor 503 executes various functional applications and data processing by running an application program stored in the memory 502.
The processor 503 is the control center of the mobile terminal. It connects the various parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing the application programs stored in the memory 502 and calling the data stored in the memory 502, thereby monitoring the mobile terminal as a whole.
The input unit 504 may be used to receive input numbers, character information, or user characteristic information (such as a fingerprint), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The output unit 505 may be used to display information input by or provided to the user, as well as the various graphical user interfaces of the mobile terminal, which may be composed of graphics, text, icons, video, and any combination thereof. The output unit may include a display panel.
In this embodiment, the processor 503 in the mobile terminal loads the executable code corresponding to the processes of one or more application programs into the memory 502 according to the following instructions, and runs the application programs stored in the memory 502, thereby implementing the following steps:
when the terminal is in the first interface, acquiring a voice instruction from a user;
determining an operation to be completed by the voice instruction, wherein the operation is to enter a second interface of a target element contained in the first interface;
acquiring a target position of the target element in the first interface;
and performing a simulated click at the target position to enter the second interface of the target element.
In one embodiment, when performing the step of obtaining the target position of the target element in the first interface, the processor 503 may perform: identifying the elements included in the first interface and assigning coordinates to each element; and acquiring the coordinates of the target element from the assigned coordinates and determining those coordinates as the target position of the target element in the first interface.
In one embodiment, after the step of assigning coordinates to each element, the processor 503 may further perform: generating a correspondence between each element and the coordinates assigned to it.
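As an illustration, the element-to-coordinate correspondence described here could be held in a simple mapping built when the interface is first identified. The function names and the `(name, x, y)` input shape are assumptions for the sketch; the patent does not specify the recognition mechanism or data layout:

```python
def build_correspondence(elements):
    """Map each identified element name to its assigned coordinates.

    `elements` is a list of (name, x, y) tuples, e.g. produced by an
    interface-recognition step; how elements are recognized is out of scope.
    """
    return {name: (x, y) for name, x, y in elements}


def lookup_target(correspondence, target):
    """Return the stored coordinates for the target element, or None."""
    return correspondence.get(target)
```

With such a map, a repeat visit to the same interface can resolve the target element to coordinates without re-running recognition.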
Then, after the step of entering the second interface of the target element, the processor 503 may further perform: when the terminal is on the first interface again and the second interface of the target element needs to be entered, acquiring the coordinates of the target element according to the correspondence and performing a simulated click at the position corresponding to the coordinates of the target element.
In one embodiment, when the terminal is on the first interface again and the second interface of the target element needs to be entered, the processor 503 may perform the step of acquiring the coordinates of the target element according to the correspondence and simulating a click as follows: detecting whether the display screen is in a split-screen display state; and if it is not, acquiring the coordinates of the target element according to the correspondence and performing a simulated click at the position corresponding to the coordinates of the target element.
In one embodiment, after the step of generating the correspondence between each element and its assigned coordinates, the processor 503 may further perform: recording a first version number of the application.
Then, when the terminal is on the first interface again and the second interface of the target element needs to be entered, the processor 503 may perform: acquiring a second version number of the application; and if the second version number is consistent with the first version number, acquiring the coordinates of the target element according to the correspondence and performing a simulated click at the position corresponding to the coordinates of the target element.
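Combining the two guards described above (split-screen state and version consistency), a repeat visit might reuse cached coordinates only when both checks pass. This is a sketch under assumed names, not the patent's implementation:

```python
def coordinates_for_repeat_visit(correspondence, target,
                                 split_screen, current_version, cached_version):
    """Reuse cached coordinates only if the stored layout is still trustworthy.

    Returns None when the cache must be bypassed, signalling the caller to
    re-identify the on-screen elements and reassign coordinates.
    """
    if split_screen:
        # Split-screen display changes element positions; the cache is stale.
        return None
    if current_version != cached_version:
        # An app update may have rearranged the interface.
        return None
    return correspondence.get(target)
```

A `None` result corresponds to the fallback path in the disclosure: re-identify the elements on the current screen and click the freshly assigned coordinates.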
The above embodiments each have their own emphasis; for parts not described in detail in a given embodiment, refer to the detailed description of the interface control method above, which is not repeated here.
The interface control device provided in the embodiments of the present application and the interface control method in the above embodiments belong to the same concept; any method provided in the method embodiments may be run on the interface control device, and its specific implementation is described in detail in the method embodiments and is not repeated here.
It should be noted that, as those skilled in the art will understand, all or part of the processes of the interface control method described in the embodiments of the present application may be completed by controlling the relevant hardware through a computer program. The computer program may be stored in a computer-readable storage medium, such as a memory, and executed by at least one processor, and its execution may include the processes of the embodiments of the interface control method. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
In the interface control device according to the embodiments of the present application, the functional modules may be integrated into one processing chip, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in hardware or as a software functional module. If the integrated module is implemented as a software functional module and sold or used as a stand-alone product, it may also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disk.
The interface control method, device, storage medium, and electronic device provided in the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present invention, and the description of the above embodiments is intended only to help in understanding the method and its core idea. Meanwhile, those skilled in the art may make changes to the specific embodiments and application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (5)

1. An interface control method applied to a terminal, comprising:
when the terminal is in a first interface, acquiring a voice instruction from a user;
determining an operation to be completed by the voice instruction, wherein the operation is to enter a second interface of a target element included in the first interface;
identifying elements included in the first interface and assigning coordinates to each of the elements;
generating a correspondence between each element and the coordinates assigned to it;
acquiring the coordinates of the target element from the assigned coordinates, and determining the coordinates of the target element as a target position of the target element in the first interface;
performing a simulated click at the target position to enter the second interface of the target element;
when the first interface is displayed again and the second interface of the target element needs to be entered, detecting whether the display screen is in a split-screen display state;
if the display screen is not in a split-screen display state, acquiring the coordinates of the target element according to the correspondence, and performing a simulated click at the position corresponding to the coordinates of the target element;
and if the display screen is in a split-screen display state, re-identifying the elements included in the current display screen, assigning coordinates to each element, and performing a simulated click at the position corresponding to the coordinates of the target element.
2. The interface control method according to claim 1, further comprising, after the step of generating a correspondence between each of the elements and the coordinates assigned to it:
recording a first version number of the application;
wherein, when the terminal is on the first interface again and the second interface of the target element needs to be entered, acquiring the coordinates of the target element according to the correspondence and performing a simulated click at the position corresponding to the coordinates of the target element comprises:
when the application is on the first interface again and the second interface of the target element needs to be entered, acquiring a second version number of the application;
and if the second version number is consistent with the first version number, acquiring the coordinates of the target element according to the correspondence and performing a simulated click at the position corresponding to the coordinates of the target element.
3. An interface control device applied to a terminal, comprising:
a first acquisition module, configured to acquire a voice instruction from a user when the terminal is in a first interface;
a determining module, configured to determine an operation to be completed by the voice instruction, wherein the operation is to enter a second interface of a target element included in the first interface;
a second acquisition module, configured to identify elements included in the first interface and assign coordinates to each of the elements;
a generating module, configured to generate a correspondence between each element and the coordinates assigned to it;
the second acquisition module being further configured to acquire the coordinates of the target element from the assigned coordinates and determine the coordinates of the target element as a target position of the target element in the first interface;
and an operation module, configured to:
perform a simulated click at the target position to enter the second interface of the target element;
when the first interface is displayed again and the second interface of the target element needs to be entered, detect whether the display screen is in a split-screen display state;
if the display screen is not in a split-screen display state, acquire the coordinates of the target element according to the correspondence and perform a simulated click at the position corresponding to the coordinates of the target element;
and if the display screen is in a split-screen display state, re-identify the elements included in the current display screen, assign coordinates to each element, and perform a simulated click at the position corresponding to the coordinates of the target element.
4. A storage medium on which a computer program is stored, wherein the computer program, when executed on a computer, causes the computer to perform the method according to any one of claims 1-2.
5. An electronic device comprising a memory and a processor, wherein the processor is configured to perform the method according to any one of claims 1-2 by calling a computer program stored in the memory.
CN201810629387.5A 2018-06-19 2018-06-19 Interface control method and device, storage medium and electronic equipment Active CN108829371B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810629387.5A CN108829371B (en) 2018-06-19 2018-06-19 Interface control method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810629387.5A CN108829371B (en) 2018-06-19 2018-06-19 Interface control method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN108829371A CN108829371A (en) 2018-11-16
CN108829371B true CN108829371B (en) 2022-02-22

Family

ID=64142544

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810629387.5A Active CN108829371B (en) 2018-06-19 2018-06-19 Interface control method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN108829371B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113794800B (en) 2018-11-23 2022-08-26 华为技术有限公司 Voice control method and electronic equipment
CN111681658A (en) * 2020-06-05 2020-09-18 苏州思必驰信息科技有限公司 Voice control method and device for vehicle-mounted APP
CN112304325A (en) * 2020-10-23 2021-02-02 上海博泰悦臻网络技术服务有限公司 Automatic navigation method, terminal, system and storage medium based on social software
CN112634896B (en) * 2020-12-30 2023-04-11 智道网联科技(北京)有限公司 Operation method of application program on intelligent terminal and intelligent terminal
CN112286486B (en) * 2020-12-30 2021-04-09 智道网联科技(北京)有限公司 Operation method of application program on intelligent terminal, intelligent terminal and storage medium
CN112732379B (en) * 2020-12-30 2023-12-15 智道网联科技(北京)有限公司 Method for running application program on intelligent terminal, terminal and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105302619A (en) * 2015-12-03 2016-02-03 腾讯科技(深圳)有限公司 Information processing method and device and electronic equipment
CN106201177A (en) * 2016-06-24 2016-12-07 维沃移动通信有限公司 A kind of operation execution method and mobile terminal
CN106373570A (en) * 2016-09-12 2017-02-01 深圳市金立通信设备有限公司 Voice control method and terminal
CN107608652A (en) * 2017-08-28 2018-01-19 三星电子(中国)研发中心 A kind of method and apparatus of Voice command graphical interfaces
CN107919129A (en) * 2017-11-15 2018-04-17 百度在线网络技术(北京)有限公司 Method and apparatus for controlling the page

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104217152A (en) * 2014-09-23 2014-12-17 陈包容 Implementation method and device for mobile terminal to enter application program under stand-by state

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105302619A (en) * 2015-12-03 2016-02-03 腾讯科技(深圳)有限公司 Information processing method and device and electronic equipment
CN106201177A (en) * 2016-06-24 2016-12-07 维沃移动通信有限公司 A kind of operation execution method and mobile terminal
CN106373570A (en) * 2016-09-12 2017-02-01 深圳市金立通信设备有限公司 Voice control method and terminal
CN107608652A (en) * 2017-08-28 2018-01-19 三星电子(中国)研发中心 A kind of method and apparatus of Voice command graphical interfaces
CN107919129A (en) * 2017-11-15 2018-04-17 百度在线网络技术(北京)有限公司 Method and apparatus for controlling the page

Also Published As

Publication number Publication date
CN108829371A (en) 2018-11-16

Similar Documents

Publication Publication Date Title
CN108829371B (en) Interface control method and device, storage medium and electronic equipment
US20200342338A1 (en) Method for preloading application, storage medium, and terminal
CN109240576B (en) Image processing method and device in game, electronic device and storage medium
US20130321314A1 (en) Method and terminal for activating application based on handwriting input
US20180173614A1 (en) Technologies for device independent automated application testing
EP3575963A2 (en) Method for preloading application, storage medium, and terminal
CN107608609B (en) Event object sending method and device
CN107977155B (en) Handwriting recognition method, device, equipment and storage medium
CN112114734B (en) Online document display method, device, terminal and storage medium
CN108984089B (en) Touch operation method and device, storage medium and electronic equipment
CN113268212A (en) Screen projection method and device, storage medium and electronic equipment
CN112199135A (en) Information guiding method, device, electronic equipment and storage medium
CN114092608B (en) Expression processing method and device, computer readable storage medium and electronic equipment
CN108415746B (en) Application interface display method and device, storage medium and electronic equipment
CN114741144B (en) Web-side complex form display method, device and system
CN115687146A (en) BIOS (basic input output System) test method and device, computer equipment and storage medium
CN112596883B (en) Application switching method and device, storage medium and electronic equipment
CN114443022A (en) Method for generating page building block and electronic equipment
CN108958929B (en) Method and device for applying algorithm library, storage medium and electronic equipment
CN108228307B (en) Application display method and device, storage medium and electronic equipment
CN111626021A (en) Presentation generation method and device
CN111666160A (en) Method and system for accessing application program to multiple interactive systems and computer equipment
WO2024065097A1 (en) Blackboard-writing content display method, electronic device, and storage medium
CN108268297B (en) Application interface display method and device, storage medium and electronic equipment
EP3635527B1 (en) Magnified input panels

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant