CN115495177A - Interface display method, device, storage medium and terminal

Interface display method, device, storage medium and terminal

Info

Publication number
CN115495177A
Authority
CN
China
Prior art keywords
character
target
character object
wearable device
terminal
Legal status
Pending
Application number
CN202110682516.9A
Other languages
Chinese (zh)
Inventor
吴靓
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110682516.9A
Publication of CN115495177A


Classifications

    • G06F 9/451: Execution arrangements for user interfaces (G Physics; G06 Computing; G06F Electric digital data processing; G06F 9/00 Arrangements for program control; G06F 9/44 Arrangements for executing specific programs)
    • G06F 1/163: Wearable computers, e.g. on a belt (G06F 1/16 Constructional details or arrangements; G06F 1/1613 Portable computers)
    • G06F 16/90344: Query processing by using string matching techniques (G06F 16/00 Information retrieval; G06F 16/903 Querying; G06F 16/90335 Query processing)
    • G06F 16/9038: Presentation of query results (G06F 16/00 Information retrieval; G06F 16/903 Querying)

Abstract

The embodiment of the application discloses an interface display method, an interface display device, a storage medium and a terminal. The method comprises the following steps: a terminal acquires the characters currently input in a character input scene, determines at least one reference character object corresponding to the characters, displays each reference character object, and, in response to a selection operation on a target character object among the at least one reference character object, causes the wearable device to execute a target operation for the target character object. In this way, the terminal can assist in displaying the reference character objects corresponding to the characters input on the wearable device in the character input scene, and when the user selects the target character object from the reference character objects, the terminal can also control the wearable device to execute the target operation on the target character object, thereby improving the convenience of the display interface of the wearable device.

Description

Interface display method, device, storage medium and terminal
Technical Field
The present application relates to the field of computer technologies, and in particular, to an interface display method and apparatus, a storage medium, and a terminal.
Background
With the development of the internet, wearable devices such as smart watches are widely used in users' daily lives. A wearable device is provided with an operating system and integrates a positioning module, a communication module and multiple sensors, so that, independently of a mobile phone, a user can perform daily functions such as message alerts, measurement of some health indicators, calls, payment, listening to music and transit ticketing, which brings great convenience to people's lives. However, limited by their size, the displays of wearable devices are smaller than those of mobile phones.
Disclosure of Invention
The embodiment of the application provides an interface display method and device, a computer storage medium and a terminal, aiming to solve the technical problem of how to improve the convenience of the display interface of a wearable device. The technical solutions are as follows:
in a first aspect, an embodiment of the present application provides an interface display method, which is applied to a terminal, and the method includes:
when the wearable device is in a character input scene, acquiring characters currently input in the character input scene;
determining at least one reference character object corresponding to the character, and displaying each reference character object;
in response to a selection operation of a target character object of the at least one reference character object, performing, by the wearable device, a target operation for the target character object.
In a second aspect, an embodiment of the present application provides an interface display method, which is applied to a wearable device, and the method includes:
when the wearable device is in a character input scene, acquiring a character currently input in the character input scene, wherein the character is used for instructing a terminal to display at least one reference character object corresponding to the character;
and if the terminal determines a target character object from the at least one reference character object, executing a target operation on the target character object.
In a third aspect, an embodiment of the present application provides an interface display apparatus, which is applied to a terminal, and the apparatus includes:
the character acquisition module is used for acquiring the characters currently input in a character input scene when the wearable device is in the character input scene;
the character matching module is used for determining at least one reference character object corresponding to the characters and displaying each reference character object;
and the character operation module is used for performing, by the wearable device, a target operation for a target character object of the at least one reference character object in response to a selection operation on the target character object.
In a fourth aspect, an embodiment of the present application provides an interface display apparatus, which is applied to a wearable device, and the apparatus includes:
the character acquisition module is used for acquiring the characters currently input in a character input scene when the wearable device is in the character input scene, wherein the characters are used for instructing a terminal to display at least one reference character object corresponding to the characters;
and the character operation module is used for executing a target operation on the target character object if the terminal determines the target character object from the at least one reference character object.
In a fifth aspect, embodiments of the present application provide a computer storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the above-mentioned method steps.
In a sixth aspect, an embodiment of the present application provides a terminal, which may include: a memory and a processor; wherein the memory stores a computer program adapted to be loaded by the processor to perform the above-mentioned method steps.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
when the scheme of the embodiment of the application is executed, when the wearable device is in a character input scene and the wearable device receives the character input, the terminal acquires the characters currently input in the character input scene, determines at least one reference character object corresponding to the characters, and displays each reference character object, and the terminal also responds to the selection operation of a target character object in the at least one reference character object and executes the target operation aiming at the target character object by the wearable device. By the method, the terminal can assist in displaying the reference character object corresponding to the character input by the wearable device in the character input scene, and when the user selects the target character object from the reference character object, the terminal can also control the wearable device to execute the target operation on the target character object, so that the convenience of the display interface of the wearable device can be improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a schematic system architecture diagram of an interface display method according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of an interface display method according to an embodiment of the present application;
FIG. 3 is a schematic interface diagram in another interface display method provided in the embodiments of the present application;
FIG. 4 is a schematic flow chart diagram illustrating another interface display method provided in an embodiment of the present application;
fig. 5 is a schematic interface diagram in another interface display method provided in the embodiment of the present application;
FIG. 6 is a schematic flowchart of another interface display method provided in the embodiments of the present application;
FIG. 7 is a schematic flowchart of another interface display method provided in the embodiments of the present application;
fig. 8 is a schematic structural diagram of an interface display device according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an interface display device according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a wearable device provided in an embodiment of the present application;
fig. 11 is a schematic structural diagram of a terminal according to an embodiment of the present application;
FIG. 12 is a schematic structural diagram of an operating system and a user space provided in an embodiment of the present application;
FIG. 13 is an architectural diagram of the android operating system of FIG. 12.
FIG. 14 is an architectural diagram of the IOS operating system of FIG. 12.
Detailed Description
In order to make the objects, features and advantages of the embodiments of the present application more obvious and understandable, the technical solutions in the embodiments of the present application are clearly and completely described below with reference to the drawings in the embodiments of the present application. It is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of them. All other embodiments obtained by a person skilled in the art based on the embodiments in the present application without creative effort belong to the protection scope of the present application.
In the description of the present application, it is to be understood that the terms "first", "second" and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. It is also noted that, unless explicitly stated or limited otherwise, "including" and "having" and any variations thereof are intended to cover non-exclusive inclusions. For example, a process, method, system, article or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed or inherent to such process, method, article or apparatus. The specific meanings of the above terms in this application can be understood by those of ordinary skill in the art according to the specific circumstances. In addition, in the description of the present application, "a plurality" means two or more unless otherwise specified. "And/or" describes the association relationship of the associated objects and means that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Fig. 1 is a schematic diagram of a system architecture of an interface display method according to an embodiment of the present disclosure.
As shown in fig. 1, the system architecture comprises a wearable device 101, a network 102 and a terminal device 103. The network 102 is the medium used to provide a communication link between the wearable device 101 and the terminal device 103. The network 102 may include various connection types, such as wired or wireless communication links, or fiber optic cables.
The user may use the wearable device 101 to interact with the terminal device 103 through the network 102 to receive or send messages and the like. The terminal device 103 may be any of various electronic devices having a display screen, including but not limited to a smart phone, a tablet computer, a portable computer, a desktop computer, a television and so on. The wearable device 101 may be an electronic device with a communication function, and in this embodiment may be a smart watch. Electronic devices in different networks may be called by different names, such as: user equipment, access terminal, subscriber unit, subscriber station, mobile station, remote terminal, mobile device, user terminal, wireless communication device, user agent or user equipment, cellular telephone, cordless telephone, Personal Digital Assistant (PDA), electronic device in a 5G network or a future evolved network, and the like.
In some embodiments, data (e.g., target compression packets) exchanged over the network is represented using techniques and/or formats including HyperText Markup Language (HTML), eXtensible Markup Language (XML) and the like. All or some of the links may also be encrypted using conventional encryption techniques such as Secure Sockets Layer (SSL), Transport Layer Security (TLS), Virtual Private Network (VPN) and Internet Protocol Security (IPsec). In other embodiments, custom and/or dedicated data communication techniques may also be used in place of, or in addition to, the data communication techniques described above.
The terminal device 103 in the present application may be a terminal device that provides various services. For example, the terminal device 103 acquires the character currently input in the character input scene, determines at least one reference character object corresponding to the character, and displays each reference character object; in response to a selection operation on a target character object among the at least one reference character object, the terminal further causes the wearable device to perform a target operation for the target character object.
The present application will be described in detail with reference to specific examples.
Please refer to fig. 2, which is a schematic flowchart of an interface display method according to an embodiment of the present application. As shown in fig. 2, for convenience of description, the embodiment of the present application is described with the terminal as the execution subject of each step, and the method of the embodiment of the present application may include the following steps:
S201, when the wearable device is in a character input scene, acquiring the characters currently input in the character input scene.
The character input scene may include, but is not limited to, a number input scene and a text input scene. The character input scenario may be understood as a scenario in which a phone number is input on the wearable device, a scenario in which a bank card number is input on the wearable device, a scenario in which a contact name of an address book is input on the wearable device, and the like.
Specifically, in typical practical applications, the remaining display area on the wearable device may not match the display area required by the reference character objects corresponding to the character; that is, because the display area of the wearable device is limited, the reference character objects corresponding to the currently input character cannot be displayed. Generally, when a user inputs a number on the dial interface of a wearable device (such as a smart watch), the smart watch does not display the corresponding contact information, so the user does not know which contact the currently input number belongs to and needs to input the complete number to dial a call; only after the user clicks the dial button is the contact name corresponding to the number displayed on the dialing interface. Accordingly, in a scene of inputting a telephone number or a contact name, the reference character object can be understood as a contact name and a contact number; in a scene of inputting a bank card number, the reference character object is a bank name and a bank card number. In one possible implementation, when the user inputs a character on the wearable device, the wearable device sends the character to the terminal, and the terminal thus obtains the character input by the user. In another possible implementation, when the user inputs characters on the wearable device, the character input interface of the wearable device may be synchronously displayed on the terminal, and the terminal thus obtains the characters input by the user from its own character input interface.
S202, determining at least one reference character object corresponding to the character, and displaying each reference character object.
Specifically, after the terminal acquires the character, the character may be matched to obtain the reference character objects, and each reference character object may be displayed. The matching process can be understood as matching the character against a specific character object set, where the character object set comprises a plurality of character objects, at least one of which is a reference character object, and the character object set may be pre-stored locally in the terminal. The character objects in the set may or may not contain the character entered in the input scene. Since the user inputs characters one by one on the wearable device, the terminal can acquire each character in real time. When the user inputs a first character, a first character object set containing the first character can be screened out by matching the first character against the specific character object set; when a second character is input, further matching can be performed in the first character object set according to the second character, and a second character object set simultaneously containing the first character and the second character can be screened out; by analogy, the terminal can screen out at least one reference character object corresponding to the input characters and display these reference character objects.
For example, if the character input scene is a telephone number input scene: the user inputs the number 1 on the wearable device, the terminal acquires the number 1, matches it against the number set to obtain a first number set, and displays the first number set, where the first number set comprises contact names and contact numbers and the first digit of each contact number is 1. The user continues to input the number 3, the terminal acquires the number 3 and matches it within the first number set to obtain and display a second number set, where the first two digits of each contact number are 13. The user then inputs 4, the terminal acquires the number 4 and matches it within the second number set to obtain and display a third number set, where the first three digits of each contact number are 134. See the interface schematic diagram of the terminal shown in fig. 3: the four contact numbers shown in the figure are the reference character objects obtained by the terminal matching "134".
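To make the incremental screening concrete, the following is a minimal sketch in Kotlin; the names Contact and matchByPrefix, and the sample numbers, are illustrative assumptions and not terms or data from the patent.

```kotlin
// Illustrative contact entry: a display name paired with a full number (made-up data).
data class Contact(val name: String, val number: String)

// Each newly typed character narrows the previous candidate set, so only the
// objects whose number starts with everything typed so far remain.
fun matchByPrefix(candidates: List<Contact>, typedSoFar: String): List<Contact> =
    candidates.filter { it.number.startsWith(typedSoFar) }

fun main() {
    val addressBook = listOf(
        Contact("Xiaolu", "13400003622"),
        Contact("Alice", "13911112222"),
        Contact("Bank hotline", "95533")
    )
    var candidates = addressBook
    for (typed in listOf("1", "13", "134")) {          // user types 1, then 3, then 4
        candidates = matchByPrefix(candidates, typed)  // re-filter the previous result set
        println("$typed -> ${candidates.map { it.name }}")
    }
}
```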
S203, responding to the selection operation of the target character object in the at least one reference character object, and executing the target operation aiming at the target character object by the wearable device.
It can be understood that, since at least one reference character object corresponding to the character is displayed on the display interface of the terminal, the user may perform a selection operation on any one of the at least one reference character object; that is, the user may select one reference character object from the at least one reference character object as the target character object. Further, in response to the user's selection operation on the target character object, the terminal may instruct (that is, control) the wearable device to perform the target operation on the target character object. This can be understood as follows: after receiving the user's selection operation on the target character object, the terminal may generate a sending instruction carrying the target character object and send the sending instruction to the wearable device. After receiving the sending instruction, the wearable device can parse it and acquire the target character object. Further, the wearable device may display the target character object on its display interface, and if the user performs a touch operation on the target character object on the wearable device, the wearable device may perform the target operation corresponding to that touch operation on the target character object.
For example, following the example in S202, an application scenario of the embodiment of the present application is a scenario in which a phone number is input on the wearable device. After the user selects the contact "xiaolu" on the interface shown in fig. 3, the terminal may send to the wearable device a sending instruction carrying the contact name "xiaolu" and the contact number "134", where the sending instruction is used to instruct the wearable device to perform the target operation on the contact number and the contact name. The wearable device may receive and parse the sending instruction to obtain the contact number "134 × 3622"; the user may then perform a dialing operation on the wearable device for that contact number, and if the user clicks the "dial" control on the display interface of the wearable device for the contact number, the wearable device initiates a call to the number.
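The sending-instruction exchange described above might look like the following sketch; the pipe-delimited format and the helper names are assumptions made for illustration, not a format defined by the patent.

```kotlin
// Hypothetical payload for the "sending instruction": the terminal wraps the selected
// target character object, and the wearable device parses it back out.
data class SendInstruction(val contactName: String, val contactNumber: String)

// Terminal side: build the instruction that carries the target character object.
fun buildSendInstruction(name: String, number: String): String = "DIAL|$name|$number"

// Wearable side: parse the instruction and recover the target character object,
// returning null if the payload is malformed.
fun parseSendInstruction(raw: String): SendInstruction? {
    val parts = raw.split("|")
    return if (parts.size == 3 && parts[0] == "DIAL") SendInstruction(parts[1], parts[2]) else null
}
```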
When the scheme of the embodiment of the application is executed, when the wearable device is in a character input scene and receives character input, the terminal acquires the characters currently input in the character input scene, determines at least one reference character object corresponding to the characters, and displays each reference character object; in response to a selection operation on a target character object among the at least one reference character object, the terminal further causes the wearable device to execute the target operation for the target character object. In this way, when the remaining display area of the wearable device is insufficient, the terminal can assist in displaying the reference character objects corresponding to the characters input on the wearable device in the character input scene, and when the user selects the target character object from the reference character objects, the terminal can also control the wearable device to execute the target operation on the target character object, thereby improving the convenience of the display interface of the wearable device.
Fig. 4 is a schematic flowchart of an interface display method according to an embodiment of the present application. As shown in fig. 4, for convenience of description, the embodiment of the present application is described with the terminal as the execution subject of each step, and the method of the embodiment of the present application may include the following steps:
S401, when the wearable device opens a target interface corresponding to a character input scene, determining that the wearable device is in the character input scene, and acquiring the characters currently input on the wearable device.
It can be understood that, in an application scenario of the embodiment of the present application, when the user inputs a character on the wearable device, the remaining display area on the wearable device does not match the display area required by the reference character objects corresponding to the character; that is, because the display area of the wearable device is limited, the reference character objects corresponding to the currently input character cannot be displayed. The character input scene of the embodiment of the present application may include, but is not limited to, a number input scene and a text input scene, and may be understood as a scene in which a phone number is input on the wearable device, a scene in which a bank card number is input on the wearable device, a scene in which a contact name of the address book is input on the wearable device, and the like.
Specifically, when the wearable device opens the target interface, the user can input characters on the target interface; the wearable device acquires the input characters in real time and sends them to the terminal in real time, and the terminal receives them in real time.
S402, when the wearable device opens a target interface corresponding to a character input scene, determining that the wearable device is in the character input scene, synchronously opening a target associated interface corresponding to the target interface, and acquiring the characters currently input on the target associated interface.
Specifically, for the application scenario of S402, reference may be made to S401, which is not described herein again. When the wearable device opens the target interface, it can send a synchronization instruction to the terminal, where the synchronization instruction is used to instruct the terminal to synchronously open the target associated interface; the wearable device can send the input data on the target interface to the terminal in real time, and the terminal can receive the input data in real time and synchronously display it on the target associated interface. It should be noted that the target associated interface of the terminal corresponds to the target interface of the wearable device: the target interface is the interface corresponding to the character input scene on the wearable device side, and the target associated interface is the interface corresponding to the character input scene on the terminal side. When the user inputs characters on the target interface on the wearable device side, the target associated interface on the terminal side synchronously displays the characters input by the user on the wearable device. It can be understood that, as the user inputs characters on the target interface in real time, the target associated interface displays the input characters in real time, so the terminal can acquire the characters input on the target associated interface in real time.
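The synchronization flow just described can be sketched as follows; the message types and the TerminalMirror class are assumptions used only to illustrate mirroring the watch-side target interface on the terminal.

```kotlin
// Illustrative model of the synchronization instruction plus real-time input mirroring.
sealed interface SyncMessage
data class OpenAssociatedInterface(val inputScene: String) : SyncMessage  // e.g. "phone-number", "bank-card"
data class InputUpdate(val charactersSoFar: String) : SyncMessage

class TerminalMirror {
    private var mirroredInput = ""
    fun onMessage(msg: SyncMessage) {
        when (msg) {
            is OpenAssociatedInterface ->
                println("open target associated interface for scene: ${msg.inputScene}")
            is InputUpdate -> {
                mirroredInput = msg.charactersSoFar        // mirror the watch input in real time
                println("target associated interface now shows: $mirroredInput")
            }
        }
    }
}
```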
S403, receiving a first character object set sent by the wearable device, wherein the first character object set comprises at least one reference character object corresponding to the character and the reference character object is a character object obtained by the wearable device performing object matching processing on the character, and displaying each reference character object.
It is understood that the reference character object may be a character object obtained by the wearable device performing an object matching process on the character. The wearable device performs object matching on the character, which may be understood as presetting a specific character object set on the wearable device, where the character object set includes a plurality of character objects, and performing matching processing on the character in the character object set according to a matching condition to obtain a reference character object, for example, the matching condition may be that the character is included.
Specifically, the terminal may receive the first character object set obtained by the wearable device through object matching processing on the character, and display each reference character object in the first character object set. If the character is a communication number, the wearable device can perform object matching processing on the number in an address book set to obtain the first character object set and send the first character object set to the terminal. If the character is an account number, the wearable device can perform object matching processing on the account number in an account set to obtain the first character object set and send the first character object set to the terminal.
S404, carrying out object matching processing on the characters to obtain a second character object set, wherein the second character object set comprises at least one reference character object corresponding to the characters, and displaying each reference character object.
It is to be understood that the reference character object may be a character object obtained by performing an object matching process on a character by the terminal. The terminal performs object matching on the character, which may be understood as that a specific character object set is preset on the terminal, where the character object set includes a plurality of character objects, and the reference character object may be obtained by performing matching processing on the character in the character object set according to a matching condition, for example, the matching condition may be that the character is included.
Specifically, the terminal may perform object matching processing on the character to obtain the second character object set, and display each reference character object in the second character object set. If the character is a communication number, the terminal may perform object matching processing on the number in an address book set to obtain the second character object set. It can be understood that the address book set may include a first address book set corresponding to the address book application of the terminal, a second address book set corresponding to the address book application of the wearable device, and a third address book set corresponding to yellow page numbers, where a yellow page number refers to an entry in the internationally used telephone directory of business enterprises arranged according to enterprise nature and product category. If the character is an account number, the terminal may perform object matching processing on the account number in an account set to obtain the second character object set.
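As an illustration of matching across several sets at once, the sketch below merges three hypothetical contact sources before applying the same prefix rule; the names NumberEntry and matchAcrossSources are assumptions, not terms from the patent.

```kotlin
// Combine the terminal's own address book, the watch address book and a yellow-page list,
// then keep only the entries whose number starts with the typed prefix.
data class NumberEntry(val label: String, val number: String, val source: String)

fun matchAcrossSources(
    terminalContacts: List<NumberEntry>,
    watchContacts: List<NumberEntry>,
    yellowPages: List<NumberEntry>,
    typed: String
): List<NumberEntry> =
    (terminalContacts + watchContacts + yellowPages)
        .filter { it.number.startsWith(typed) }  // same prefix matching as the number example
        .distinctBy { it.number }                // drop duplicates that appear in several sets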
S405, responding to the selection operation of a target character object in the at least one reference character object, and acquiring a standard character corresponding to the target character object, wherein the character quantity of the standard character is greater than or equal to that of the character.
S406, sending the standard character to the wearable device, wherein the standard character is used for instructing the wearable device to execute a target operation aiming at the target character object.
S405 and S406 are explained below.
Wherein a standard character refers to a numeric character or a literal character in a reference character object.
Specifically, the terminal displays at least one reference character object corresponding to the character, and the user selects a target character object from the at least one reference character object. Because the reference character object in the embodiment of the application refers to an object containing a number and text, if the character input by the user is a communication number, the standard character can be the communication number corresponding to the target character object; further, the terminal can send the communication number corresponding to the target character object to the wearable device, so that the user can dial the communication number on the wearable device. If the character input by the user is a bank account number, the standard character can be the bank account number corresponding to the target character object, and the terminal can send the bank account number corresponding to the target character object to the wearable device, so that the user can perform operations such as payment with the bank card account on the wearable device.
For example: referring to the contact number matching interface shown in fig. 3, fig. 3 shows reference character objects corresponding to the character "134", where each reference character object includes a contact name and a contact number, and the contact number is a standard character. Referring to the interface for matching bank account numbers shown in fig. 5, fig. 5 shows reference character objects corresponding to the character "62", where each reference character object includes a bank name and a bank account number, and the bank account number is a standard character.
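A tiny sketch of the "standard character" relation follows: the typed prefix is never longer than the full value that the watch eventually operates on, mirroring the length condition stated in S405. The type and function names are illustrative assumptions only.

```kotlin
// The reference character object shown on the terminal pairs a display name with a full
// value (contact number or bank account number); that full value is the standard character.
data class ReferenceObject(val displayName: String, val fullValue: String)

// e.g. typed "62" versus a full bank account number beginning with 62.
fun standardCharacterFor(selected: ReferenceObject, typedSoFar: String): String {
    require(selected.fullValue.length >= typedSoFar.length)  // length condition from S405
    return selected.fullValue
}
```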
S407, detecting a response state of the wearable device to the target operation.
S408, if the response state is a response success state, stopping displaying each reference character object.
S407 and S408 are explained below.
Specifically, after the terminal sends the standard character to the wearable device, it may detect the response state of the wearable device for the target operation. If the character is a communication number, the target operation for the communication number may be a dialing operation, and the response indicating a successful response to the dialing operation may be that the user performs the dialing operation on the communication number on the wearable device. If the character is a bank card account number, the target operation for the bank card account number may be a payment operation, and the response indicating a successful response to the payment operation may be that the user performs the payment operation on the bank card number on the wearable device.
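The following sketch shows, under assumed names, how the terminal side of this handshake could track the response state and stop displaying the reference character objects on success.

```kotlin
// Assumed response states reported by the watch after it performs the target operation.
enum class ResponseState { SUCCESS, FAILED, PENDING }

class ReferenceObjectDisplay {
    var visible = true
        private set

    // Called when the wearable device reports the outcome of the dialing or payment
    // operation it performed on the standard character.
    fun onWearableResponse(state: ResponseState) {
        if (state == ResponseState.SUCCESS) {
            visible = false                      // stop displaying each reference character object
            println("reference character objects hidden")
        }
    }
}
```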
When the scheme of the embodiment of the application is executed, when the wearable device is in a character input scene, receives character input, and the remaining display area of the wearable device does not match the display area required by the at least one reference character object corresponding to the character, the terminal acquires the character currently input in the character input scene. On the one hand, the terminal can determine at least one reference character object corresponding to the character and display each reference character object; on the other hand, the terminal can receive the at least one reference character object obtained after the wearable device matches the character and display each reference character object, which can improve the accuracy of character matching. The terminal can also send the standard character corresponding to the target character object selected by the user to the wearable device, so that the wearable device can execute the target operation on the target character object based on the standard character; in this way, when the remaining display area of the wearable device is insufficient, the reference character objects corresponding to the character are displayed on the terminal, which can improve the convenience of the display interface of the wearable device. In addition, the terminal can stop displaying each reference character object when detecting that the wearable device has successfully responded to the target operation, which can ensure data security.
Fig. 6 is a schematic flowchart of an interface display method according to an embodiment of the present application. As shown in fig. 6, for convenience of description, the embodiment of the present application is described with the wearable device as the execution subject of each step, and the method of the embodiment of the present application may include the following steps:
S601, when the wearable device is in a character input scene, acquiring the character currently input in the character input scene, wherein the character is used for instructing a terminal to display at least one reference character object corresponding to the character.
It can be understood that, in an application scenario of the embodiment of the present application, when the user inputs a character on the wearable device, the remaining display area on the wearable device does not match the display area required by the reference character objects corresponding to the character; that is, because the display area of the wearable device is limited, the reference character objects corresponding to the currently input character cannot be displayed. The character input scene of the embodiment of the present application may include, but is not limited to, a number input scene and a text input scene, such as a scene in which a phone number is input on the wearable device, a scene in which a bank card number is input on the wearable device, or a scene in which a contact name in the address book is input on the wearable device. In a scene of inputting a telephone number or a contact name, the reference character object can be understood as a contact name and a contact number; in a scene of inputting a bank card number, the reference character object is a bank name and a bank card number.
Specifically, in one possible implementation, when the user inputs characters on the wearable device, the wearable device may send the characters to the terminal, and the terminal thus obtains the characters input by the user. Because the user inputs characters on the wearable device one by one, the wearable device can acquire each character in real time and send each character to the terminal in real time, where the character is mainly used to instruct the terminal to display at least one reference character object corresponding to the character. When the terminal receives the first character input by the user, it can perform matching in a specific character object set according to the first character and screen out a first character object set containing the first character; when the terminal receives a second character input by the user, it can perform matching in the first character object set according to the second character and screen out a second character object set simultaneously containing the first character and the second character; and so on, until the terminal screens out at least one reference character object corresponding to the characters input by the user and displays these reference character objects.
S602, if the terminal determines a target character object from the at least one reference character object, executing a target operation on the target character object.
Specifically, the terminal displays at least one reference character object corresponding to the character, the reference character object includes a numeric character and a literal character, if the user selects one target character object from the at least one reference character object, the terminal can send the numeric character corresponding to the target character object to the wearable device, and the wearable device can perform corresponding target operation on the numeric character after receiving the numeric character corresponding to the target character object. For example, if the numeric character is a telephone number, the target operation may be a dialing operation; if the digital character is the card number of the bank card, the target operation can be a payment operation.
When the scheme of the embodiment of the application is executed, when the wearable device is in a character input scene, receives character input, and the remaining display area of the wearable device does not match the display area required by the at least one reference character object corresponding to the character, the wearable device acquires the character currently input in the character input scene, where the character is used to instruct the terminal to display the at least one reference character object corresponding to the character; if the terminal determines a target character object from the at least one reference character object, the wearable device executes the target operation on the target character object. In this way, when the remaining display area of the wearable device is insufficient, the wearable device can control the terminal to display the reference character objects corresponding to the characters input on the wearable device in the character input scene, and when the user selects the target character object from the reference character objects, the wearable device can execute the target operation on the target character object, thereby improving the convenience of the display interface of the wearable device.
Please refer to fig. 7, which is a schematic flowchart of an interface display method according to an embodiment of the present application. As shown in fig. 7, for convenience of description, the embodiment of the present application is described with the wearable device as the execution subject of each step, and the method of the embodiment of the present application may include the following steps:
S701, when the wearable device is in a character input scene, opening a target interface corresponding to the character input scene, acquiring the character currently input on the target interface, and sending the character to a terminal, wherein the character is used for instructing the terminal to display the at least one reference character object.
Specifically, an application scenario of the embodiment of the present application may refer to S601 in fig. 6, which is not described herein again. When the wearable device opens the target interface, the user can input characters on the target interface, the wearable device can acquire the input characters in real time, further, the wearable device can send the input characters to the terminal in real time, and the terminal can receive the input characters in real time.
The character is used to instruct the terminal to display the at least one reference character object. In one possible implementation, this may be understood as instructing the terminal to perform object matching processing on the character to obtain at least one reference character object corresponding to the character and to display each reference character object. Instructing the terminal to perform object matching processing on the character can be understood as follows: after acquiring the input character, the wearable device can generate a control instruction carrying the input character and send the control instruction to the terminal, where the control instruction is used to instruct the terminal to perform object matching processing on the input character. The terminal performing object matching on the character may be understood as a specific character object set being preset on the terminal, where the character object set includes a plurality of character objects, and the reference character object may be obtained by matching the character in the character object set according to a matching condition; for example, the matching condition may be that the character is included. If the character is a communication number, the terminal may perform object matching processing on the number in an address book set to obtain the first character object set, where it can be understood that the address book set may include a first address book set corresponding to the address book application of the terminal, a second address book set corresponding to the address book application of the wearable device, and a third address book set corresponding to yellow page numbers, where a yellow page number refers to an entry in the internationally used telephone directory of business enterprises arranged according to enterprise nature and product category. If the character is an account number, the terminal may perform object matching processing on the account number in an account set to obtain the first character object set.
The character is used to instruct the terminal to display the at least one reference character object. In another possible implementation, this may be understood as the wearable device performing object matching processing on the input character to obtain the at least one reference character object corresponding to the input character; the wearable device may generate a control instruction carrying the at least one reference character object and send the control instruction to the terminal, where the control instruction is used to instruct the terminal to display the at least one reference character object. The wearable device performing object matching on the character may be understood as a specific character object set being preset on the wearable device, where the character object set includes a plurality of character objects, and the reference character object may be obtained by matching the character in the character object set according to a matching condition; for example, the matching condition may be that the character is included. The terminal can receive the first character object set obtained by the wearable device through object matching processing on the character, and display each reference character object in the first character object set. If the character is a communication number, the wearable device can perform object matching processing on the number in an address book set to obtain the first character object set and send the first character object set to the terminal. If the character is an account number, the wearable device can perform object matching processing on the account number in an account set to obtain the first character object set and send the first character object set to the terminal.
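The two variants above (terminal-side matching versus watch-side matching) can be contrasted in a short sketch; the TerminalLink interface and all names are assumptions introduced for illustration.

```kotlin
// Variant 1: the watch forwards the raw character and the terminal performs the matching.
// Variant 2: the watch matches against its own preset character object set and forwards
// the resulting reference character objects for the terminal to display.
data class RefObject(val label: String, val value: String)

interface TerminalLink {                          // assumed transport abstraction to the terminal
    fun sendCharacters(typed: String)             // control instruction carrying the raw characters
    fun sendMatches(matches: List<RefObject>)     // control instruction carrying the matched objects
}

fun onCharacterTyped(typed: String, localSet: List<RefObject>, link: TerminalLink, matchLocally: Boolean) {
    if (matchLocally) {
        link.sendMatches(localSet.filter { it.value.startsWith(typed) })
    } else {
        link.sendCharacters(typed)
    }
}
```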
S702, when the wearable device is in a character input scene, opening a target interface corresponding to the character input scene, and instructing, based on the target interface, the terminal to synchronously open a target associated interface corresponding to the target interface, so that the terminal acquires the character currently input on the target associated interface and displays at least one reference character object corresponding to the character.
Specifically, for the application scenario of S702, reference may be made to S701, which is not described herein again. When the wearable device opens the target interface, it can send a synchronization instruction to the terminal, where the synchronization instruction is used to instruct the terminal to synchronously open the target associated interface; the wearable device can send the input data on the target interface to the terminal in real time, and the terminal can receive the input data in real time and synchronously display it on the target associated interface. It should be noted that the target associated interface of the terminal corresponds to the target interface of the wearable device: the target interface is the interface corresponding to the character input scene on the wearable device side, and the target associated interface is the interface corresponding to the character input scene on the terminal side. When the user inputs characters on the target interface on the wearable device side, the target associated interface on the terminal side synchronously displays the characters input by the user on the wearable device. It can be understood that, as the user inputs characters on the target interface in real time, the target associated interface displays the input characters in real time, so the terminal can acquire the characters input on the target associated interface in real time.
Instructing the terminal to acquire the character currently input on the target associated interface and to display at least one reference character object corresponding to the character can be understood as instructing the terminal to perform object matching processing on the character to obtain at least one reference character object corresponding to the character and to display each reference character object. Instructing the terminal to perform object matching processing on the character can be understood as follows: when the wearable device sends the input data on the target interface, it can also send a matching instruction to the terminal, where the matching instruction is used to instruct the terminal to perform object matching processing on the input data synchronously displayed on the target associated interface, that is, on the characters synchronously displayed on the target associated interface. The terminal performing object matching on the character may be understood as a specific character object set being preset on the terminal, where the character object set includes a plurality of character objects, and the reference character object may be obtained by matching the character in the character object set according to a matching condition; for example, the matching condition may be that the character is included.
S703, when the terminal responds to the selection operation of the target character object in the at least one reference character object, receiving a standard character corresponding to the target character object sent by the terminal.
S704, executing target operation aiming at the target character object based on the standard character.
S703 and S704 are explained below.
Wherein a standard character refers to a numeric character or a literal character in a reference character object.
Specifically, the terminal displays at least one reference character object corresponding to the character, the user selects a target character object from the at least one reference character object, the terminal sends the standard character corresponding to the target character object to the wearable device, and the wearable device receives the standard character. Because the reference character object in the embodiment of the application refers to an object containing a number and text, if the character input by the user is a communication number, the standard character can be the communication number corresponding to the target character object; further, the terminal can send the communication number corresponding to the target character object to the wearable device, so that the user can dial the communication number on the wearable device. If the character input by the user is a bank card account number, the standard character can be the bank card account number corresponding to the target character object, and the terminal can send the bank card account number corresponding to the target character object to the wearable device, so that the user can perform operations such as payment with the bank card account on the wearable device.
S705, reporting a response state aiming at the target operation to the terminal.
S706, if the response state is a response success state, instructing the terminal to stop displaying each reference character object based on the response success state.
S705 and S706 are explained below.
Specifically, after the wearable device receives the standard character sent by the terminal and performs the corresponding target operation on the standard character, the wearable device may report the response state for the target operation to the terminal, so that the terminal stops displaying each reference character object based on the response state. If the character is a communication number, the target operation for the communication number can be a dialing operation; the wearable device can send a dialing-success instruction to the terminal, and the terminal stops displaying each reference character object after receiving it. If the character is a bank card account number, the target operation for the bank card account number can be a payment operation; the wearable device can send a payment-success instruction to the terminal, and the terminal stops displaying each reference character object after receiving it.
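On the watch side, the reporting step might look like the sketch below; the instruction strings and the notify callback are assumptions that pair with the terminal-side handling sketched earlier.

```kotlin
// After completing the dialing or payment operation on the standard character, the watch
// reports a success or failure instruction so the terminal can clear its matched list.
fun reportTargetOperationResult(operationSucceeded: Boolean, notifyTerminal: (String) -> Unit) {
    notifyTerminal(if (operationSucceeded) "RESPONSE_SUCCESS" else "RESPONSE_FAILED")
}

fun main() {
    reportTargetOperationResult(operationSucceeded = true) { msg ->
        println("report to terminal: $msg")   // the terminal would stop displaying on RESPONSE_SUCCESS
    }
}
```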
When the scheme of the embodiment of the application is executed, when the wearable device is in a character input scene, receives character input, and the remaining display area of the wearable device does not match the display area required by the at least one reference character object corresponding to the character, the wearable device opens the target interface corresponding to the character input scene, acquires the character currently input on the target interface, sends the character to the terminal, and controls the terminal to display the at least one reference character object; the wearable device may also send the reference character objects to the terminal after performing object matching processing itself, which can improve the accuracy of character matching. When the terminal receives a selection operation on a target character object among the at least one reference character object, the wearable device can receive the standard character corresponding to the target character object sent by the terminal and perform the target operation for the target character object based on the standard character. Therefore, when the remaining display area of the wearable device is insufficient, the reference character objects corresponding to the character are displayed on the terminal, which can improve the convenience of the display interface of the wearable device. The wearable device can also report the response state for the target operation to the terminal, and if the response state is a response success state, it can instruct the terminal to stop displaying each reference character object, which can ensure data security.
Please refer to fig. 8, which is a schematic structural diagram of an interface display device according to an embodiment of the present disclosure. The interface display device 800 may be implemented as all or a portion of a terminal by software, hardware, or a combination of both. The apparatus 800 comprises:
a character acquisition module 810, configured to acquire a currently input character in a character input scenario when a wearable device is in the character input scenario;
a character matching module 820, configured to determine at least one reference character object corresponding to the character, and display each reference character object;
a character operation module 830, configured to, in response to a selection operation of a target character object of the at least one reference character object, cause the wearable device to perform a target operation for the target character object.
Optionally, the character matching module 820 includes:
a character matching unit, used for acquiring a character object set obtained after the character is subjected to object matching processing, where the character object set comprises at least one reference character object corresponding to the character.
Optionally, the character matching unit includes:
a first matching unit, configured to receive a first character object set sent by the wearable device, where the first character object set includes at least one reference character object corresponding to the character, and the reference character object is a character object obtained by performing object matching processing on the character by the wearable device; and/or,
and the second matching unit is used for carrying out object matching processing on the characters to obtain a second character object set, wherein the second character object set comprises at least one reference character object corresponding to the characters.
Optionally, the character operation module 830 further includes:
the first operation unit is used for acquiring a standard character corresponding to a target character object in the at least one reference character object based on selection operation of the target character object; the character quantity of the standard character is greater than or equal to that of the character;
the second operation unit is used for sending the standard character to the wearable equipment, and the standard character is used for indicating the wearable equipment to execute target operation aiming at the target character object.
Optionally, the apparatus 800 further comprises:
a first response module for detecting a response status of the wearable device for the target operation;
and the second response module is used for stopping displaying each reference character object if the response state is a response success state.
Optionally, the character matching module 820 includes:
the third matching unit is used for acquiring at least one reference character object obtained by the communication number through object matching processing in an address list set and displaying each reference character object; and/or,
and the fourth matching unit is used for acquiring at least one reference character object obtained by object matching processing of the account number in an account set and displaying each reference character object.
Optionally, the third matching unit includes:
a fifth matching unit, configured to obtain a first number object set obtained by performing object matching processing on the communication number in a first address list set on the wearable device; and/or acquiring a second number object set obtained by the communication number in a second address book set on the terminal through object matching processing; and/or acquiring a third number object set obtained by carrying out object matching processing on the communication number in a yellow page number set;
a sixth matching unit, configured to display at least one number object in the first number object set, the second number object set, and the third number object set.
Optionally, the character obtaining module 810 includes:
the first obtaining unit is used for determining that the wearable device is in a character input scene when the wearable device starts a target interface corresponding to the character input scene, and obtaining a character currently input on the wearable device; and/or,
the second obtaining unit is used for determining that the wearable device is in a character input scene when the wearable device opens a target interface corresponding to the character input scene, synchronously opening a target associated interface corresponding to the target interface, and obtaining the characters currently input on the target associated interface.
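For illustration only, the module split of the apparatus 800 could be mirrored in code roughly as follows; the method bodies are placeholders, and ReferenceCharacterObject, CharacterKind, and TargetObjectSender refer to the hypothetical types introduced in the earlier sketch.

// Sketch of the terminal-side interface display apparatus of Fig. 8; not a real implementation.
class CharacterAcquisitionModule {
    // Acquire the character currently input while the wearable device is in a character input scene.
    fun acquireCurrentCharacter(): String = TODO("receive the character from the wearable device")
}

class CharacterMatchingModule {
    // Determine the reference character objects corresponding to the character for display.
    fun match(character: String): List<ReferenceCharacterObject> = TODO("object matching processing")
}

class CharacterOperationModule(private val sender: TargetObjectSender) {
    // In response to the user's selection, have the wearable device perform the target operation.
    fun onSelection(target: ReferenceCharacterObject, kind: CharacterKind) {
        sender.onTargetSelected(target, kind)
    }
}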
In the solution of this embodiment of the application, when the wearable device is in a character input scene, the wearable device receives character input, and the surplus display area of the wearable device does not match the object display area of the at least one reference character object corresponding to the character, the terminal acquires the character currently input in the character input scene, determines the at least one reference character object corresponding to the character, displays each reference character object, and further receives a selection instruction for a target character object in the at least one reference character object to control the wearable device to execute the target operation for the target character object. In this way, when the surplus display area of the wearable device is insufficient, the terminal can assist in displaying the reference character objects corresponding to the character input on the wearable device in the character input scene, and when the user selects the target character object from the reference character objects, the terminal can also control the wearable device to execute the target operation on the target character object, which can improve the convenience of the display interface of the wearable device.
Fig. 9 is a schematic structural diagram of an interface display device according to an embodiment of the present application. The interface display apparatus 900 may be implemented as all or a part of a wearable device by software, hardware, or a combination of both. The apparatus 900 comprises:
a character obtaining module 910, configured to obtain, when the wearable device is in a character input scene, a character currently input in the character input scene, where the character is used to instruct a terminal to display at least one reference character object corresponding to the character;
a character operation module 920, configured to execute a target operation on a target character object if the terminal determines the target character object from the at least one reference character object.
Optionally, the character obtaining module 910 includes:
a first obtaining unit, configured to obtain a character currently input in the character input scene, perform object matching processing on the character to obtain a first character object set, where the first character object set includes at least one reference character object corresponding to the character, and send the first character object set to a terminal, and the first character object set is used to instruct the terminal to display the at least one reference character object; and/or,
and the second acquisition unit is used for acquiring the characters currently input in the character input scene and sending the characters to the terminal, wherein the characters are used for indicating the terminal to determine at least one reference character object corresponding to the characters and display the at least one reference character object.
Optionally, the character acquiring module 910 includes:
a third obtaining unit, configured to open a target interface corresponding to the character input scene, obtain a character currently input in the target interface, and send the character to the terminal; and/or,
and the fourth acquisition unit is used for starting a target interface corresponding to the character input scene, instructing the terminal, based on the target interface, to synchronously start a target associated interface corresponding to the target interface, and acquiring the characters currently input on the target associated interface.
Optionally, the character operation module 920 includes:
a first operation unit, configured to receive a standard character corresponding to a target character object of the at least one reference character object sent by the terminal when the terminal responds to a selection operation of the target character object;
a second operation unit configured to perform a target operation for the target character object based on the standard character.
Optionally, the apparatus 900 further includes:
a first state reporting module, configured to report a response state for the target operation to the terminal;
and a second state reporting module, configured to instruct, if the response state is a response success state, the terminal to stop displaying each reference character object based on the response success state.
In the solution of this embodiment of the application, when the wearable device is in a character input scene, the wearable device receives character input, and the surplus display area of the wearable device does not match the object display area of the at least one reference character object corresponding to the character, the wearable device acquires the character currently input in the character input scene, where the character is used to instruct the terminal to display the at least one reference character object corresponding to the character, and if the terminal determines a target character object from the at least one reference character object, the wearable device executes the target operation on the target character object. In this way, when the surplus display area of the wearable device is insufficient, the terminal can assist in displaying the reference character objects corresponding to the character input on the wearable device in the character input scene, and when the user selects the target character object from the reference character objects, the wearable device can execute the target operation on the target character object, which can improve the convenience of the display interface of the wearable device.
Please refer to fig. 10, which provides a schematic structural diagram of a wearable device according to an embodiment of the present application. As shown in fig. 10, a wearable device 2100 may include: at least one processor 2101, at least one network interface 2104, a user interface 2103, memory 2105, at least one communication bus 2102.
The wearable device 2100 also includes the display screen assembly 1800 of the embodiments described above.
The communication bus 2102 is used to implement, among other things, connection communication between these components.
The user interface 2103 may comprise a display screen (Display) and a camera (Camera); optionally, the user interface 2103 may further comprise a standard wired interface and a wireless interface.
The network interface 2104 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), among others.
The processor 2101 may include one or more processing cores. The processor 2101 connects various parts within the entire wearable device 2100 using various interfaces and lines, and performs various functions of the wearable device 2100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 2105 and invoking data stored in the memory 2105. Optionally, the processor 2101 may be implemented in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 2101 may integrate one or a combination of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is used for rendering and drawing the content to be displayed by the display screen; the modem is used to handle wireless communications. It is to be understood that the modem may not be integrated into the processor 2101, but may be implemented by a separate chip.
The memory 2105 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 2105 includes a non-transitory computer-readable medium. The memory 2105 may be used for storing instructions, programs, code sets, or instruction sets. The memory 2105 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the various method embodiments described above, and the like; the data storage area may store data and the like referred to in the above respective method embodiments. The memory 2105 may optionally be at least one storage device located remotely from the processor 2101. As shown in fig. 10, the memory 2105, which is a kind of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and an interface display program.
In the wearable device 2100 shown in fig. 10, the user interface 2103 is mainly used as an interface for providing input for a user and acquiring data input by the user; the processor 2101 may be configured to call the interface display program stored in the memory 2105, and specifically perform the following operations:
when the wearable device is in a character input scene, acquiring characters input in the character input scene at present, wherein the characters are used for indicating a terminal to display at least one reference character object corresponding to the characters;
and if the terminal determines a target character object from the at least one reference character object, executing target operation on the target character object.
In one embodiment, when the step of acquiring a character currently input in the character input scene is executed, where the character is used to instruct a terminal to display at least one reference character object corresponding to the character, the processor 2101 specifically executes the following operations:
acquiring a character input in the character input scene at present, performing object matching processing on the character to obtain a first character object set, wherein the first character object set comprises at least one reference character object corresponding to the character, and sending the first character object set to a terminal, and the first character object set is used for indicating the terminal to display the at least one reference character object; and/or,
and acquiring the characters input in the character input scene at present, and sending the characters to a terminal, wherein the characters are used for indicating the terminal to determine at least one reference character object corresponding to the characters and display the at least one reference character object.
In one embodiment, when the step of acquiring the character currently input in the character input scene is performed, the processor 2101 specifically performs the following operations:
starting a target interface corresponding to the character input scene, acquiring characters input in the target interface at present, and sending the characters to the terminal; and/or,
and starting a target interface corresponding to the character input scene, indicating a terminal to synchronously start a target associated interface corresponding to the target interface based on the target interface and acquiring the characters currently input on the target associated interface.
In an embodiment, when the step of performing the target operation on the target character object if the terminal determines the target character object from the at least one reference character object is performed, the processor 2101 specifically performs the following operations:
when the terminal responds to the selection operation of a target character object in the at least one reference character object, receiving a standard character corresponding to the target character object sent by the terminal;
and executing target operation aiming at the target character object based on the standard character.
In one embodiment, after the processor 2101 performs the target operation on the target character object, the following operations are also performed:
reporting a response state aiming at the target operation to the terminal;
and if the response state is a response success state, indicating the terminal to stop displaying each reference character object based on the response success state.
Referring to fig. 11, a block diagram of a terminal according to an exemplary embodiment of the present application is shown. A terminal in the present application may include one or more of the following components: a processor 110, a memory 120, an input device 130, an output device 140, and a bus 150. The processor 110, memory 120, input device 130, and output device 140 may be connected by a bus 150.
Processor 110 may include one or more processing cores. The processor 110 connects various parts within the entire terminal using various interfaces and lines, and performs various functions of the terminal 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and calling data stored in the memory 120. Optionally, the processor 110 may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 110 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is used for rendering and drawing display content; the modem is used to handle wireless communications. It is understood that the modem may not be integrated into the processor 110, but may be implemented by a separate communication chip.
The memory 120 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 120 includes a non-transitory computer-readable medium. The memory 120 may be used to store instructions, programs, code sets, or instruction sets. The memory 120 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, an image playing function, and the like), instructions for implementing various method embodiments described below, and the like; the operating system may be an Android system (including systems developed in depth based on the Android system), an iOS system developed by Apple (including systems developed in depth based on the iOS system), or another system. The data storage area may also store data created by the electronic device in use, such as a phone book, audio and video data, chat record data, and the like.
Referring to fig. 12, the memory 120 may be divided into an operating system space, where an operating system is run, and a user space, where native and third-party applications are run. In order to ensure that different third-party application programs can achieve a better operation effect, the operating system allocates corresponding system resources to the different third-party application programs. However, the requirements of different application scenarios in the same third-party application program on system resources also differ, for example, in a local resource loading scenario, the third-party application program has a higher requirement on the disk reading speed; in the animation rendering scene, the third-party application program has a high requirement on the performance of the GPU. The operating system and the third-party application program are independent from each other, and the operating system often cannot timely sense the current application scene of the third-party application program, so that the operating system cannot perform targeted system resource adaptation according to the specific application scene of the third-party application program.
In order to enable the operating system to distinguish a specific application scenario of the third-party application program, data communication between the third-party application program and the operating system needs to be opened, so that the operating system can acquire current scenario information of the third-party application program at any time, and further perform targeted system resource adaptation based on the current scenario.
Taking an operating system as an Android system as an example, programs and data stored in the memory 120 are as shown in fig. 13, and a Linux kernel layer 320, a system runtime library layer 340, an application framework layer 360, and an application layer 380 may be stored in the memory 120, where the Linux kernel layer 320, the system runtime library layer 340, and the application framework layer 360 belong to an operating system space, and the application layer 380 belongs to a user space. The Linux kernel layer 320 provides underlying drivers for various hardware of the electronic device, such as a display driver, an audio driver, a camera driver, a Bluetooth driver, a Wi-Fi driver, power management, and the like. The system runtime library layer 340 provides main feature support for the Android system through C/C++ libraries. For example, the SQLite library provides support for a database, the OpenGL/ES library provides support for 3D drawing, the Webkit library provides support for a browser kernel, and the like. Also provided in the system runtime library layer 340 is an Android runtime library (Android runtime), which mainly provides some core libraries capable of allowing developers to write Android applications using the Java language. The application framework layer 360 provides various APIs that may be used in building an application, and developers may build their own applications by using these APIs, such as activity management, window management, view management, notification management, content provider, package management, session management, resource management, and location management. At least one application program runs in the application layer 380, and the application programs may be native application programs carried by the operating system, such as a contact program, a short message program, a clock program, a camera application, and the like; or a third-party application developed by a third-party developer, such as a game application, an instant messaging program, a photo beautification program, a file processing program, and the like.
Taking an operating system as an IOS system as an example, programs and data stored in the memory 120 are shown in fig. 14, and the IOS system includes: a Core operating system Layer 420 (Core OS Layer), a Core Services Layer 440 (Core Services Layer), a Media Layer 460 (Media Layer), and a touchable Layer 480 (Cocoa Touch Layer). The kernel operating system layer 420 includes an operating system kernel, drivers, and underlying program frameworks that provide functionality closer to hardware for use by program frameworks located in the core services layer 440. The core services layer 440 provides system services and/or program frameworks such as a Foundation (Foundation) framework, an account framework, an advertisement framework, a data storage framework, a network connection framework, a geographic location framework, a motion framework, and the like, as needed by the application. The media layer 460 provides audiovisual related interfaces for applications, such as graphics image related interfaces, audio technology related interfaces, video technology related interfaces, audio video transmission technology wireless playback (AirPlay) interfaces, and the like. Touchable layer 480 provides various common interface-related frameworks for application development, and touchable layer 480 is responsible for user touch interaction operations on the electronic device. Such as a local notification service, a remote push service, an advertising framework, a game tool framework, a messaging User Interface (UI) framework, a User Interface UIKit framework, a map framework, and so forth.
In the framework illustrated in FIG. 14, the framework associated with most applications includes, but is not limited to: a base framework in the core services layer 440 and a UIKit framework in the touchable layer 480. The base framework provides many basic object classes and data types and provides the most basic system services for all applications, and is UI independent. The UIKit framework provides a basic library of UI classes for creating touch-based user interfaces; iOS applications can build their UIs on the UIKit framework, so it provides the application's infrastructure for building user interfaces, drawing, handling user interaction events, responding to gestures, and the like.
For the manner and principle of implementing data communication between a third-party application program and the operating system in the IOS system, reference may be made to the Android system, and details are not repeated herein.
The input device 130 is used for receiving input instructions or data, and the input device 130 includes, but is not limited to, a keyboard, a mouse, a camera, a microphone, or a touch device. The output device 140 is used for outputting instructions or data, and the output device 140 includes, but is not limited to, a display device, a speaker, and the like. In one example, the input device 130 and the output device 140 may be combined into a touch display screen, which receives a touch operation of a user on or near it by using a finger, a touch pen, or any other suitable object, and displays the user interfaces of various applications. The touch display screen is typically provided on the front panel of an electronic device. The touch display screen may be designed as a full screen, a curved screen, or a profiled screen. The touch display screen may also be designed as a combination of a full screen and a curved screen, or a combination of a profiled screen and a curved screen, which is not limited in the embodiment of the present application.
In addition, those skilled in the art will appreciate that the configurations of the electronic devices illustrated in the above-described figures do not constitute limitations on the electronic devices, which may include more or fewer components than illustrated, or some components may be combined, or a different arrangement of components. For example, the electronic device further includes a radio frequency circuit, an input unit, a sensor, an audio circuit, a wireless fidelity (WiFi) module, a power supply, a bluetooth module, and other components, which are not described herein again.
In the embodiment of the present application, the main body of execution of each step may be the terminal described above. Optionally, the execution subject of each step is an operating system of the terminal. The operating system may be an android system, an IOS system, or another operating system, which is not limited in this embodiment of the present application.
The terminal of the embodiment of the present application may further have a display device installed thereon, and the display device may be any device capable of implementing a display function, for example: a cathode ray tube display (CRT), a light-emitting diode display (LED), an electronic ink panel, a Liquid Crystal Display (LCD), a Plasma Display Panel (PDP), and the like. The user can view information such as displayed text, images, and video using the display device on the terminal 101. The terminal may be a smart phone, a tablet computer, a gaming device, an AR (Augmented Reality) device, an automobile, a data storage device, an audio playing device, a video playing device, a notebook, a desktop computing device, or a wearable device such as an electronic watch, electronic glasses, an electronic helmet, an electronic bracelet, an electronic necklace, or electronic clothing.
In the terminal shown in fig. 11, the processor 110 may be configured to call the interface display program stored in the memory 120, and specifically perform the following operations:
when the wearable device is in a character input scene, acquiring characters currently input in the character input scene;
determining at least one reference character object corresponding to the character, and displaying each reference character object;
in response to a selection operation of a target character object of the at least one reference character object, performing, by the wearable device, a target operation for the target character object.
In one embodiment, when the processor 110 performs the step of determining at least one reference character object corresponding to the character, the following operations are specifically performed:
and acquiring a character object set after the character is subjected to object matching processing, wherein the character object set comprises at least one reference character object corresponding to the character.
In an embodiment, when the step of obtaining a character object set obtained by performing object matching processing on the character is executed, where the character object set includes at least one reference character object corresponding to the character, the processor 110 specifically executes the following operations:
receiving a first character object set sent by the wearable device, wherein the first character object set comprises at least one reference character object corresponding to the character, and the reference character object is a character object obtained by the wearable device performing object matching processing on the character; and/or,
and performing object matching processing on the characters to obtain a second character object set, wherein the second character object set comprises at least one reference character object corresponding to the characters.
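The two branches above can be combined: the terminal may receive a first character object set from the wearable device and also compute a second set locally. The following one-function sketch shows a plausible merge step; deduplicating by the standard character is an assumption, not something the description prescribes, and ReferenceCharacterObject refers to the hypothetical type in the earlier sketch.

// Merge the set matched on the wearable device with the set matched on the terminal,
// removing duplicates that refer to the same full standard character.
fun mergeCharacterObjectSets(
    fromWearable: List<ReferenceCharacterObject>,
    matchedOnTerminal: List<ReferenceCharacterObject>
): List<ReferenceCharacterObject> =
    (fromWearable + matchedOnTerminal).distinctBy { it.standardCharacter }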
In one embodiment, the processor 110, when performing the step of performing, by the wearable device, a target operation for a target character object in response to a selection operation of the target character object in the at least one reference character object, specifically performs the following operations:
acquiring a standard character corresponding to a target character object based on the selection operation of the target character object in the at least one reference character object; the character quantity of the standard character is greater than or equal to that of the character;
sending the standard character to the wearable device, wherein the standard character is used for instructing the wearable device to execute a target operation aiming at the target character object.
In one embodiment, the processor 110 also performs the following operations:
detecting a response state of the wearable device for the target operation;
and if the response state is a response success state, stopping displaying each reference character object.
In an embodiment, when the step of determining at least one reference character object corresponding to the character and displaying each reference character object is executed, the processor 110 specifically executes the following operations:
if the character is a communication number, acquiring at least one reference character object obtained by performing object matching processing on the communication number in an address book set, and displaying each reference character object; and/or,
if the character is an account number, acquiring at least one reference character object obtained by performing object matching processing on the account number in an account set, and displaying each reference character object.
In an embodiment, when the step of acquiring at least one reference character object obtained by performing object matching processing on the communication number in the address book set and displaying each reference character object is executed, the processor 110 specifically executes the following operations:
acquiring a first number object set obtained by the communication number in a first address book set on the wearable device through object matching processing; and/or acquiring a second number object set obtained by the communication number in a second address book set on the terminal through object matching processing; and/or acquiring a third number object set obtained by carrying out object matching processing on the communication number in a yellow page number set;
and displaying at least one number object in the first number object set, the second number object set and the third number object set.
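To make this matching step concrete, the sketch below filters three illustrative number sets by substring containment and combines the results; whether the actual object matching uses prefix, substring, or fuzzier rules is not specified in this application, so the matching rule here is an assumption and all names are hypothetical.

data class NumberObject(val name: String, val number: String)

// Match the partially input communication number against the wearable-side address book,
// the terminal-side address book, and a yellow page number set, then combine the results.
fun matchCommunicationNumber(
    input: String,
    wearableContacts: List<NumberObject>,
    terminalContacts: List<NumberObject>,
    yellowPages: List<NumberObject>
): List<NumberObject> {
    fun matchIn(set: List<NumberObject>) = set.filter { it.number.contains(input) }
    return (matchIn(wearableContacts) + matchIn(terminalContacts) + matchIn(yellowPages))
        .distinctBy { it.number }
}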
In one embodiment, when the step of acquiring the currently input character in the character input scenario is performed when the wearable device is in the character input scenario, the processor 110 specifically performs the following operations:
when the wearable device starts a target interface corresponding to a character input scene, determining that the wearable device is in the character input scene, and acquiring characters input on the wearable device at present; and/or,
when the wearable device starts a target interface corresponding to a character input scene, determining that the wearable device is in the character input scene, synchronously starting a target associated interface corresponding to the target interface, and acquiring the characters currently input on the target associated interface.
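As a rough sketch of the second branch (all callback names hypothetical): when the wearable device reports that it has opened a target interface, the terminal synchronously opens the associated interface and takes the characters typed there as the input to the matching step.

// Terminal side: react to the wearable device opening a target interface (e.g., a dial pad)
// by opening the associated interface locally and forwarding the characters entered on it.
class AssociatedInterfaceController(
    private val openAssociatedInterface: (targetInterfaceId: String) -> Unit,
    private val onCharacterInput: (character: String) -> Unit
) {
    fun onTargetInterfaceOpened(targetInterfaceId: String) {
        // Synchronously open the interface associated with the wearable's target interface.
        openAssociatedInterface(targetInterfaceId)
    }

    fun onUserTyped(character: String) {
        // The character entered on the target associated interface drives the matching step.
        onCharacterInput(character)
    }
}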
It is clear to a person skilled in the art that the solution of the present application can be implemented by means of software and/or hardware. The "unit" and "module" in this specification refer to software and/or hardware that can perform a specific function independently or in cooperation with other components, where the hardware may be, for example, a Field-Programmable Gate Array (FPGA), an Integrated Circuit (IC), or the like.
It should be noted that for simplicity of description, the above-mentioned embodiments of the method are described as a series of acts, but those skilled in the art should understand that the present application is not limited by the described order of acts, as some steps may be performed in other orders or simultaneously according to the present application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed coupling or direct coupling or communication connection between each other may be through some service interfaces, indirect coupling or communication connection of devices or units, and may be electrical or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable memory. Based on such understanding, the technical solutions of the present application, in essence or part of the technical solutions contributing to the prior art, or all or part of the technical solutions, can be embodied in the form of a software product, which is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned memory comprises: various media capable of storing program codes, such as a usb disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program, which is stored in a computer-readable memory, and the memory may include: flash disks, read-Only memories (ROMs), random Access Memories (RAMs), magnetic or optical disks, and the like.
The above description is only an exemplary embodiment of the present disclosure, and the scope of the present disclosure should not be limited thereby. That is, all equivalent changes and modifications made in accordance with the teachings of the present disclosure are intended to be included within the scope of the present disclosure. Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (15)

1. An interface display method is applied to a terminal, and comprises the following steps:
when the wearable device is in a character input scene, acquiring characters currently input in the character input scene;
determining at least one reference character object corresponding to the character, and displaying each reference character object;
in response to a selection operation of a target character object of the at least one reference character object, performing, by the wearable device, a target operation for the target character object.
2. The method of claim 1, wherein said determining at least one reference character object corresponding to said character comprises:
and acquiring a character object set after the character is subjected to object matching processing, wherein the character object set comprises at least one reference character object corresponding to the character.
3. The method according to claim 2, wherein the obtaining a character object set after performing object matching processing on the character, the character object set including at least one reference character object corresponding to the character, comprises:
receiving a first character object set sent by the wearable device, wherein the first character object set comprises at least one reference character object corresponding to the character, and the reference character object is a character object obtained by the wearable device performing object matching processing on the character; and/or,
and performing object matching processing on the characters to obtain a second character object set, wherein the second character object set comprises at least one reference character object corresponding to the characters.
4. The method according to any one of claims 1-3, wherein the performing, by the wearable device, a target operation for a target character object of the at least one reference character object in response to a selection operation of the target character object comprises:
responding to the selection operation of a target character object in the at least one reference character object, and acquiring a standard character corresponding to the target character object; the character quantity of the standard character is greater than or equal to that of the character;
sending the standard character to the wearable device, wherein the standard character is used for instructing the wearable device to execute a target operation aiming at the target character object.
5. The method of claim 1, further comprising:
detecting a response state of the wearable device for the target operation;
and if the response state is a response success state, stopping displaying each reference character object.
6. The method of claim 1, wherein said determining at least one reference character object corresponding to said character, displaying each said reference character object, comprises:
if the character is a communication number, acquiring at least one reference character object obtained by performing object matching processing on the communication number in an address book set, and displaying each reference character object; and/or,
if the character is an account number, acquiring at least one reference character object obtained by performing object matching processing on the account number in an account set, and displaying each reference character object.
7. The method of claim 6, wherein the acquiring at least one reference character object obtained by performing object matching processing on the communication number in the address book set, and displaying each reference character object, comprises:
acquiring a first number object set obtained by the communication number in a first address book set on the wearable device through object matching processing; and/or acquiring a second number object set obtained by the communication number in a second address book set on the terminal through object matching processing; and/or acquiring a third number object set obtained by object matching processing of the communication number in the yellow page number set;
and displaying at least one number object in the first number object set, the second number object set and the third number object set.
8. The method of claim 1, wherein obtaining the character currently entered in the character entry scenario while the wearable device is in the character entry scenario comprises:
when a wearable device starts a target interface corresponding to a character input scene, determining that the wearable device is in the character input scene, and acquiring a character currently input on the wearable device; and/or,
when the wearable device starts a target interface corresponding to a character input scene, determining that the wearable device is in the character input scene, synchronously starting a target associated interface corresponding to the target interface, and acquiring the characters currently input on the target associated interface.
9. An interface display method is applied to a wearable device, and comprises the following steps:
when the wearable device is in a character input scene, acquiring a character currently input in the character input scene, wherein the character is used for indicating a terminal to display at least one reference character object corresponding to the character;
and if the terminal determines a target character object from the at least one reference character object, executing target operation on the target character object.
10. The method according to claim 9, wherein the obtaining a character currently input in the character input scene, the character being used for instructing a terminal to display at least one reference character object corresponding to the character, comprises:
acquiring a character input in the character input scene at present, performing object matching processing on the character to obtain a first character object set, wherein the first character object set comprises at least one reference character object corresponding to the character, and sending the first character object set to a terminal, and the first character object set is used for indicating the terminal to display the at least one reference character object; and/or,
and acquiring the characters input in the character input scene at present, and sending the characters to a terminal, wherein the characters are used for indicating the terminal to determine at least one reference character object corresponding to the characters and display the at least one reference character object.
11. The method of claim 9, wherein obtaining the character currently entered in the character entry scenario comprises:
starting a target interface corresponding to the character input scene, acquiring characters input in the target interface at present, and sending the characters to the terminal; and/or,
and starting a target interface corresponding to the character input scene, indicating a terminal to synchronously start a target associated interface corresponding to the target interface based on the target interface and acquiring the characters currently input on the target associated interface.
12. The method according to claim 9, wherein if the terminal determines a target character object from the at least one reference character object, performing a target operation on the target character object, comprises:
when the terminal responds to the selection operation of a target character object in the at least one reference character object, receiving a standard character corresponding to the target character object sent by the terminal;
and executing target operation aiming at the target character object based on the standard character.
13. The method of claim 9, wherein after performing the target operation for the target character object, further comprising:
reporting a response state aiming at the target operation to the terminal;
and if the response state is a response success state, indicating the terminal to stop displaying each reference character object based on the response success state.
14. A computer storage medium, characterized in that it stores a plurality of instructions adapted to be loaded by a processor and to perform the method steps according to any one of claims 1 to 8 or 9 to 13.
15. A terminal, comprising: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the method steps of any of claims 1 to 8 or 9 to 13.
CN202110682516.9A 2021-06-18 2021-06-18 Interface display method, device, storage medium and terminal Pending CN115495177A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110682516.9A CN115495177A (en) 2021-06-18 2021-06-18 Interface display method, device, storage medium and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110682516.9A CN115495177A (en) 2021-06-18 2021-06-18 Interface display method, device, storage medium and terminal

Publications (1)

Publication Number Publication Date
CN115495177A true CN115495177A (en) 2022-12-20

Family

ID=84464331

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110682516.9A Pending CN115495177A (en) 2021-06-18 2021-06-18 Interface display method, device, storage medium and terminal

Country Status (1)

Country Link
CN (1) CN115495177A (en)


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination